The Mock Objects Challenge in Agile Testing

The first challenge we encountered with the mock objects approach to testing was finding a way to replace existing component dependencies with mock objects. For new code, this is relatively straightforward, since it is good practice to separate units of code, allowing them to be assembled flexibly in different ways without necessitating internal changes. Such an approach supports unit testing quite naturally.

There are two popular patterns that enable this way of working: the service locator and dependency injection patterns. With a service locator, the code to identify and create new components is defined in a single place: within the service locator. When one unit of code needs to call another, the service locator is asked to provide a service or component that supports the required interface. Unit tests are then able to reconfigure the service locator as required to substitute mock objects.
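As a minimal sketch of the pattern, the following uses hypothetical names (`ServiceLocator`, `PriceFeed`, `Portfolio`); the point is that the collaborator is looked up at runtime, so a test can register a substitute before exercising the code:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical service locator: a single registry mapping interfaces to implementations.
class ServiceLocator {
    private static final Map<Class<?>, Object> services = new HashMap<>();

    static <T> void register(Class<T> type, T impl) {
        services.put(type, impl);
    }

    static <T> T get(Class<T> type) {
        return type.cast(services.get(type));
    }
}

interface PriceFeed {
    double priceOf(String symbol);
}

class Portfolio {
    double value(String symbol, int quantity) {
        // The collaborator is looked up, not constructed here,
        // so a unit test can swap in a mock implementation.
        PriceFeed feed = ServiceLocator.get(PriceFeed.class);
        return feed.priceOf(symbol) * quantity;
    }
}
```

A test would call `ServiceLocator.register(PriceFeed.class, symbol -> 10.0)` and then assert on `new Portfolio().value(...)` without any real price feed being present.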

With dependency injection, the dependencies of a given unit of code are determined at the point of initialization or invocation and are passed directly to the unit for it to use. A unit test is therefore able to call this code directly, supplying mock objects for any dependencies that do not fall within the test scope.
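The same idea with constructor injection might look like the sketch below, again with hypothetical names (`Mailer`, `SignupService`); the hand-rolled `RecordingMailer` stands in for whatever mock a test would supply:

```java
interface Mailer {
    void send(String to, String body);
}

class SignupService {
    private final Mailer mailer;

    // Constructor injection: the dependency is supplied at initialization,
    // so a test can pass in a mock instead of the real mailer.
    SignupService(Mailer mailer) {
        this.mailer = mailer;
    }

    void register(String email) {
        // ... persist the user, then notify ...
        mailer.send(email, "Welcome!");
    }
}

// A hand-rolled mock that records the call for later verification.
class RecordingMailer implements Mailer {
    String lastTo;

    public void send(String to, String body) {
        lastTo = to;
    }
}
```

The test constructs `new SignupService(new RecordingMailer())`, invokes `register`, and asserts on what the mock recorded; no locator or container configuration is needed, which is part of the simplicity the text mentions.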

We have used both approaches but tend to favor dependency injection for its simplicity and recent popularity. For existing code, some level of refactoring is often necessary and desirable, although it is sometimes possible to avoid change by employing a technology-dependent solution such as the Java/EJB-specific XJB test platform. When we started to examine the legacy code, we were lucky to find a clean separation of technology-related (application server) code and core business logic. This helped enormously with the migration challenge and also made it possible to run unit tests against the system code without first deploying it to an application server, which is a slow and involved process to automate.

The second major challenge we encountered was that, even with a focus on unit testing, our automated regression tests eventually started to affect the ease with which system code could be modified. A number of contributing causes were identified:

  • In normal use, EasyMock performs a deep equality comparison on supplied parameters. If any part of the parameter graph differs from the expectation, the test fails. Although it is possible to write custom “matching” logic for EasyMock, it is cumbersome to do and makes tests harder to maintain if widely used.
  • Many of the existing system interfaces were (XML) message-based or were poorly factored, containing operations that could behave in many different ways depending on the parameters supplied. This resulted in a large number of tests exercising the same operations, which compounded the problem of EasyMock parameter matching.
  • When some of the tests became quite long and repetitive, it occasionally became difficult to tell which expectation was causing a test failure. The reason for this is that EasyMock, in common with other mock objects frameworks, evaluates the expectations itself. If an expectation fails, the only reference back to the code that created it is the resulting error message.
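The deep-comparison failure mode in the first point can be illustrated without EasyMock itself. The sketch below uses hypothetical types (`Order`, `Expectation`) to show the behavior a framework gets when it relies on `equals()` over a whole parameter object: any differing field anywhere in the graph fails the entire expectation, which is why brittle tests multiply when many tests exercise the same operations.

```java
import java.util.Objects;

// Hypothetical value object; equals() compares every field.
class Order {
    final String id;
    final double amount;

    Order(String id, double amount) {
        this.id = id;
        this.amount = amount;
    }

    @Override
    public boolean equals(Object o) {
        if (!(o instanceof Order)) return false;
        Order other = (Order) o;
        return id.equals(other.id) && amount == other.amount;
    }

    @Override
    public int hashCode() {
        return Objects.hash(id, amount);
    }
}

// A minimal expectation in the style of a mock framework's deep comparison.
class Expectation {
    final Order expected;

    Expectation(Order expected) {
        this.expected = expected;
    }

    boolean matches(Order actual) {
        // One differing field anywhere in the graph fails the whole expectation.
        return expected.equals(actual);
    }
}
```

An `Expectation` built from `new Order("42", 10.0)` matches only an exactly equal order; a one-cent difference in `amount` is enough to fail, mirroring the strictness the text describes.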

In order to improve the situation, we refactored system interfaces and tests where it was practical to do so, treating the tests as production-quality artifacts and placing a corresponding emphasis on their design. We also took care to ensure that tests were linked back to documented requirements using comment tags so that they could be easily reviewed. This made it easier for developers to tell whether or not a test was failing for a genuine reason, and to fix the test without compromising its original purpose.

All rights reserved © 2018 Wisdom IT Services India Pvt. Ltd
