Agile Testing Process

The development process introduced time-boxed iterations and full automation to our team for the first time. The key points included the following:

  • Time-boxed iterations.
  • Each iteration had to deliver “testable entities”; content was agreed jointly between developers and testers.
  • Continuous integration (at least twice-daily builds).
  • Test automation included:
      – clear direction from the leadership team (good communication about what to test and when to test it);
      – dedicated automation infrastructure resources – people and hardware; and
      – developer testing, incorporating automated unit tests run as part of the build process; a “stop the line” culture for unit test failures; rivalry between teams to have the best unit test code coverage; charts to demonstrate results, shown in department meetings; and function testing.
  • Focus on end-to-end test automation (not just the execution phase). Over time, the number of tests increased to the point where a regression run took ten days!
  • System test was largely manual, focused on “consumability”:
      – Humans tend to understand variations better than machines.
      – User-like tests (documentation, GUI).
  • Laser-guided testing – invented to reduce the burden of our regression suite (explained in detail in the next section).

The test automation was separated into two parts, which I will call developer testing and functional testing.

Developer Testing

Developer testing was a big cultural change for our team. We put a huge amount of emphasis on finding defects as early in the process as possible, when they are the cheapest to fix. Under the banner of developer testing, I include the build process itself (compiling and merging the whole product twice a day), automated unit tests, and, finally, a build verification test (BVT) that was run against each driver to verify basic functionality.
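
To make this concrete, here is a minimal sketch of the kind of automated unit test that runs as part of a build (JUnit 4 is assumed; the class under test is invented purely for illustration):

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    // A typical automated unit test of the kind run on every build:
    // if the assertion fails, the build fails and "the line" stops.
    public class VersionParserTest {

        // Tiny stand-in for real product code, defined inline so the
        // example is self-contained.
        static class VersionParser {
            int major(String version) {
                return Integer.parseInt(version.split("\\.")[0]);
            }
        }

        @Test
        public void extractsMajorVersionNumber() {
            assertEquals(7, new VersionParser().major("7.2.1"));
        }
    }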

For the first time we actually measured the unit test code coverage using the open-source tool EMMA [69]. This was a real revelation, as we now had evidence of the rigour of each developer’s testing. At first we exposed a number of groups who claimed they were thoroughly unit testing, but in reality had not added the tests to the automation. With leadership support this was quickly rectified. At each end-of-iteration department meeting, we displayed the coverage figures for each subcomponent team. At first this was used as a stick, but very quickly it became a carrot as teams openly competed with each other to achieve the highest figures.
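
For illustration only, a sketch of rolling per-class coverage figures up into the per-team leaderboard shown at such a meeting; the team names, numbers, and data structures are invented (EMMA itself produces its own txt, html, and xml reports):

    import java.util.List;
    import java.util.Map;

    // Rolls invented per-class line-coverage percentages up into per-team
    // averages and prints them highest first, leaderboard style.
    public class CoverageLeaderboard {
        public static void main(String[] args) {
            Map<String, List<Double>> coverageByTeam = Map.of(
                    "messaging",   List.of(92.0, 88.5, 95.0),
                    "persistence", List.of(71.0, 64.5),
                    "gui",         List.of(55.0, 60.0, 48.0));

            coverageByTeam.entrySet().stream()
                    .map(e -> Map.entry(e.getKey(), e.getValue().stream()
                            .mapToDouble(Double::doubleValue).average().orElse(0.0)))
                    .sorted(Map.Entry.<String, Double>comparingByValue().reversed())
                    .forEach(e -> System.out.printf("%-12s %5.1f%%%n",
                            e.getKey(), e.getValue()));
        }
    }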

Functional Testing

When we built the test team, we made sure that from the start we had an experienced automation specialist who was responsible for creating the infrastructure for running all our tests. We defined automation from the beginning to mean “end-to-end” automation. That meant that from a single button push, the infrastructure had to be capable of installing a new driver, configuring the environment as required, running the appropriate tests, evaluating the outcomes, and storing the results in a database. Too often I see teams interpreting automation to mean automating the execution step. While this has some benefit, the real costs of testing tend to come in the setup – especially in a multiplatform, multimachine environment such as ours.
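
In outline, such a “single button push” pipeline might look like the following minimal sketch; every class and method name here is illustrative, not the team’s actual infrastructure:

    import java.util.ArrayList;
    import java.util.List;

    // One "button push": install the driver, configure the environment, run
    // the tests, evaluate the outcomes, and store the results. Each step is a
    // stub standing in for real infrastructure.
    public class EndToEndRun {

        record TestResult(String name, boolean passed) {}

        public static void main(String[] args) {
            String driver = args.length > 0 ? args[0] : "driver-001";
            installDriver(driver);                  // fetch and install the new build
            configureEnvironment();                 // machines, test data, settings
            List<TestResult> results = runTests();  // execute the selected suites
            evaluateOutcomes(results);              // compare actual vs. expected
            storeResults(results);                  // persist to the results database
        }

        static void installDriver(String driver) {
            System.out.println("Installing " + driver);
        }

        static void configureEnvironment() {
            System.out.println("Configuring multiplatform environment");
        }

        static List<TestResult> runTests() {
            List<TestResult> results = new ArrayList<>();
            results.add(new TestResult("install-smoke", true)); // placeholder outcome
            return results;
        }

        static void evaluateOutcomes(List<TestResult> results) {
            results.forEach(r -> System.out.println(
                    r.name() + ": " + (r.passed() ? "PASS" : "FAIL")));
        }

        static void storeResults(List<TestResult> results) {
            // e.g. a JDBC insert into a results table would go here
        }
    }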

Once we had the infrastructure in place it was up to the testers to write the material. We instigated a policy that a test was not deemed complete until it had run inside the infrastructure. This did lead to some heated debates with the project team, who of course wanted to see test progress. However, in the long run it proved to be the most effective way to ensure everything was correctly automated.

Principles of Laser-Guided Testing

With a team of up to twenty testers we managed to accumulate vast numbers of tests over the course of several releases. While this mass of collateral was a real benefit to the project, it was beginning to take on a life of its own as it got to the stage where a complete regression run was taking in excess of ten days to complete. The overhead of feeding the automation beast was now becoming a considerable burden to the team.

This is when test specialist Ian Holden and test manager Dave Dalton devised the concept of cumulative test analysis – a way of identifying the most effective tests to run, based on information about how they have run in the past, what system code they cover, and what has changed in the last product build (see Ian and Dave’s paper describing the process [70]).

Since this was going to be a radical new approach to the way we did our testing, we needed to sell the concept to our leadership team. The first step was to rename “cumulative test analysis” to “laser-guided testing,” or just “targeted testing,” which I felt better conveyed the excitement of what we wanted to achieve.

At its core, targeted testing is very simple. You need two pieces of information: first, a map of which product code each test covers; and second, what code has changed in the product since you last tested it. If you have this information, it is possible to select the most effective tests to run against a particular product driver.

The graph below provides a simplified graphical overview of the principles of laser-guided testing.

[Figure: The principles of laser-guided testing]

In this example we can use targeting to select and prioritize the suites that need to run (a code sketch after the list illustrates the same selection):

  • S1 and S2 must run.
  • S3 and S5 may add a small amount of value.
  • S4 and S6 need not be run as they add no value to this change.
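
A small sketch of that selection logic under these simplifying assumptions; the suite names S1–S6 match the example, while the module names and coverage data are invented:

    import java.util.LinkedHashMap;
    import java.util.Map;
    import java.util.Set;

    // Scores each suite by how many changed modules its tests cover, then
    // lists the suites to run in priority order; zero-score suites are skipped.
    public class TargetedTestSelector {
        public static void main(String[] args) {
            // Suite -> product modules its tests cover (invented data matching
            // the example: S1/S2 must run, S3/S5 add a little, S4/S6 add nothing).
            Map<String, Set<String>> coverage = new LinkedHashMap<>();
            coverage.put("S1", Set.of("modA", "modB"));
            coverage.put("S2", Set.of("modB", "modC"));
            coverage.put("S3", Set.of("modC", "modX"));
            coverage.put("S4", Set.of("modY"));
            coverage.put("S5", Set.of("modA", "modZ"));
            coverage.put("S6", Set.of("modZ"));

            // Modules changed since the last tested driver (from version control).
            Set<String> changed = Set.of("modA", "modB", "modC");

            coverage.entrySet().stream()
                    .map(e -> Map.entry(e.getKey(), e.getValue().stream()
                            .filter(changed::contains).count()))
                    .filter(e -> e.getValue() > 0)            // S4 and S6 drop out
                    .sorted(Map.Entry.<String, Long>comparingByValue().reversed())
                    .forEach(e -> System.out.println(e.getKey() + " covers "
                            + e.getValue() + " changed module(s)"));
        }
    }

Run against this data, the sketch prints S1 and S2 first (two changed modules each), then S3 and S5 (one each), and omits S4 and S6 entirely.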

[Figure: Breakdown of defects found by testing phase]

I have deliberately simplified how targeting works, and when I do this everyone points out that there may be changes elsewhere in the product that have an impact on a piece of code that hasn’t actually changed. This is of course true and has been taken into account in the targeting algorithms; however, those details would take another chapter at least and I won’t go into them here.


