Scrum’s Methodology in the Agile Testing Process

The first Scrum project I was involved with (my role was test manager) was the one that was able to follow the principles of Scrum most closely.

The client was a large U.K.-based organization that was interested in developing a new delivery channel for its products. The process (and technology) required to achieve this aim could be split into three distinct segments. Thus, the client decided to assign development of each of these segments to a separate Scrum team and then to run the project as a Scrum of Scrums. I was test manager for the Scrum team responsible for one of these segments.

My team followed Scrum’s methodology by

  • completing all system testing (as opposed to cross-supplier integration testing) within the sprint, ensuring that each sprint ended with the production of a working piece of software: a release candidate.
  • co-locating developers and testers in the same area. We couldn’t get a closed room, but we did get our own area. To ensure that the testers knew exactly what the developers were developing (and, hence, what they had to test), we initially tried inviting the testers along to the development design meetings that followed on from the sprint planning meetings. This tended to bore and confuse the testers, so we soon switched to an alternative approach whereby the development lead would brief the testers on the detailed design for a user story once the developers had agreed the design.
  • encouraging frequent interaction between developers and testers, including through table tennis!
  • estimating pieces of functionality in terms of user story points. This was hard at first, but got easier as time went on. We used this to calculate a sprint velocity and hence to estimate when the product backlog (as it stood at any given time) would be finished. Crucially, we ensured that testing effort was taken into consideration during estimation.
  • having stand-up briefings every day and ensuring that people turned up! We also ensured that they were kept brief.
  • running frequent unit tests. An automated set of unit tests ran every night to make sure that recent code changes had not broken previously developed functionality.
  • keeping the sprints short. The client suggested four-week sprints, but we opted for two-week sprints because we knew that the client’s development priorities would change rapidly. We were proven right, and two weeks turned out to be the optimum sprint length.
  • starting each sprint with a sprint planning meeting, which would be in three phases:
    1. Sprint review: The output of the last sprint was demonstrated to the product owner (see later).
    2. Sprint retrospective: The team discussed which behaviors they should start doing, which they should stop doing, and which they should continue doing (and positively reinforce) to help make future sprints more successful.
    3. Sprint planning: The team received candidate user stories from the product owner and committed to develop and system-test some or all of these during the sprint.
  • not defining requirements too tightly. We had a product backlog of user stories, and we had a solution architecture that described the basic components required in the system. Beyond that, it was up to the development team to decide how to implement requirements. Where clarification on technical issues was required, the issues would be taken to a fortnightly Design Authority meeting with the customer and the other suppliers. Once the development team had decided how to implement a user story, they would brief the testers, who could then start preparing test scripts.
  • having a Scrum board, on which was posted the product backlog, the sprint backlog, the burndown chart, and the latest “start–stop–continue” list.
  • insisting that we be left alone during a sprint to complete the tasks that we had been set. We reminded the customer that a sprint could not be altered, only abnormally terminated. One abnormal sprint termination did occur, just after the sprint had started.
  • not having one Scrum team with thirty or so members, but splitting this into a “Scrum of Scrums.” A central Scrum master coordinated our work with that of the other Scrum teams to ensure that we developed functionality in a coordinated fashion and were able to put releases into cross-supplier integration testing at the same time.

My team differed from Scrum’s methodology by
  • using a “proxy” product owner. The development and test team were 30 miles away from the customer, so the business analyst represented the customer in sprint planning meetings. To make this work, the business analyst would spend two days per week with the customer, getting to know and understand the customer’s requirements.
  • developing a back-end system (i.e., one without a user interface), which helped. The customer, therefore, didn’t need to see frequent demos: evidence from integration testing that what we were building was working was enough.
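The velocity-based forecasting described in the estimation bullet above is simple arithmetic: average the story points completed in recent sprints, then divide the remaining backlog by that average. A minimal sketch follows; the point figures are invented for illustration only.

```python
import math

def sprint_velocity(completed_points):
    """Average story points completed per sprint (the team's velocity)."""
    return sum(completed_points) / len(completed_points)

def sprints_remaining(backlog_points, velocity):
    """Estimate how many more sprints the backlog, as it stands today, will take."""
    return math.ceil(backlog_points / velocity)

# Story points completed in the last three two-week sprints (illustrative numbers)
completed = [21, 18, 24]
velocity = sprint_velocity(completed)
print(f"Velocity: {velocity} points per sprint")          # 21.0

# 130 points currently on the product backlog (illustrative)
print(f"Sprints remaining: {sprints_remaining(130, velocity)}")  # 7
```

Because the backlog "as it stood at any given time" kept changing, the forecast had to be recomputed each sprint with the current backlog total; the velocity figure itself stabilized only after a few sprints of history.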

All rights reserved © 2018 Wisdom IT Services India Pvt. Ltd
