
Assuring Quality in the New Agile World

Today’s customers expect new features and capabilities to be delivered at a regular pace.  At the same time, they are accustomed to being able to switch service providers quickly and freely.  Faced with these two pressures, it is tempting to adopt agile for the development effort while carrying forward the status quo approach to software quality.

Software quality issues are a sure way to push your customers toward a competitor.  But how do you assure quality while delivering software updates at such a frequent pace? Below, I will identify some common missteps and offer guidance on how to remedy them.  I aim to show that you really can have your cake (frequent delivery) and eat it too (assured high quality).

The Cost of a Bug
To appreciate what I will propose, it is important to understand the cost of a bug.  It is well established that bugs become exponentially more expensive to fix as time goes by.

What factors lead to this?  First, software is very complex.  It is impossible for any developer on a team to keep track of every intricate detail of how a piece of software is built, and that understanding diminishes further as time goes by.  If a bug is presented to a developer months, or even weeks, after the developer worked on the code, it can take a substantial amount of time just to get re-oriented enough to approach the code.

There is also the cost of promoting the fix through the stages of development and quality assurance.  In the worst case, a bug found in production, this represents significant effort.  Even after the development work to fix the bug is complete, a series of steps is required to get the fix into production: it needs to be deployed to a QA environment, a full regression test is usually required, change control forms must be completed and sign-offs obtained, and the deployment needs to be scheduled, possibly requiring customer communication about potential downtime.  These steps almost always involve a large number of people besides the developer(s) who made the fix.  Even if the bug is found prior to a production deployment, the farther the software is in the pipeline toward production, the more of these steps need to be revisited as a result of the bug fix.

Finally, we cannot ignore the intangible costs associated with bugs.  Confidence in the development team may be damaged, and if the bugs cause delays, the organization’s reputation may be tarnished with its customers.

Quality Assurance in the Waterfall
In a project developed using a waterfall approach, it is very common for the development and QA efforts to be owned by different departments within an organization, each with its own place in the waterfall life cycle.  The QA department usually doesn’t enter the picture until after the development team has finished its implementation of the requirements; it then verifies the entire system all at once.  We often use the expression “throwing it over the wall” to describe this: one team owns the project for a while, then passes it off to another team.  No single team “owns” the project all the way through.  If you are reading this post, you likely understand why this is bad, but when transitioning from waterfall to agile, it is not always easy to know where QA should fit in.

The Agile Transformation
So, you’ve taken the plunge into the agile transformation and are developing iteratively.  Your product owner is maintaining the backlog, and the development team is completing chunks of it in two- or three-week increments, or sprints.  But where is your QA process during all this?  If you are stuck in the waterfall mindset outlined above, as many teams initially are, you probably have your QA department lagging a sprint behind your development team.  Often, the developers consider their work done when they’ve deployed their changes to a QA environment for testing.  This is “throwing it over the wall” again, just in smaller increments.

A better strategy is to tightly integrate the development and QA efforts.  Testing should be incorporated into the core development cycle so that the team can never call anything “done” unless it has been fully vetted by thorough testing.  Many factors work against you: loosely defined acceptance criteria, outdated quality standards, time-consuming regression tests, slow and error-prone deployments to the QA environment, and rigid organization charts can all derail the effort.  Below, I address some of the common mistakes organizations make during their transition to agile.

Inconsistent Quality Target
Agile gives you two main tools that form the basis for verifying the quality of a system: acceptance criteria and the team’s definition of done.

Acceptance criteria are expressions, usually written in a structured format called Gherkin, that define what is required for a piece of functionality to be accepted by the business and/or stakeholders.  For example, if the team is tasked with building an ordering system, there will be acceptance criteria that define what happens when a new order is added.  Acceptance criteria can also be used to define performance and other non-functional aspects of the software.
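
To make this concrete, here is a sketch of what the acceptance criteria for adding a new order might look like in Gherkin; the feature and step names are illustrative, not taken from a real system.

```gherkin
Feature: Order submission

  Scenario: A customer adds a new order
    Given an empty order list
    When the user submits a new order for "Widget"
    Then the order appears in the order list with status "Pending"
```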

It is absolutely critical that acceptance criteria be well defined before development on the functionality ever begins.  The whole team should be involved to some degree, but the testers are usually especially well suited to helping the business define acceptance criteria.  The acceptance criteria tell the developers what they need to build by informing them of how it will be tested.

A helpful analogy is taking a test in school.  It is easier to pass a test if, instead of having to supply the answers to the questions, you are given the answers and simply need to supply a question that yields each answer.  Knowing the criteria that must be satisfied means the developers need only build software that yields the desired outcomes.

The definition of done is a list of quality checks that must be satisfied before a piece of functionality can be considered done.  This is where you put the non-functional quality requirements that the team must always adhere to as it works through the backlog, sprint after sprint.  Where acceptance criteria ensure the software is built to deliver the expected value, the definition of done ensures the software is built with quality in mind.

Lack of Automated Tests
Testing software takes a lot of time.  In fact, if you aren’t spending the majority of your development time on testing, it is very likely that quality is suffering significantly.  With agile, software changes at a potentially very rapid pace, which means that regression testing (re-testing old functionality to make sure it hasn’t broken, or regressed) will be a frequent occurrence.  Every new release has the potential to break previously working functionality, and as the system grows, the amount of testing necessarily grows with it.  The only sensible approach is to automate anything and everything that can be automated.

Your developers should already be writing unit tests throughout development.  It also isn’t uncommon to have integration tests verifying that the internal components of your software work together properly and that your system integrates well with external components.  What is far too uncommon, however, are automated acceptance tests.
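
Before turning to acceptance tests, here is a minimal sketch of the kind of unit test developers should already be writing.  It uses pytest against a toy OrderSystem class, a hypothetical stand-in for real application code.

```python
from dataclasses import dataclass, field

# Toy implementation under test; a hypothetical stand-in for real code.
@dataclass
class Order:
    item: str
    status: str = "Pending"

@dataclass
class OrderSystem:
    orders: list = field(default_factory=list)

    def submit_order(self, item: str) -> Order:
        order = Order(item)
        self.orders.append(order)
        return order

# A fast, focused unit test: no environment, no external dependencies.
def test_submitting_an_order_adds_it_with_pending_status():
    system = OrderSystem()
    order = system.submit_order("Widget")
    assert order in system.orders
    assert order.status == "Pending"
```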

Acceptance tests are simply tests that verify that the acceptance criteria are satisfied.  You want these tests automated so that you don’t have to manually regression test the ever-growing system with each new release.  The timing of the automation is also important: ideally, you automate your acceptance tests before the development effort has even begun.  This is referred to as Acceptance Test-Driven Development (ATDD).  There are even tools that translate your acceptance criteria’s Gherkin statements directly into the tests themselves.  Even if you don’t follow ATDD, automating the acceptance tests should be part of your definition of done.
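
As a sketch of how such tools work, here is how the Gherkin scenario from earlier could be bound to executable test code using Python’s behave library (SpecFlow and Cucumber fill the same role on the .NET and JVM stacks).  It reuses the toy OrderSystem from the previous sketch, assumed to live in a hypothetical orders module.

```python
from behave import given, when, then

from orders import OrderSystem  # hypothetical module holding the toy class above

# Each step definition binds one line of the Gherkin scenario to code.
@given("an empty order list")
def step_empty_order_list(context):
    context.system = OrderSystem()

@when('the user submits a new order for "{item}"')
def step_submit_order(context, item):
    context.order = context.system.submit_order(item)

@then('the order appears in the order list with status "{status}"')
def step_order_listed(context, status):
    assert context.order in context.system.orders
    assert context.order.status == status
```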

Lacking Automation, continued…
Manually triggered test runs and manual deployments are also common pitfalls.

The best way to get early visibility into quality issues is to be notified immediately after committing the offending code to source control.  Having a continuous integration (CI) build that runs your suite of automated tests means the developer can look at the issue with all the details of the problem still fresh in mind.
  
Combined with ATDD, this collapses the coding, dev-testing, and acceptance-testing stages into a single step, with an overall relative cost of roughly 10x on the classic cost-of-a-bug scale, far below the cost of bugs discovered in later stages.  You can take this further with a “gated check-in,” which means that code cannot even be committed to source control without first passing all required tests of the CI build.

Because software behavior can be environment-specific, you must also run the tests against an environment that mirrors production as closely as is reasonable, which means the deployment to that environment should itself be automated.  One of the most frustrating things I’ve experienced is how difficult it can sometimes be to deploy an increment of work to a test environment for testing.  To put it bluntly, if you are required to fill out a change request form and wait days for authorization to enter another team’s deployment queue, you are doing it wrong.

Now, combine these two ideas, the CI build and automated deployment, and you get continuous deployment (at least to the test environment): the latest code is automatically deployed to the test environment after each successful commit to source control.  This enables testing to continuously execute against the developers’ latest work.  Bugs become visible significantly sooner in the development cycle, including the fickle “it works on my machine” variety.
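
The logic of such a pipeline can be sketched in a few lines.  Real CI servers (Jenkins, Azure DevOps, GitHub Actions, and the like) express the same steps as pipeline configuration; the test paths and the deploy_to_test.sh script here are hypothetical.

```python
import subprocess
import sys

def run_stage(name: str, command: list[str]) -> None:
    """Run one pipeline stage; stop the whole pipeline on failure."""
    print(f"--- {name} ---")
    if subprocess.run(command).returncode != 0:
        sys.exit(f"{name} failed: the commit is rejected and nothing is deployed.")

# Gate every commit on the full automated suite...
run_stage("Unit and integration tests", ["pytest", "tests/"])
run_stage("Acceptance tests", ["behave", "features/"])
# ...and deploy to the test environment only when everything passes.
run_stage("Deploy to test environment", ["./deploy_to_test.sh"])
```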

Break Down the Wall
The final challenge to an agile approach to QA is often the QA department itself.  As long as one part of the organization owns a slice of the software development lifecycle, there will necessarily be a need to “throw it over the wall.”  A full adoption of agile means that the development team, as a whole, is cross-disciplinary and multifunctional, and fully owns the success (or failure) of the software it creates.

The ironic thing is that by adopting agile and making QA a fundamental part of the development process, the role and importance of QA are actually elevated! Consider the classic waterfall model diagram.

(Figure: the classic waterfall model, with its phases flowing from requirements and design through implementation, verification, and maintenance.)

Software is the result of the implementation phase.  However, before that software can begin to deliver value to the business and/or its customers, it needs to flow through the next two phases.  What happens when market or budget forces put pressure on the development lifecycle?  Where is the slack going to be taken in?  It usually can’t be implementation, because then you won’t have the software at all.  Inevitably, it is verification, and by extension QA, that gets chopped.  This is very unfortunate and has led to innumerable failed projects.  By marrying QA tightly to the act of developing the software, it becomes all but impossible to cut QA out of the process.

The QA department is problematic when it makes itself a phase of the development lifecycle, but that isn’t to say the QA department has no role.  Some aspects of quality assurance are challenging for a development team that is primarily focused on delivering new functionality of high quality.  Security, for example, is a constantly moving target as new threats are discovered; for a development team to constantly be testing for new security issues would be highly disruptive.  Another example is load testing: as the usage profile of a system changes, so should the tests.  These types of tests, although they can be automated, are so costly in time and processing power to run that they can almost never be part of an automated test suite that runs as part of continuous integration.
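
They can still be automated and run on their own cadence, outside the CI loop.  As an illustration, here is a minimal load-test sketch using Python’s locust library; the /orders endpoint and the host are hypothetical.

```python
from locust import HttpUser, task, between

# Run outside CI on its own schedule, e.g.:
#   locust -f loadtest.py --host https://test.example.com
class OrderingUser(HttpUser):
    # Simulated think time between a virtual user's requests.
    wait_time = between(1, 5)

    @task
    def browse_orders(self):
        self.client.get("/orders")

    @task
    def submit_order(self):
        self.client.post("/orders", json={"item": "Widget"})
```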
 
Another category of testing that is definitely best suited to dedicated QA testers is exploratory testing.  Automated tests can only catch bugs in the predictable, designed behavior of an application; exploratory tests catch the rest.  In my opinion, this is the most underutilized form of testing.  Everyone is so busy producing and running through formal test cases that they forget to simply try to break the software.
 
These testing practices require a tremendous amount of skill and knowledge of various tools.  It is reasonable to have a QA department that coordinates and refines these practices, provided that the testers themselves are allocated to the development teams whose work they test.

Conclusion
The agile transformation is necessarily a disruptive process.  Many of the old tenets of the waterfall methodology need to be unlearned in order to fully realize the benefits of agile software development.  For many organizations, this is especially true when it comes to QA.  If you can unburden yourself from the old waterfall trappings, agile has a lot to offer.






About The Author

Senior II Consultant, Enterprise Microsoft Solutions
Nick is a senior consultant in the Columbus office’s Enterprise Microsoft Solutions practice. He is passionate about finding new ways to deliver software that impresses clients, using the latest technologies and Agile methodologies.