Agile Software Testing: How Your SDLC Can Benefit From a New Strategy

By Nate Good, ScrumMaster on Dec. 8, 2015

The shortened sprint cycle is one of the most significant changes that comes with migrating the Software Development Life Cycle (SDLC) to an Agile development methodology. It brings a number of immediate benefits, but it also introduces several challenges. One of those challenges is maintaining a consistent level of quality in every release without incurring runaway testing costs.

With proper understanding of testing procedures, you can turn an arduous testing process into one that seamlessly adds to the team’s momentum. But first, let's talk quality.

Why is it important to build in quality?

Quality must be built into every story, not tested in at the end. A common but unfortunate practice is to introduce 'hardening sprints' prior to a release, in which no new working software is delivered; instead, the team goes into a defect-fixing phase. Think of it this way: a perceived need for hardening just means you have a quality problem to address. Instead, try adding multiple levels of test automation to your Definition of Done and expect to see several positive side effects:

  1. Supports and encourages ongoing refactoring by providing confidence to developers, testers, and management that the system still works after changes are introduced.
  2. Guides the design of the system and ensures it’s testable.
  3. Eliminates surprises at the end of a sprint or release. Risk isn’t completely removed, but it’s addressed earlier and more frequently.

Who’s responsible for quality?

The entire team is responsible for the quality of the product at every increment. In Agile, testing is no longer carried on the backs of quality analysts. The diverse skillset of the Agile development Scrum team is required to ensure that the automated test strategy is robust and sustainable.

Teams often get off to a rapid start with automation only to discover that they lose control when trying to maintain their tests. This can be avoided by applying engineering practices like incremental design, refactoring, and addressing the specific concerns we'll elaborate on below.

This shared responsibility of quality depends on a culture of continuous improvement. It’s better to slowly build up automation by listening to the tests and making necessary improvements than to attempt a blitz attack at automating all regression tests in a short period of time. I’ve seen entire organizations turn away from automation because of one bad experience where a team failed to account for one or more key requirements (e.g., test isolation or control over data).


What does testing look like on an Agile team?

Agile software testing should take on many forms. Test Driven Development (TDD) is a popular Extreme Programming (XP) practice that can be applied at each of the levels listed below. A highly collaborative team with good customer engagement may want to consider one of the Behavior Driven Development (BDD) frameworks. BDD is also known as Specification by Example, where requirements are captured as executable tests.
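To make Specification by Example concrete, here is a minimal sketch of the given/when/then shape in plain Python. Real BDD frameworks (e.g., Cucumber, behave, pytest-bdd) bind steps like these to business-readable Gherkin text; everything here (the `Cart` class, the coupon code) is a hypothetical illustration, not any framework's API.

```python
# Specification by Example, sketched without a framework. A BDD tool would
# bind these steps to Gherkin text such as:
#   Given a cart with 2 items priced at 10.00
#   When the customer applies the coupon "SAVE10"
#   Then the total is 18.00

class Cart:
    """Hypothetical system under specification."""
    def __init__(self):
        self.items = []
        self.discount = 0.0

    def add(self, price, quantity=1):
        self.items.extend([price] * quantity)

    def apply_coupon(self, code):
        if code == "SAVE10":
            self.discount = 0.10

    @property
    def total(self):
        return round(sum(self.items) * (1 - self.discount), 2)

def test_coupon_discounts_the_cart():
    # Given a cart with 2 items priced at 10.00
    cart = Cart()
    cart.add(10.00, quantity=2)
    # When the customer applies the coupon "SAVE10"
    cart.apply_coupon("SAVE10")
    # Then the total is 18.00
    assert cart.total == 18.00
```

The payoff of this style is that the comments (or, in a real framework, the Gherkin file) double as the requirement: when the test passes, the requirement is demonstrably met.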

Most teams new to testing will rush straight to UI tests. Agile development Scrum teams with technical discipline may already be practicing unit testing. The majority of teams, however, completely disregard everything in between. Consider how you’d benefit from adopting a more comprehensive testing strategy and spreading the responsibility out across the team.

Whatever you decide, the organization needs to agree on the terminology you’ll use and commit to using it. Don’t let a term like ‘Unit Test’ take on several different meanings. I’ve noted typical definitions and use cases for several tests below:

UNIT TESTS
These are sometimes referred to as 'micro tests.' Unit tests should not be confused with manual testing performed by developers before handing features off to QA. Instead, unit tests are automated using an xUnit framework and are executed as part of every build. Here are two simple criteria you can apply as you start:

  1. They should run without a network connection.
  2. Each test should run in under 1ms.
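A minimal sketch of a unit test meeting both criteria, using Python's built-in `unittest` (an xUnit-family framework). The `calculate_discount` function is a hypothetical domain function invented for the example:

```python
import unittest

def calculate_discount(subtotal, rate):
    """Hypothetical domain function: apply a percentage discount."""
    if not 0 <= rate <= 1:
        raise ValueError("rate must be between 0 and 1")
    return round(subtotal * (1 - rate), 2)

class CalculateDiscountTest(unittest.TestCase):
    """Pure in-memory tests: no network, no I/O, sub-millisecond each."""

    def test_applies_percentage_discount(self):
        self.assertEqual(calculate_discount(100.0, 0.25), 75.0)

    def test_rejects_invalid_rate(self):
        with self.assertRaises(ValueError):
            calculate_discount(100.0, 1.5)
```

Run with `python -m unittest`. Because nothing here touches the network or disk, the whole suite stays fast enough to run on every build.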

COMPONENT TESTS
Component tests validate the interaction of two or more components within a system. Because they don't interact with external systems, they can still be executed during the build. These tests may make database connections, but they favor in-memory embedded databases since those support test isolation and parallel execution.
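A sketch of a component test using an in-memory embedded database, here Python's built-in `sqlite3` with a `:memory:` connection. The `OrderRepository` class is a hypothetical component invented for the example:

```python
import sqlite3
import unittest

class OrderRepository:
    """Hypothetical component under test: persists orders to a database."""
    def __init__(self, conn):
        self.conn = conn
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, total REAL)"
        )

    def save(self, total):
        cur = self.conn.execute("INSERT INTO orders (total) VALUES (?)", (total,))
        return cur.lastrowid

    def find_total(self, order_id):
        row = self.conn.execute(
            "SELECT total FROM orders WHERE id = ?", (order_id,)
        ).fetchone()
        return row[0] if row else None

class OrderRepositoryComponentTest(unittest.TestCase):
    def setUp(self):
        # A fresh in-memory database per test gives isolation for free
        # and lets tests run in parallel without stepping on each other.
        self.repo = OrderRepository(sqlite3.connect(":memory:"))

    def test_round_trips_an_order(self):
        order_id = self.repo.save(42.50)
        self.assertEqual(self.repo.find_total(order_id), 42.50)
```

Each test gets its own database, so no cleanup scripts or shared fixtures are needed.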

INTEGRATION TESTS
These validate the interactions of the tested system with one or more of its external systems. Integration tests must be run post-build since they execute against a deployed application that's running in a container.

SYSTEM TESTS
A system test validates the functionality of a fully integrated, full-stack system. For example, these could be executed against a middleware service or MVC endpoint, and batch jobs fall into this category. Using service virtualization is an acceptable practice (e.g., to set up a controlled environment, simulate failure scenarios, or test SLA permutations) so long as you also ensure test coverage with all integrated systems running non-simulated.
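Service virtualization in miniature: a stub stands in for a downstream dependency and is scripted to fail, letting the test exercise a scenario that is hard to trigger against the live system. The sketch below uses Python's `unittest.mock`; the payment gateway, its exception, and the `checkout` behavior are all hypothetical:

```python
from unittest import mock

class PaymentGatewayTimeout(Exception):
    """Hypothetical failure raised by the (virtualized) payment service."""

def checkout(order_total, gateway):
    """Hypothetical behavior under test: degrade gracefully on timeout."""
    try:
        gateway.charge(order_total)
        return "confirmed"
    except PaymentGatewayTimeout:
        return "pending-retry"

def test_checkout_survives_gateway_timeout():
    # The stub gateway is scripted to time out, simulating a failure
    # scenario without touching the real payment service.
    gateway = mock.Mock()
    gateway.charge.side_effect = PaymentGatewayTimeout()
    assert checkout(99.99, gateway) == "pending-retry"
```

Dedicated service virtualization tools do the same thing at the network boundary instead of in-process, but the testing idea is identical: control the dependency so the scenario is repeatable.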

USER INTERFACE (UI) TESTS
A UI test is one performed against the browser or thick-client application. These tests may exercise the full stack of services supporting the UI, or they may leverage one or more layers of service virtualization to isolate testing of the presentation logic. With modern advances in front-end (JavaScript) tooling, you'll want to consider all previous types of testing for your client-side logic, too.

Caution: Remember that both the time and effort necessary to run and maintain tests will increase exponentially as you move from Unit Tests to UI Tests.


Where do you execute tests in the Agile SDLC?

In short, everywhere. Well-designed tests should be capable of running across all environments (e.g., development, test, production) and platforms (e.g., OS, browser). Data-driven testing, for example, lets you reuse a common test harness across a number of environments that don't share common data stores.
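A sketch of data-driven testing with one harness and per-environment data, using `unittest`'s `subTest` so each environment reports its own pass/fail. The fixture values and the `lookup_role` call are hypothetical; in practice the lookup would hit the environment's real system:

```python
import unittest

# Hypothetical per-environment data: the harness is identical everywhere,
# only the inputs change because the environments don't share data stores.
ENVIRONMENT_FIXTURES = {
    "development": {"user": "dev-user",  "expected_role": "admin"},
    "test":        {"user": "qa-user",   "expected_role": "tester"},
    "production":  {"user": "prod-user", "expected_role": "viewer"},
}

def lookup_role(user):
    """Hypothetical system call; stubbed here so the sketch is self-contained."""
    return {"dev-user": "admin", "qa-user": "tester", "prod-user": "viewer"}[user]

class DataDrivenRoleTest(unittest.TestCase):
    def test_role_per_environment(self):
        for env, fixture in ENVIRONMENT_FIXTURES.items():
            # subTest keeps one environment's failure from hiding the others.
            with self.subTest(environment=env):
                self.assertEqual(lookup_role(fixture["user"]),
                                 fixture["expected_role"])
```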

Tests should be maintained in a source control management (SCM) system (e.g., Git) that the entire team has access to, and they should run with little to no setup after they're fetched from the SCM repository.


When should you perform automated testing?

Because Agile development methodologies place high value on automated tests, they should be used at every opportunity. The faster and more stable the tests are, the higher their value will be. The goal here is to fail fast. Successful automation will catch regressions very shortly after they’re introduced, when the cost to fix them is near zero and impacts haven’t spread to a larger audience.

Use this list to determine what tests to run at certain times:

  • After Every Code Change: Unit
  • Before Developer Commit: Unit, Component, Integration, System, UI Smoke Tests
  • During Continuous Integration (CI) Builds: Unit, Component
  • Post Deployment (To Ensure Usable System): Integration, System, UI Smoke Tests
  • Nightly: Integration, System, UI - Full suite of tests (These validate all features.)
  • Weekly: Integration, System, UI - For large systems where all tests may take days to run; non-ideal
  • Release Milestone: Integration, System, UI (These may be tests that require manual intervention or carry a higher cost to execute.)

Note: Smoke tests represent a subset of the longer running System and UI Tests that ensure a stable runtime.
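The schedule above can be wired into a pipeline as a simple stage-to-suite mapping that a CI script consults to decide what to invoke. The stage names and suite labels below are hypothetical, mirroring the list above:

```python
# Hypothetical mapping of pipeline stages to the suites listed above.
STAGE_SUITES = {
    "pre-commit":  ["unit", "component", "integration", "system", "ui-smoke"],
    "ci-build":    ["unit", "component"],
    "post-deploy": ["integration", "system", "ui-smoke"],
    "nightly":     ["integration", "system", "ui"],
}

def suites_for(stage):
    """Return the suites to run for a stage, failing loudly on unknown stages."""
    try:
        return STAGE_SUITES[stage]
    except KeyError:
        raise ValueError(f"unknown pipeline stage: {stage}")
```

Keeping this mapping in one place makes the team's agreed schedule explicit and versioned alongside the tests themselves.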


How can you get started?

Make sure your entire team, or organization, has a common understanding of the testing goals; efficient operation requires a disciplined team.

There’s no shortage of tools that promise to ease the burden of automated testing. The truth is that testing requires the same level of engineering rigor as your production system does. Be wary of vendors who promise trivial record-and-playback features. As you begin to develop a test strategy, take the following concerns into consideration:

How will you isolate your tests from one another?

  • Account for concerns like data conditioning, concurrent access to shared resources, and external dependencies.
  • Test small: when tests are large and overlapping, a single problem results in multiple test failures, obscuring the root cause.
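Two common isolation tactics, sketched together: build a fresh fixture per test in `setUp` so no test sees another's data, and use unique keys so tests never collide on a genuinely shared resource. The `FakeDataStore` is a hypothetical stand-in; real tests might target a database:

```python
import unittest
import uuid

class FakeDataStore:
    """Stand-in for a shared resource; real tests might use a database."""
    def __init__(self):
        self.rows = {}

    def insert(self, key, value):
        self.rows[key] = value

    def get(self, key):
        return self.rows.get(key)

class IsolatedTest(unittest.TestCase):
    def setUp(self):
        # Fresh store per test: no test inherits another test's data.
        self.store = FakeDataStore()
        # Unique keys avoid collisions if tests ever run concurrently
        # against a resource that really is shared.
        self.key = f"customer-{uuid.uuid4()}"

    def test_insert_then_read(self):
        self.store.insert(self.key, "Ada")
        self.assertEqual(self.store.get(self.key), "Ada")

    def test_missing_key_reads_none(self):
        # Passes regardless of what other tests inserted: isolation at work.
        self.assertIsNone(self.store.get(self.key))
```

Either tactic alone helps; together they make test order and parallelism irrelevant to the results.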

How will the team evaluate, plan, and ensure an acceptable level of test coverage in each of the testing levels mentioned above?

  • Be careful not to confuse line-level code coverage metrics with coverage of all acceptance criteria and functional or non-functional requirements.
  • Consider how your test status can be derived from your automated tests passing, failing, or simply being implemented. Time that would’ve been spent reporting and tracking can be used to deliver value in the form of working software (tests).

Does the tooling you have or plan to use support engineering practices like object-oriented design, refactoring, and code quality analysis?

  • Since automation requires coding, make sure everyone has a working knowledge of how to use a modern Integrated Development Environment (IDE) and version control.


If you don't have individuals in your organization who have experience solving these kinds of problems, consider leveraging an outside professional for assistance.

Now that you know when and how to implement Agile software testing, you’ll want to get your teams and organizations on board. Check out our post on engaging employees during a shift to Agile, and learn more about our digital business solutions.


Posted in: Agile Development Methodology