The pressure to do more with less, to release quickly and often, to learn from customers, to pivot and adapt, has never been greater.
Application architectures are becoming increasingly modular and rely on more third-party services than ever before. The 'luxury' of controlling it all in one monolithic package is fading away.
In the quest to succeed in this changing world, many enterprises undergo large-scale digital transformations which, at a high level, involve some of the following steps:
- Adopt a flavor of Agile (then bend it to fit their unique challenges, aka existing processes they can’t let go of)
- Restructure around smaller, sticky teams
- Change the job titles for a few key people in those teams and skill them up on a new way of working: Product Manager, Scrum Master, etc.
- Define a new reference architecture, including a move to the cloud
- Create customer-focused KPIs to measure success
- And then hit GO!
While all this sounds great and can be seen to work in any number of case studies, the reality is that it doesn’t work so well for many others.
When pushed to move faster, inefficiencies in existing processes and capabilities are exposed. Nowhere is this more evident than in the task of ensuring that what you deliver to your customers works as expected, and doesn't break something else unexpectedly! Speed comes from having a solid foundation to build upon, not at the expense of your working software or business processes.
Embarking on this journey requires a fundamental shift in the processes you employ to manage the quality of your software. It is essential to move away from a model where one or two testers are responsible for finding mistakes in everyone's work, to one where they are responsible for helping everyone improve the quality of the team's work.
Many teams I have led have overcome this challenge by adopting Continuous Delivery (CD) principles across both legacy and greenfield applications. CD is a significant step forward for traditional software engineering and is a key enabler of Agile ways of working. Whilst automated testing and deployments may seem like an insurmountable journey for a legacy application, each step forward provides immediate (and measurable) benefits.
Teams that are successful at CD do the following well:
- Documenting how they will test a feature as part of the requirements gathering process (shifting testing way to the LEFT; BDD is a method worth experimenting with)
- Automating the testing and release of software, including change management activities
- Scripting the creation of infrastructure and relevant configuration changes, ensuring consistency between environments
- Releasing to dev, test, and stage environments often, which helps verify the release process ahead of a production deployment
- Maintaining a library of regression tests, built up over time, to ensure new changes don't break existing functionality
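To make the last two points concrete, a regression suite can start very small and grow with each feature. The sketch below is a minimal, hypothetical example in Python using pytest-style test functions: `apply_discount` and its business rule (10% off orders over 100) are invented for illustration, not taken from any real codebase. The Given/When/Then comments show how a BDD-style requirement can map directly onto an automated test.

```python
# Hypothetical business rule (illustrative only): orders over 100
# receive a 10% discount; negative totals are rejected.
def apply_discount(total: float) -> float:
    if total < 0:
        raise ValueError("total cannot be negative")
    return round(total * 0.9, 2) if total > 100 else total

# Each test pins down today's expected behaviour so that tomorrow's
# changes can't silently break it. Run with `pytest` or call directly.
def test_no_discount_at_or_below_threshold():
    # Given an order total at the threshold
    # When the discount is applied
    # Then the total is unchanged
    assert apply_discount(100) == 100

def test_discount_above_threshold():
    # Given an order total above the threshold
    # When the discount is applied
    # Then 10% is deducted
    assert apply_discount(200) == 180.0

def test_negative_total_rejected():
    # Given an invalid (negative) total
    # Then the function raises rather than returning a wrong answer
    try:
        apply_discount(-1)
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError for negative total")
```

Tests like these are cheap to write at requirements time, run on every commit in the pipeline, and accumulate into the regression library described above.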
In this model, the role of the tester should transform from testing gatekeeper to QA lead, helping everyone in the team understand how they can contribute to improving the quality of their output throughout the SDLC. They can steer the team away from simply hitting coverage metrics and toward tests that ensure core business functionality is protected. From writing testable requirements through to high-quality automated tests, everyone has a role to play, and an effective QA lead can be the glue that brings it all together.
When done correctly, testing helps improve the development process and accelerates your digital transformation.