By Mario Matthee, Head: SQA Architecture and R&D, DVT
One of the most significant and common issues I’ve noticed with recent test automation projects is environment preparation – or rather, the lack of it.
Too many companies seem to want to jump into test automation or performance testing without first getting their ducks in a row; in other words, while the will and the funding might be there, the optimal conditions for testing are not.
Some, but not all, of this lack of preparedness comes down to cost. It’s easier for a startup organisation writing its own app to take full control of the development and testing process, and therefore limit the cost of provisioning a dedicated testing environment. A larger organisation like a bank, by contrast, would need to provision multiple environments – development, QA, production and so on – all of which cost money.
To cut costs, some limit their environments to development and production, which makes testing extremely challenging. You ideally don’t want to be testing in a business-critical development or production environment, because test automation needs stability and predictability to work properly, neither of which can be guaranteed with ever-changing data sets in development and production environments.
Which brings up the next challenge – data. For testers, data preparation consumes most of our (down)time. The best-case scenario is being able to reuse data for testing; otherwise, you’d need to obtain massive data sets for test automation and then provision the infrastructure, establish security access and get the different systems talking to each other before you can use the data. This could cause significant delays. On a recent project, we encountered a three-month delay between the estimated start time and the data and environment being made available for testing.
Reusing data makes sense. For example, as a tester I can commit a transaction into the system, track it through the system, and once the test automation scripts confirm it was successful, delete all traces of that transaction as if it never happened. Of course, you’d want safety nets to ensure the scripts can’t operate in any part of the production system, but once these are in place, you are no longer subject to the delays of procuring massive data sets or setting aside large chunks of your physical resources purely for testing.
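To make the pattern concrete, here is a minimal sketch of such a self-cleaning test in Python, using pytest-style assertions and the requests library. The base URL, endpoint paths and payload fields are hypothetical illustrations rather than any real system’s API; the point is the shape: create, verify, then always clean up.

```python
# A minimal sketch of a self-cleaning automated test, assuming a hypothetical
# REST API exposed at TEST_BASE_URL. Endpoint names and payloads are
# illustrative only, not any specific system's actual interface.
import os

import requests

TEST_BASE_URL = os.environ["TEST_BASE_URL"]  # e.g. a dedicated QA endpoint

# Crude safety net: refuse to run against anything that looks like production.
assert "prod" not in TEST_BASE_URL, "Refusing to run test scripts against production"


def test_transaction_round_trip():
    # 1. Commit a transaction into the system under test.
    created = requests.post(
        f"{TEST_BASE_URL}/transactions",
        json={"amount": 100.00, "reference": "AUTOTEST-001"},
        timeout=10,
    )
    created.raise_for_status()
    tx_id = created.json()["id"]

    try:
        # 2. Track the transaction through the system and confirm success.
        status = requests.get(f"{TEST_BASE_URL}/transactions/{tx_id}", timeout=10)
        status.raise_for_status()
        assert status.json()["state"] == "CONFIRMED"
    finally:
        # 3. Delete all traces of the transaction, as if it never happened,
        #    whether or not the assertions above passed.
        requests.delete(f"{TEST_BASE_URL}/transactions/{tx_id}", timeout=10)
```

The finally block is what makes the data reusable run after run, and the simple URL guard stands in for the safety nets mentioned above; in practice you’d back it up with network segregation and test-only credentials.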
From an Agile perspective, it’s also important to get your environment ready before embarking on a Scrum process. And by environment, I mean your full environment – don’t make the mistake of thinking you won’t need certain parts of it, like QA, to get the process started. As soon as the release cycle begins and things start changing, you suddenly realise QA is needed now – not tomorrow or next week – and the delays kick in, which can have a domino effect on the viability of the entire project.
Getting your environment ready in advance means getting your business, IT and infrastructure teams involved from the start, and putting a maintenance team in place to check the environment as often as needed to keep it optimally functional for testing. My suggestion here is to focus on the ‘golden thread’ – if you have one system that supports three different systems, make sure they’re all communicating with each other, as in the sketch below. Remember the domino effect and the delays it can cause if you don’t.
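As an illustration, a maintenance team could run something like the following before every test cycle. The system names and health-check URLs are assumptions standing in for whatever your golden thread actually contains; the idea is simply to confirm the whole chain is talking before any test script starts, rather than discovering a broken link halfway through a run.

```python
# A minimal sketch of a 'golden thread' connectivity check, assuming each
# dependent system exposes a health endpoint. The system names and URLs are
# hypothetical placeholders for your own environment.
import requests

GOLDEN_THREAD = {
    "core-system": "https://qa-core.example.internal/health",
    "payments-gateway": "https://qa-payments.example.internal/health",
    "notifications": "https://qa-notify.example.internal/health",
}


def check_golden_thread() -> list[str]:
    """Return a list of systems that are unreachable or unhealthy."""
    broken = []
    for name, url in GOLDEN_THREAD.items():
        try:
            response = requests.get(url, timeout=5)
            if response.status_code != 200:
                broken.append(f"{name}: HTTP {response.status_code}")
        except requests.RequestException as exc:
            broken.append(f"{name}: {exc}")
    return broken


if __name__ == "__main__":
    failures = check_golden_thread()
    if failures:
        raise SystemExit("Environment not ready for testing:\n" + "\n".join(failures))
    print("Golden thread intact: all systems reachable.")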
Also, getting your environment ready doesn’t necessarily mean having all the functionality in place – remember, you’re just making sure the conditions and connections are set up and ready for testing, so that as your developers start committing code and making changes to it, you’re ready with your scripts and (reusable) data sets to test it.
Yes, there are alternatives available if reusing data isn’t an option. Solutions like Docker and Kubernetes come to mind, where you can spin up your environment for testing purposes. These solutions bring with them their own costs and challenges, but at least we’re moving in the right direction. As soon as we make environment initiation and preparedness part of the total solution, we immediately have a better handle on the challenges we face and make it easier for testers and developers to do their work.
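As an example of that direction, a single throwaway dependency can be stood up and torn down around a test run with a few lines of scripting. The sketch below drives the Docker CLI from Python and assumes a PostgreSQL image, port mapping and password purely for illustration; a fuller environment would more likely be described in a docker-compose file or Kubernetes manifests, but the principle of creating and destroying the environment on demand is the same.

```python
# A minimal sketch of spinning up a disposable test dependency with Docker,
# driven from Python via the standard docker CLI. The image, port and
# credentials are illustrative assumptions, not a prescribed setup.
import subprocess
import time

CONTAINER = "autotest-db"


def start_test_database() -> None:
    # Run a throwaway PostgreSQL container; --rm removes it once stopped.
    subprocess.run(
        [
            "docker", "run", "--rm", "-d",
            "--name", CONTAINER,
            "-e", "POSTGRES_PASSWORD=test",
            "-p", "55432:5432",
            "postgres:15",
        ],
        check=True,
    )
    time.sleep(5)  # crude wait; a real script should poll for readiness


def stop_test_database() -> None:
    subprocess.run(["docker", "stop", CONTAINER], check=True)


if __name__ == "__main__":
    start_test_database()
    try:
        print("Disposable test database is up on localhost:55432")
        # ... run the test suite against localhost:55432 here ...
    finally:
        stop_test_database()
```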
So, before you set off on your next project, ask yourself: are my ducks in a row? What is my strategy around environment and data management? Ask these questions only after the fact, and you risk the seemingly endless purgatory of project delays and, ultimately, project failure.
DVT
DVT is a software development and testing company that focuses on digital transformation technology solutions for clients globally. Its services include custom software development for mobile, Web and traditional platforms, software quality assurance, automated regression testing, UX/UI design, cloud application services, BI and data analytics solutions, project management, business analysis, DevOps and Agile training and consulting. Founded in 1999, DVT has grown to over 700 staff with offices in the UK (London) and South Africa (Johannesburg, Centurion, Cape Town and Durban). DVT is a company within the software and technology group Dynamic Technologies. www.dvt.co.za