Whether you've already made the decision to move to automated testing or you're still considering it, it is important to know what the best practices are for making the transition, and which strategies are best suited to your organization. Each application of any complexity is likely to have its own combination of testing requirements, and no two development teams are exactly alike.
In this post, we'll discuss best practices for planning a test automation strategy, and suggest ways for determining which strategies will work best for your application and for your team.
Let's start with the most fundamental questions: What is automated testing, and what makes it important?
Manual software testing is repetitive. It is, in fact, one of the most repetitive and time-consuming types of task associated with either software development or use. For most kinds of software, only manual data entry requires more time or more repetition—and large-scale, high-volume data entry is typically one of the first tasks to be automated.
Automating testing isn't quite as simple as automating data entry, of course, but the basic principle is the same: identify repeated actions, create a framework that allows those actions to be performed with a high degree of efficiency, then automate both high-repetition and low-repetition actions so that manual intervention is either eliminated entirely or reduced to a minimum.
For testing, much of this repetition consists of performing the same basic test operations on different platforms (operating systems, browsers, mobile devices, etc.) and under different conditions (stress, load, resource availability, and so on). Tests may also be repeated a large number of times in order to capture gradually developing or transient problems.
Automated test systems are script-driven, with automated test data entry and automatically recorded results. Scripting can be used to control the number of repetitions for each test, and to apply variations of both test procedures and data gathering in order to accommodate different platforms and conditions.
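As a minimal sketch of that idea (every name here is hypothetical, not taken from any particular framework), a script-driven regime boils down to a loop over platforms, conditions, and repetitions, with each result recorded automatically:

```python
from itertools import product

# Hypothetical test matrix: each combination is run several times so that
# transient or gradually developing failures have a chance to surface.
PLATFORMS = ["chrome", "firefox", "safari"]
CONDITIONS = ["baseline", "high-load", "low-memory"]
REPETITIONS = 3

def run_login_test(platform, condition):
    """Stand-in for a real test step; an actual script would drive the
    application here and return pass/fail."""
    return True  # placeholder result

def run_matrix():
    results = []
    for platform, condition in product(PLATFORMS, CONDITIONS):
        for attempt in range(REPETITIONS):
            passed = run_login_test(platform, condition)
            # One recorded row per run: platform, condition, attempt, outcome.
            results.append((platform, condition, attempt, passed))
    return results

results = run_matrix()
print(len(results))  # 3 platforms x 3 conditions x 3 repetitions = 27 runs
```

Varying the repetition count or the condition list is a one-line change to the script, which is exactly the flexibility that manual testing lacks.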
For the most part, automated testing is also virtualized testing. Typically, unless you need to specifically test your application's interaction with a hardware platform, you can run most or all of your tests on virtual machines.
Testing on virtual machines makes it much easier to automate test system setup, as well as input and output, and it eliminates time spent waiting for hardware-based test systems to become available. It also speeds up the process of testing itself, often by one or two orders of magnitude. This reduces overall test time, and allows you to include types of testing which would be difficult to fit into a manual testing schedule.
Before you make any basic choices regarding automated test design and infrastructure, it's important to understand what is available, and what is required for a first-rate automated testing regime.
Needless to say, automated testing is script-driven. Test scripts may be written in a general-purpose programming language, or in a domain-specific test-scripting language. A test-scripting language will typically be part of an automated testing framework (such as Appium or Selenium, both of which are open source). Such frameworks generally include major elements of the testing infrastructure, along with their own APIs. They may also allow you to record test steps, then edit them in the built-in scripting language, simplifying the process of building up a test script library.
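To illustrate the record-and-edit workflow (this is a sketch only; the action names and data layout are invented, and real frameworks such as Selenium or Appium expose much richer APIs), a recorded test can be thought of as editable data replayed by a small interpreter:

```python
# Recorded steps represented as plain, editable data: (action, target, value).
RECORDED_STEPS = [
    ("type", "username-field", "testuser"),
    ("type", "password-field", "s3cret"),
    ("click", "login-button", None),
]

def replay(steps, app_state):
    """Apply each recorded step to a stand-in application state."""
    for action, target, value in steps:
        if action == "type":
            app_state[target] = value
        elif action == "click":
            app_state.setdefault("clicked", []).append(target)
        else:
            raise ValueError(f"unknown action: {action}")
    return app_state

state = replay(RECORDED_STEPS, {})
print(state["clicked"])  # ['login-button']
```

Because the recording is plain data, editing it (changing a selector, inserting a step) is far simpler than hand-writing automation from scratch, which is the productivity argument behind record-and-edit tooling.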
When you test using virtual machines, they typically do not need to be on-premises (unless specific security or configuration requirements make on-premises testing a necessity). Cloud-based virtual test systems will do the job just as well, and by testing in the cloud, you avoid the constraints imposed by limited on-premises resources.
A good cloud-based testing platform will often provide a full-service infrastructure for deploying and testing virtual machines in the cloud, along with analytics, dashboards, security, and all of the APIs required to integrate these services with your existing project management infrastructure.
In addition, a cloud-based testing service such as Sauce Labs can actually make it easier for you to automate hardware testing for key mobile platforms or other devices. With real-device cloud testing, you use a cloud-based testing infrastructure to run automated tests on a set of real devices which are maintained by the testing service.
Ultimately, the test automation strategy you put together should be the one that works best for your organization and your product line. In order for this to be the case, your planning process should include the following elements:
Bring the key stakeholders on board, at least to the degree that they have a voice, if not final decision-making privileges. For testing, this will include your testing staff, developers, designers, and very possibly service desk personnel. Developers and testers need to be actively involved in the process, and all stakeholders should give input (regarding unmet testing needs, for example) and be kept up-to-date.
Ask yourself and key stakeholders the following questions:
What Do You Want To Test?
If you were not faced with physical or schedule-based constraints, what would you test? What would your testing priorities be? Which functional areas would you like to test, and what potential performance problems would you like to test for? Which platforms and which combinations of conditions would you like to include in your test regime? What kinds of tests have you set aside as impractical because there was not enough time or equipment to include them?
At this point, don't worry about which tests may or may not be practical in an automated test environment. Right now, all you need to do is compile a list of things that you would like to test, given the opportunity.
What Have You Been Testing?
What does your current test regime actually consist of? What have you been testing, and what tests have you designed, but either deferred due to lack of resources, or put on a "run if there's time" list? What test results do you collect, and what do you do with them?
Then, ask two more questions: Given the constraints of your manual testing regime, are you generally satisfied that it tests for the right things in the right way? Are you generally satisfied with the design of your current tests — the individual steps and the overall test process?
Think About Design
If the answer to these two questions is "Yes," then you will probably be able to use the core elements (test requirements and individual steps, if not overall procedures) of your current test regime as the basis for designing much of your automated regime. If you are not satisfied with most or all of your current test regime, you may be better off designing your automated test regime from scratch.
In either case, however, your automated regime design is likely to be (and in many ways should be) based at least as much on key elements of the "wish list" that you compiled as on your current testing practices.
In-House or Out?
Which elements of your automated test regime do you want to handle in-house, and which elements do you want to take care of using outside services or resources?
Should your developers write the test scripts? Or conversely, can your QA team handle scripting and automation engineering? If you use a domain-specific test-scripting language and start with recorded tests, these tasks may have a much gentler learning curve.
Will it be easier and more practical to manage virtual machines and an automated testing framework on-premises or in the cloud? In many ways, the answer to this question depends on scale. You can, if appropriate, start with a small, on-premises automated testing regime, with the option of later migrating to the cloud.
Do you want to use a cloud-based testing platform? This will relieve your in-house staff of the task of managing test-automation infrastructure. As described above, services of this type are also very useful for managing a large volume of tests, and for automated testing on both virtual and real devices.
Is it better to outsource all testing to a third-party testing service? Doing this will free up in-house staff and resources for non-testing tasks. It may, however, involve significant up-front costs, and provide less control over the test process.
These questions all involve tradeoffs between testing needs, available staff and resources, budget, and time. The best answers for your team will depend on the conditions within your organization.
And that may be the bottom line when it comes to planning your test automation strategy. The most basic best practice is to clearly understand your testing needs, your resources and constraints, and the resources and services which are available, and to act on that understanding.
Michael Churchman started as a scriptwriter, editor, and producer during the anything-goes early years of the game industry. He spent much of the '90s in the high-pressure bundled software industry, where the move from waterfall to faster release was well underway, and near-continuous release cycles and automated deployment were already de facto standards. During that time he developed a semi-automated system for managing localization in over fifteen languages. For the past ten years, he has been involved in the analysis of software development processes and related engineering management issues. He is a regular Fixate.io contributor.