Containers help you deploy apps faster and integrate better. But containers only work well if you have an effective testing strategy in place.
At many organizations, this is not the case. Even though containers have been around for a long time (Docker was introduced in 2013, and other container platforms were around earlier than that), testing strategies for containers have not evolved as rapidly as containers themselves.
Fortunately, by borrowing ideas from Infrastructure-as-Code (IaC) tools and taking advantage of an automated testing grid like Sauce Labs (so you don’t have to maintain your own testing infrastructure for containerized apps), it’s possible to build an effective testing strategy for Docker containers. I explain how in this post.
Consider an existing application that uses IaC to deploy to a developer, staging, or production environment. The continuous integration (CI) pipeline has testing gates at each stage, and the continuous testing approach runs static analysis, unit, integration, performance, security, and other tests against both the IaC and the application code. This does a great job of testing the application, but it falls short when it comes to testing container-based Dockerfiles.
How could this container-based shift impact the team? Arguably, one of the most critical components of a software application is repeatable, disposable infrastructure with testing built in, and container-based applications are no exception. That means we need to ask some of the following questions when entering uncharted territory.
How do we run static analysis on Docker container images?
How would we run existing application tests, such as unit and Selenium tests?
How should we test a Docker container after a Docker build?
How do we address scanning Docker container images for security vulnerabilities?
Are there any security concerns when using public Docker containers?
Does my existing CI solution work with containers?
Let's start devising a testing strategy for container-based applications.
If you are thinking to yourself, “Why don’t we just use Docker to run application tests?”, then you are a step ahead of the game! More or less, the Docker testing setup revolves around creating a test Docker image at build time, based on the application Docker image, and running the tests inside that test container. That moves all dependency management inside the existing Docker container and makes the tests self-contained.
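As a minimal sketch of this pattern, a test image can simply extend the application image and layer the test suite on top. (The image name `myapp:latest`, the test paths, and the npm-based test runner below are all hypothetical; substitute whatever your project actually uses.)

```dockerfile
# Dockerfile.test — hypothetical test image built on top of the app image
FROM myapp:latest

# Layer in test-only dependencies and the test suite itself
COPY test/ /app/test/
RUN npm install --only=dev   # or pip install -r test-requirements.txt, etc.

# Default command runs the tests inside the container
CMD ["npm", "test"]
```

Building this with something like `docker build -f Dockerfile.test -t myapp-test .` yields a disposable container whose only job is to run the tests against the exact image you plan to ship.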
An important pre-build check is adding a linter such as Dockerlint to enforce consistency and best practices in Dockerfile development. To take it a step further, we should scan our Docker images for security vulnerabilities, verifying and reporting that they are free of known vulnerabilities and exposures. Docker Cloud and Docker Hub offer security scanning for private repos as a paid opt-in option; Clair and Dockscan are possible open source options for free.
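A sketch of how these pre-checks might look in a pipeline step, assuming Dockerlint has been installed via npm and the Dockerfile sits at the repository root (both assumptions, not requirements of any particular CI system):

```shell
# Install and run the Dockerfile linter (dockerlint is an npm package;
# hadolint is a popular alternative with a similar one-shot CLI)
npm install -g dockerlint
dockerlint Dockerfile

# Fail the pipeline step if the linter reported problems
if [ $? -ne 0 ]; then
  echo "Dockerfile lint failed; fix issues before building the image"
  exit 1
fi
```

Running the vulnerability scan (Clair, Dockscan, or a registry's built-in scanning) typically happens after `docker build`, against the built image rather than the Dockerfile itself.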
After containerizing the application with 'docker build', we need to run the application container and execute server integration tests against it. These verify the application server configuration, proving that the correct packages were installed, configured correctly, and tested on various platforms (CentOS, Ubuntu, and more), so the running containers stay consistent and dependable. Personally, I am always on the lookout for tools and technologies that help my team build high-quality applications. Goss, Dockerspec, RSpec, and Serverspec are a small list of server testing solutions.
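To make this concrete, here is a hedged sketch using Goss and its dgoss wrapper. The spec below asserts that a web server package is installed and listening; nginx, port 80, and the `myapp:latest` image name are illustrative assumptions, not anything mandated by Goss.

```shell
# Write a minimal Goss spec (nginx and port 80 are example assertions)
cat > goss.yaml <<'EOF'
package:
  nginx:
    installed: true
port:
  tcp:80:
    listening: true
EOF

# dgoss (a wrapper shipped with Goss) starts a container from the image,
# copies the spec in, and validates it against the running container
dgoss run myapp:latest
```

The same spec can be validated on plain hosts with `goss validate`, which is what makes this style of server testing portable across CentOS, Ubuntu, and other platforms.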
Your continuous testing strategy for CI/CD with containers should include a standard to clean up the test containers after you are done using them. To delete the container after the test is complete, add the --rm flag to the 'docker run' command.
docker run --rm -u root ${testImageTag} acceptance-test
Before closing, it’s worth noting that containers are not a testing panacea. They can improve your existing test workflow, but they are not a replacement for it.
This is because testing exclusively using containers would require setting up your own test grid, which is a huge amount of work. While it’s theoretically possible to do this, it’s not a good use of time or money.
This is why you are better off using a test grid like Sauce Labs, which doesn’t require you to own and manage your infrastructure. You can still use containers in conjunction with the Sauce grid, but you don’t have to set up all of the infrastructure to go with it.
We should almost always be testing our Dockerfiles and applications as described above. It allows us to remove the dependencies from the build server into the container, and also makes it possible to test our Dockerfile and the application inside the Docker container simultaneously. This setup can be applied to any type of application. #testALLthings
Greg Sypolt is a Fixate IO Contributor and a Senior Engineer at Gannett | USA Today Network, responsible for test automation solutions, test coverage (from unit to end-to-end), and continuous integration across all Gannett | USA Today Network products.
In the last two years, he has helped change the testing approach from manual to automated testing across several products at Gannett | USA Today Network. To determine improvements and testing gaps, he conducted a face-to-face interview survey process to understand all the product development and deployment processes, testing strategies, and tooling. He provides a formal training program for teams still performing manual testing that allows them to transition to automated testing.