This week's guest post comes from Savings.com, who use Sauce Labs to run tests as part of their continuous deployment process. Read on to learn how they did it:
Savings.com is the leading coupon and deal site on the web. We were recently ranked by Inc. Magazine as one of the fastest growing companies in the US, and our US and UK businesses have seen phenomenal growth every year. In the past 6 months alone, we have localized into another 7 countries, and we are adding new services and features all the time.
The Savings.com development team needed to ship new features to the production site as soon as they were completed and checked in. One or more features or bug fixes can define a new release to production. To accomplish this, we would have to implement continuous integration and deploy to our production sites multiple times per day. On an agile release cycle like that, there is only time to manually test the new features being rolled out in each release; all regression testing of existing features would have to be automated using Selenium.
We had already been running automated browser tests on our own servers and virtual machines using an in-house Selenium Grid implementation for a few months. This was working "somewhat ok" when we were on a less agile development release cycle (about every 3 weeks). But when we adopted continuous deployment, many pain points that already existed with our locally maintained Selenium implementation became all the more painful:
We weren't testing all the major browsers. With IE especially, you need to test every version your site supports.
Due to our limited hardware, we could not run tests at maximum concurrency. As a result, running all the Selenium test suites would take up to 1 hour, and double that if the tests needed to be rerun after a regression test failure.
Flaky tests caused by unidentified race conditions were difficult to debug and rewrite. Selenium WebDriver didn't have a nice command-logging interface out of the box, and screenshots were only taken on a test failure, which sometimes missed the key step that would have identified the problem.
An aborted or canceled test run on our own Selenium installation usually required manual intervention to close running browsers and make sure the Selenium Grid pool was idle.
The maintenance overhead of keeping up to date with the latest OS/browser combinations, as well as maintaining the Selenium Grid itself, was taking too much time.
I initially came across Sauce Labs via their blog while searching for Selenium tips and coding best practices. While looking at all the services and support they offered, I thought I would give them a try. I quickly realized Sauce Labs had already identified and resolved all of the pain points listed above:
An exhaustive list of OS/browser combinations to test on that is updated when new releases are out.
We could finally run tests at maximum concurrency and cut the total test run time from 1 hour to 10 minutes. This scalability is crucial to a continuous deployment release cycle.
Plenty of support for test debugging: Sauce Breakpoints, detailed command logging, video recording and screenshots at every step of your test.
Sauce Labs deploys a clean virtual machine for every test.
We no longer have flaky tests. A failed test run is either a bug or an unidentified change to an existing feature that requires a test to be updated.
Excellent support for running in a CI environment; ours is Atlassian Bamboo with Selenium/Java, connecting through Sauce Connect.
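To make the last point concrete, here is a minimal sketch of how a test suite like ours points at Sauce Labs instead of a local grid. The credentials, browser values, and helper names (`SauceConfig`, `hubUrl`, `capabilities`) are illustrative assumptions, not our actual code; in the real suite these strings feed Selenium's `RemoteWebDriver` and `DesiredCapabilities`, as noted in the comments.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch: how tests reach Sauce Labs instead of a local Selenium Grid.
// The user/key values are placeholders for credentials that CI (Bamboo,
// in our case) injects as environment variables. Sauce Connect tunnels
// traffic back to internal environments, so tests still target the
// regular OnDemand hub endpoint.
public class SauceConfig {

    // The Sauce Labs hub URL, with credentials embedded.
    static String hubUrl(String user, String accessKey) {
        return "http://" + user + ":" + accessKey
                + "@ondemand.saucelabs.com:80/wd/hub";
    }

    // Capabilities selecting one OS/browser combination for a test run.
    static Map<String, Object> capabilities(String browser, String version,
                                            String platform) {
        Map<String, Object> caps = new LinkedHashMap<>();
        caps.put("browserName", browser);
        caps.put("version", version);
        caps.put("platform", platform);
        return caps;
    }

    public static void main(String[] args) {
        System.out.println(hubUrl("YOUR_USER", "YOUR_KEY"));
        System.out.println(capabilities("internet explorer", "8", "Windows XP"));
        // In a real test, these values go straight into Selenium:
        //   new RemoteWebDriver(new URL(hubUrl(user, key)),
        //                       new DesiredCapabilities(caps));
    }
}
```

Running the same suite against many OS/browser combinations then becomes a matter of looping over different `capabilities(...)` values, which is what makes the high-concurrency runs described above possible.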
Sauce Labs has become an important component of our success in adopting a continuous integration and deployment release cycle. I am currently working on implementing additional features they offer, such as the Sauce REST API and the Bamboo OnDemand Sauce plugin.
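As an example of what the REST API enables, the sketch below marks a finished job as passed or failed so the result shows up correctly in the Sauce Labs dashboard. It assumes the job-update endpoint (`PUT /rest/v1/:user/jobs/:id` with a JSON body) and uses only the Java standard library; the class and helper names are illustrative, and `updateJob` would need real credentials and a real job ID to run.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Sketch: reporting a test outcome back to Sauce Labs via its REST API.
// jobUrl() and passedBody() just build the request pieces; updateJob()
// performs the actual HTTP PUT with basic auth.
public class SauceRestClient {

    // REST endpoint for a single job, keyed by username and job ID.
    static String jobUrl(String user, String jobId) {
        return "https://saucelabs.com/rest/v1/" + user + "/jobs/" + jobId;
    }

    // JSON body recording whether the test passed.
    static String passedBody(boolean passed) {
        return "{\"passed\": " + passed + "}";
    }

    static void updateJob(String user, String accessKey, String jobId,
                          boolean passed) throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(jobUrl(user, jobId)).openConnection();
        conn.setRequestMethod("PUT");
        String auth = Base64.getEncoder().encodeToString(
                (user + ":" + accessKey).getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + auth);
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(passedBody(passed).getBytes(StandardCharsets.UTF_8));
        }
        conn.getResponseCode(); // fire the request and read the status
    }
}
```

A CI job can call `updateJob` in a test teardown hook once the suite knows its own verdict, tying the Bamboo build result to the matching Sauce Labs job.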