Ashley Hunsberger, Greg Sypolt and Chris Riley contributed to this post.
Software testing tools are a vital resource for every successful QA team. But with so many tools and testing frameworks out there - from Selenium and Protractor to Espresso and Xcode - how do you choose which are best? How should your toolset vary depending on whether you do desktop testing, mobile testing, or both? And how do you make the most of software testing tools?
Below are answers to these questions from the panelists of a recent Sauce Labs webinar on software testing and QA. The webinar was hosted by Chris Riley, with Ashley Hunsberger and Greg Sypolt serving as panelists. Their recommendations on software testing tools are included below as well.
Which tools has Greg used for test automation at Gannett?
Greg: Here’s an inventory of testing frameworks and tooling used across Gannett products (technology alignment):
Espresso for Android
EarlGrey for iOS
Datadog KPI Dashboards
What are some tools used for automation across the industry? I've heard of Selenium, but is there anything else?
Here are some testing frameworks to know:
Browsers: Capybara/Cucumber, Capybara/RSpec, NightwatchJS, Behave, and Protractor
API: Node.js, Jasmine, and Mocha
Mobile: Android Espresso, iOS EarlGrey, Appium, KIF, and Xcode 7
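Most of the browser frameworks listed above encourage some form of the page-object pattern, where test logic talks to a page abstraction instead of raw selectors. The sketch below illustrates the idea in Python; `LoginPage` and `FakeDriver` are hypothetical names, and the fake driver simply stands in for a real Selenium WebDriver so the example is self-contained.

```python
# Minimal page-object sketch. LoginPage and FakeDriver are hypothetical;
# a real suite would pass in a Selenium WebDriver (or framework equivalent).

class FakeDriver:
    """Stand-in for a real browser driver so this sketch runs anywhere."""
    def __init__(self):
        self.url = None
        self.fields = {}

    def get(self, url):
        # A real driver would navigate the browser; we just record the URL.
        self.url = url

    def type(self, selector, text):
        # A real driver would locate the element and send keys to it.
        self.fields[selector] = text


class LoginPage:
    URL = "https://example.com/login"  # hypothetical URL

    def __init__(self, driver):
        self.driver = driver

    def open(self):
        self.driver.get(self.URL)
        return self

    def log_in(self, user, password):
        # Selectors live in one place, so UI changes touch one class,
        # not every test.
        self.driver.type("#username", user)
        self.driver.type("#password", password)


driver = FakeDriver()
LoginPage(driver).open().log_in("qa", "secret")
print(driver.url)                   # https://example.com/login
print(driver.fields["#username"])   # qa
```

The same structure carries over to Capybara, Nightwatch, Protractor, and the mobile frameworks: tests read as workflows, and element details stay in the page objects.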
I often feel like the DevOps infrastructure problems have to be solved before I can do test automation. Is that true?
Greg: Check out this post about infrastructure planning. It discusses how QA and Dev should share responsibility for infrastructure and for DevOps tasks. The modern QA position has become a technical role - the gatekeeper of quality - and QA engineers will likely continue to take on more DevOps responsibilities and tasks.
Ashley: Every company is different (mine and Greg's included), but I do work more and more closely with our DevOps team as we transition. We definitely still have kinks to work out, and we are still working toward full integration into the CI pipeline, but none of that prevents us from writing meaningful automated tests.
How do you get buy-in from developers for automated GUI testing?
Ashley: Once you have the right technology alignment, demonstrate how and why the tests are used. We want fewer GUI tests overall, but we keep roughly 40 tests covering critical workflows that we always want passing, so we can quickly identify when something breaks in the UI. Show that you have deterministic results. For example, we were able to quickly identify which commit broke our tests and discuss it with the developer. Without these tests, that bug would not have been caught for two more weeks. Because it surfaced during the development period, overhead was low and we got a fix in within a few hours.
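One way to keep a small, always-passing set of critical-workflow tests alongside a larger regression suite is to tag them and load only the tagged ones for the gating run. The sketch below shows one possible approach with Python's `unittest`; the `critical` decorator, `CheckoutTests`, and `load_critical` are hypothetical names, and the test bodies are placeholders.

```python
# Sketch: separate a small "critical workflow" suite (like Ashley's ~40
# always-green tests) from the full regression run. Names are hypothetical.
import unittest

CRITICAL = "critical"

def critical(test_method):
    """Mark a test method as part of the critical-workflow suite."""
    test_method._tag = CRITICAL
    return test_method

class CheckoutTests(unittest.TestCase):
    @critical
    def test_user_can_check_out(self):
        self.assertTrue(True)  # placeholder for a real workflow check

    def test_obscure_edge_case(self):
        self.assertTrue(True)  # regression-only; not in the gating run

def load_critical(case):
    """Build a suite containing only tests tagged as critical."""
    suite = unittest.TestSuite()
    for name in unittest.defaultTestLoader.getTestCaseNames(case):
        if getattr(getattr(case, name), "_tag", None) == CRITICAL:
            suite.addTest(case(name))
    return suite

result = unittest.TextTestRunner(verbosity=0).run(load_critical(CheckoutTests))
print(result.testsRun)  # 1: only the critical test ran
```

Frameworks like pytest or RSpec offer built-in tagging for this; the point is that the critical suite stays small enough to run on every commit and to stay deterministic.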
Greg: I agree with Ashley. The best buy-in from developers, for me, has come from technology alignment. Now the developers can help write and review test code. The key to automated GUI testing is a reliable process for developing the tests. Work as a team to determine which GUI tests are needed and what best practices for test code look like, and keep focusing on eliminating flaky tests and building confidence in the test results.
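A common interim tactic while eliminating flaky tests is to rerun intermittent failures while recording every attempt, so flakiness is surfaced in reports rather than silently hidden. Here is a minimal Python sketch of that idea; the `retry` decorator and `flaky_check` function are hypothetical, and the failure is simulated.

```python
# Sketch: rerun an intermittently failing check, but keep a record of
# each attempt so flaky tests can be tracked down and fixed, not ignored.
import functools

def retry(times=3):
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            attempts = []
            for attempt in range(1, times + 1):
                try:
                    result = fn(*args, **kwargs)
                    attempts.append((attempt, "pass"))
                    return result, attempts  # success: return the history too
                except AssertionError:
                    attempts.append((attempt, "fail"))
            raise AssertionError(f"failed all {times} attempts: {attempts}")
        return wrapper
    return decorator

calls = {"n": 0}

@retry(times=3)
def flaky_check():
    calls["n"] += 1
    # Simulated intermittent failure: fails on the first attempt only.
    assert calls["n"] >= 2, "simulated intermittent failure"

_, history = flaky_check()
print(history)  # [(1, 'fail'), (2, 'pass')]
```

Any test that ever needs a retry is a candidate for investigation; the recorded history gives the team the data to prioritize which flaky tests to fix first.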
How do you measure the health of a project?
Greg: We use Jenkins, Datadog, and CloudWatch to measure the health of the Android project, determine whether it is on track for success, and identify where improvements need to be made to meet our goals and deadlines. It's on our roadmap to explore Capital One's open source Hygieia and New Relic's synthetic monitoring.
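The health metrics Greg describes often boil down to simple aggregates over recent CI results, such as build pass rate, charted on a dashboard with an alert threshold. The sketch below shows the shape of that calculation in Python; the build results and the 90% threshold are made-up illustrative values, not Gannett's actual numbers.

```python
# Sketch: the kind of pass-rate metric a Datadog/CloudWatch dashboard might
# chart. The build results and alert threshold below are illustrative only.
from collections import Counter

# Last ten CI build outcomes (made-up data).
builds = ["pass", "pass", "fail", "pass", "pass",
          "pass", "fail", "pass", "pass", "pass"]

counts = Counter(builds)
pass_rate = counts["pass"] / len(builds)
print(f"pass rate: {pass_rate:.0%}")  # pass rate: 80%

# A simple health gate: alert when the rate drops below a chosen threshold.
status = "healthy" if pass_rate >= 0.90 else "needs attention"
print(status)  # needs attention
```

In practice this value would be emitted as a metric (e.g., via a monitoring agent) on every build, so trends and regressions show up on the team's dashboard automatically.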
There’s no shortage of software testing tools out there for both automated and manual testing. Selenium WebDriver remains a staple, but depending on your particular needs, you may want to take advantage of other testing tools, too. A major goal should be to seek technology alignment. That helps ensure that your testing strategy is as efficient as possible, while also facilitating better communication between QA and Development.
Chris Riley (@HoardingInfo) is a technologist who has spent 12 years helping organizations transition from traditional development practices to a modern set of culture, processes and tooling. In addition to being a research analyst, he is an O’Reilly author, regular speaker, and subject matter expert in the areas of DevOps strategy and culture. Chris believes the biggest challenges faced in the tech market are not tools, but rather people and planning.
Ashley Hunsberger is a Quality Architect at Blackboard, Inc. and co-founder of Quality Element. She’s passionate about making an impact in education and loves coaching team members in product and client-focused quality practices. Most recently, she has focused on test strategy implementation and training, development process efficiencies, and preaching Test Driven Development to anyone that will listen. In her downtime, she loves to travel, read, quilt, hike, and spend time with her family.
Greg Sypolt (@gregsypolt) is a senior engineer at Gannett and co-founder of Quality Element. He has spent the last five years focused on the creation and deployment of automated test strategies, frameworks, tools, and platforms.