When you interview for a job on a mobile development team, information about the team's development strategy, future feature plans, technology choices, and quality focus is hiding in plain sight, if you know where to look. You've done your due diligence by reviewing sites like Glassdoor, Monster, LinkedIn, and Fortune, so you have a general idea of the company culture and how the world perceives it. But as a technologist, it is in your best interest to dig deeper and interpret the clues that are readily available about the development culture you hope to join. A friend of mine was recently looking for a position on a mobile team and turned to me for advice, knowing that I manage a mobile QA team. We put on our detective hats and went to work.
Most people looking for work review the job posting and use it as a checklist for their qualifications. Obviously that’s important, but it’s more important to take an additional step and read between the lines. Look at the company’s job postings for the development team more holistically. Check out the other open positions on the team as well as the one you are applying for, and ask yourself:
Are there a lot of positions open in the group? Does this mean attrition or growth?
What technologies are they requiring versus their wish list? Are they actually using the tech in the wish list? Or are they hoping someone can teach them?
Are the required technologies state of the art? Do they suggest you will stay abreast of current tech, or end up mired in older systems?
The more you review job postings, the more questions you will have ready for the interviewer.
Now the fun begins. For research, we downloaded a company's apps on every platform it supported, for both phones and tablets, and explored its websites. This taught us a lot about the company's development strategies and quality focus. We created a checklist that would tell us more, and worked our way through it:
We checked apps for a consistent user experience (UX). Was the look and feel the same? Did the apps look like they came from the same company? What about the flow?
Differences in the UX could be an indicator of platform teams not working with the same designers, or they may be caused by developer talent level.
What bugs can you find?
Coming from a QA background, we tested the apps like we would our own. It’s always fun to find bugs in production code, and we did. On one platform, we discovered issues with filters.
Are there accessibility (AX) problems?
During our detective work, we determined that one company (across apps) didn’t do a very good job testing for AX. As an example, I like to use a larger font, and found that they had a lot of overlapping fields. This can be an indicator that testers are not using the same tests against the apps, or are not doing side-by-side testing with the same data. They may not even be on the same team. The AX issues could be a conscious decision by the team, or an indicator of holes that need to be filled.
Compare the apps to the website. Is there value added? What features do you think should be included in the apps?
Side-by-side comparison testing not only allows you to glean information, but prepares you if you are interviewing with managers across teams. (Make sure to include this in your investigation.)
Next, we checked out apps’ ratings across platforms. If one app has a significantly different rating, it will definitely jump out at you. Ask why. Is it because one platform team is weaker than the other? Are there complicated features in the app that are tougher to support on one platform than the other? You can also look for other clues:
What are the commenters saying? Are you seeing a consistent gripe, or are there problems across the board?
Is the general mood positive or negative? A happy customer makes for a happy development team.
Are people finding the app useful? Does it provide a real benefit over the website?
What is the customers' wish list? Teams often rely on app store feedback to guide future feature releases.
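As a back-of-the-envelope way to spot rating outliers, you can jot the numbers you collected into a small script and flag large cross-platform gaps. This is a toy sketch: the app names, ratings, and the half-star threshold are all made up for illustration.

```python
# Hypothetical ratings collected by hand from each app store
# (app name -> average star rating per platform).
ratings = {
    "ShopApp": {"ios": 4.6, "android": 3.1},
    "NewsApp": {"ios": 4.2, "android": 4.3},
}

# Assumed cutoff: a gap this wide is worth asking about in the interview.
THRESHOLD = 0.5

for app, by_platform in ratings.items():
    gap = max(by_platform.values()) - min(by_platform.values())
    if gap >= THRESHOLD:
        print(f"{app}: rating gap of {gap:.1f} stars across platforms - ask why")
```

Anything the script flags becomes an interview question: is one platform team weaker, or is a feature simply harder to support on that platform?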
A simple review of the comments on each platform for one app showed a lot of performance problems, but overall people liked the app and found it useful. The performance issues were a good find: they revealed a weakness on the team and pointed to an area where my friend could add value.
Performance complaints led us to release notes. These are a treasure trove of evidence!
For one app from a specific company, we reviewed 12 months of release notes: both platform teams released concurrently, every two months. This indicated the teams' cadence and their ability to release consistently on schedule.
The releases had feature parity. It’s difficult for platform teams to remain in sync with features across sprints. Unless a platform is using a feature switch to hide features on release, one team always seems to be lagging behind the other.
One platform regularly had more bug fixes delivered. (This could be an indicator that the team that might be ahead of the game is larger or more talented.)
Reviewing release notes shows whether a team is dedicated to building new features or stuck in maintenance mode, and whether customer concerns, like the performance issues we found, are being addressed or merely patched again and again.
Now that you have reviewed the evidence, it's time to come to a decision. Review all of the clues. Do they point to a strong development team and process? Are the apps in a state of maintenance? Does the company seem to focus on quality? Your due diligence will allow you to walk into an interview with a trove of intelligent questions, and the confidence that you understand the culture you may be entering. Try this exercise on your own company for practice. Every company leaves behind clues.
Joe Nolan (@JoeSolobx) is a Mobile QA Team Manager with over 12 years of experience leading globally distributed QA teams, and is the founder of the DC Software QA and Testing Meetup.