Hiring is tough. I think finding a good candidate is sort of like searching for the Holy Grail. I'm reminded of the final scene of Indiana Jones and the Last Crusade. "Choose wisely,” the knight warns, “for while the true Grail will bring you life, the false Grail will take it from you."
Oh, so no pressure then.
Wise words. Bad hires can be disastrous. So, the pithy version is indeed “Choose wisely." The long version requires some thought.
Before starting, let's take a moment to recognize that even the most analytical, data-obsessive companies still routinely mess it up. Hiring is tough.
I'll try to avoid a "10 ways to make a great hire!" post. Instead, what follows are elements that help you make great testing hires. Mix and match as you please.
Let's start at the beginning. Does a candidate deserve face time? Let's get rid of those folks who really have no business testing. A great filter for this is asking a FizzBuzz question over the phone.
What's FizzBuzz? It's a super-simple programming question, usually asked over the phone or in a collaborative editor. The version can be silly and abstract, like “Write a program that prints the first 100 prime numbers.” The purpose of FizzBuzz is to get a demonstration of basic competence. We'll assess the more advanced stuff further on in the process.
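For reference, the canonical FizzBuzz asks the candidate to print the numbers 1 through 100, substituting “Fizz” for multiples of 3, “Buzz” for multiples of 5, and “FizzBuzz” for multiples of both. A passing answer is only a few lines; here's one sketch in Python:

```python
def fizzbuzz(n: int) -> str:
    """Return the FizzBuzz word (or the number itself) for a single value."""
    if n % 15 == 0:      # divisible by both 3 and 5
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

if __name__ == "__main__":
    # The classic exercise: print the sequence for 1 through 100.
    for i in range(1, 101):
        print(fizzbuzz(i))
```

The point isn't elegance; any candidate who can produce something like this unassisted has cleared the bar the question is meant to set.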
Okay, but what if we're hiring a manual tester? For that, I like to ask questions about popular websites. Imagine we're interviewing Sue for a job at Yangtze.com. Yangtze's goal is to sell any product you could possibly dream of. It's a giant Internet warehouse. Eat it, Amazon.com.
If Yangtze.com is in stealth mode, how do we test it? Well, Amazon.com is popular and probably a great proxy for any e-commerce site. My FizzBuzz for Sue is: "Here's a random product page from Amazon.com. How would you test it?"
I'll also time-box Sue's answer. My preference is between 10 and 15 minutes. When I do this, I make sure I give every candidate the same amount of time. It's an important control for bias.
Assuming we like Sue's answers to the above and the other perfunctories ("What interests you about Yangtze.com?"), I'll invite her to the office. For the in-person interview, I want to see Sue at her best. Conversely, I also want to see her struggle. Seeing both ends of the spectrum helps me get a sense of how Sue works normally and how she performs under pressure.
Let's start with a question that shows Sue at her best. I like the classic “Tell me about a recent project that you're proud of." Detail questions like “Why?" and “How?" are important follow-ups. I want to hear about something that stood out in Sue's previous work. Why was that interesting? If it's a generic response like "I checked a sign-up form for error messages, but this one time the engineer forgot to add error messaging," that's a red flag. It's boring, everyday work, and it hints that Sue hasn't done a lot of interesting work.
That doesn’t define Sue, though. She's proud of her work, and shared a story about a sign-up flow with a lot of edge cases. Great! She should be feeling confident. It's time to see her under a bit of stress. Here, I like posing difficult questions or asking for help with an unsolvable problem.
Before I dive in, it's important that I make it crystal clear that this is brainstorming, and it's okay for Sue to ask clarifying questions. If she gets stuck, I'll offer to help with some of the basics. It's also important to define beforehand what I won't help with. Again, I want to keep my personal biases in check.
So, what’s a good difficult question? In testing, metrics are notoriously difficult to assess. I can ask her “Should I measure quality by the number of bugs found?” No, that rewards testers paired on difficult projects (or with more junior engineers). “What about the priority of the bugs found?” Again, testers on more difficult projects will naturally find more of these.
I might also ask Sue how she communicates product or test-suite quality to engineering teams. “What makes a good tester?” “What are good metrics for our specific app?” “Does it change depending on the type of app?” Hopefully, these questions make Sue sweat a little.
The goal is to get detail. Before these questions, I know Sue has the basics of testing covered. Afterwards, I should have a few insights into how she handles complex problems. That's the dream, at least.
Finally, personality is important. My imaginary e-commerce site, Yangtze.com, has a lot of salty engineers. You know the type: curmudgeonly, sardonic, but ultimately lovable programmers. There's no crystal ball that will tell me how Sue will fit in with this crowd. So, I like to be candid about culture, and trust the emotional side of my brain to get a sense for this throughout the interview.
Yangtze.com and Sue were a fun “what if” case for hiring a manual tester. In reality, there are many different types of tester. Worse, most businesses call them by different names. The monikers I'm aware of are:
These are job titles for people who manage quality. Some have engineering talent. Some don't. At the end of the day, what's right for you differs from what's right for me. Companies employ different names and each name is associated with a different list of attributes. Draft your own list and define your own role.
A few attributes I frequently find on quality job listings are:
These mostly make sense. A sloppy tester is a bad tester. Equally important, a gentle touch helps when the core responsibility of your job is telling people their code sucks.
Every job is unique, so let's carve out a few unique interview questions to match. If the interview doesn't assess the traits we've listed with the role, then our job description is basically just pablum.
A bit of trivia here: I'm a reformed math teacher. I find crafting good interview questions similar to creating a good test. The primary thing any test does is assess how well the test taker can exploit the test. This is unavoidable. Hopefully, the second-most-powerful assessment is whatever quality you're actually trying to assess.
Oof — that's an awkward fact. As a teacher, I quickly learned that all quizzes, homework, classwork, and any other kind of assigned work needed to line up with my test format. By presenting work in the same format as the test, I prime students for success.
This strategy helps reduce some of that awkwardness. How good a student is at exploiting tests doesn't matter if I train everyone to exploit the test equally well — or at least, the exploitation should matter less. This results in clearer answers on the “Can you actually do this?" question.
Let's apply that reasoning to interview questions. Designing an interview question should be straightforward. The core tenet is to design questions that are as close to real-world job challenges as possible. Forget the "How would you weigh a truck with a bathroom scale?" brain teasers.
If I'm hiring for a mobile-app company, I'll ask candidates how they might prepare for the next version of iOS. That's a decent question. How can I make it great?
A great question has predefined criteria for success. For my iOS question, I'm specifically looking for non-obvious interactions between new iOS 9 features and our app. I'd also look for which current features need regression testing (and how to prioritize them). Is there a way for the candidate to reduce their testing burden or constrain scope? Great!
It's important to leave time for questions at the end of the interview — I mean questions from the candidate. It's easy to forget that you're being interviewed, too.
My last bit of advice is that fit is an ongoing thing. Someone might be a great fit for your company as it was two years ago, but a poor fit today. Conversely, someone who doesn't fit today might be a great asset a year down the line.
It seems appropriate to close with more advice from The Last Crusade: "You have chosen... wisely. But, beware: the Grail cannot pass beyond the Great Seal, for that is the boundary, and the price, of immortality."
In other words, hiring is tough. But, the rewards are great.