Balancing competing objectives is a recurring theme on this blog - it's the central challenge of all management decisions. Hiring decisions are among the most difficult, and the most critical. The technical interview is at the heart of these challenges when building a product development team, and so I thought it deserved an entire post on its own.
In this post I'll follow what seems to be a pattern for me: lay out a theory of what characterizes a good interview, and then talk practically about how to conduct one.
When I train someone to participate in a technical interview, the primary topic is what we're looking for in a good candidate. I have spent so much time trying to explain these attributes that I even have a gimmicky mnemonic for remembering them. The six key attributes spell ABCDEF:
- Agility. By far the most important thing you want to hire for in a startup is the ability to handle the unexpected. Most normal people have a fairly narrow comfort zone, where they excel in their trained specialty. Those people also tend to go crazy in a startup. Now, we're not looking for people who thrive on chaos or, worse, cause it. We want someone who is a strong lateral thinker, who can apply what they've learned to new situations, and who can un-learn skills that were useful in a different context but are lethal in a new one. When talking about their past experience, candidates with agility will know why they did what they did in a given situation. Beware anyone who talks too much about "best practices" - if they believe that there are practices ideally suited to all situations, they may lack adaptability.
To probe for agility, you have to ask the candidate questions about something they know little about.
- Brains. There's no getting around the fact that at least part of what you should screen for is raw intelligence. Smart people tend to want to work with smart people, so it's become almost a cliche that you want to keep the bar as high as you can for as long as you can. Microsoft famously uses brainteasers and puzzles as a sort of quasi-IQ test, but I find this technique difficult to train people in and apply consistently. I much prefer a hands-on problem-solving exercise in a discipline related to the job they are applying for. For software engineers, I think this absolutely has to be a programming problem solved on a whiteboard. You learn so much about how someone thinks by looking at code you know they've written that it's worth all the inconvenience of having to write, analyze, and debug it by hand.
I prefer to test this with a question about the fundamentals. The best candidates have managed to teach me something about a topic I thought I already knew a lot about.
- Communication. The "lone wolf" superstar is usually a disaster in a team context, and startups are all about teams. We have to find candidates who can engage in dialog, learning from the people around them and helping find solutions to tricky problems.
Everything you do in an interview will tell you something about how the candidate communicates. To probe this deeply, ask them a question in their area of expertise. See if they can explain complex concepts to a novice. If they can't, how is the company going to benefit from their brilliance?
- Drive. I have been burned most by hiring candidates who had incredible talents but lacked the passion to actually bring them to work every day. You need to ask: 1) does the person care about what they work on? and 2) can they get excited about what your company does? For a marketing job, for example, it's reasonable to expect that a candidate will have done their homework and used your product (maybe even talked to your customers) before coming in. I have found this quite rare in engineers. At IMVU, most of them thought our product was ridiculous at best, hopeless at worst. That's fine for the start of their interview process. But if we haven't managed to get them fired up about our company mission by the end of the day, it's unlikely they are going to make a meaningful contribution.
To test for drive, ask about something extreme, like a past failure or a peak experience. They should be able to tell a good story about what went wrong and why.
Alternatively, ask about something controversial. I remember once being asked in a Microsoft group interview (and dinner) about the ActiveX security model. At the time, I was a die-hard Java zealot. I remember answering "What security model?" and going into a long diatribe about how insecure the ActiveX architecture was compared to Java's pristine sandbox. At first, I thought I was doing well. Later, the other candidates at the table were aghast - didn't I know who I was talking to?! Turns out, I had been lecturing the creator of the ActiveX security model. He was perfectly polite, not defensive at all, which was why I had no idea what was going on. Then I thought I was toast. Later, I got the job. Turns out, he didn't care that I disagreed with him, only that I had an opinion and wasn't afraid to defend it. Much later, I realized another thing. He wasn't defensive because, as it turns out, he was right and I was completely wrong (Java's sandbox model looked good on paper, but its restrictions greatly retarded its adoption by actual developers).
- Empathy. Just as you need to know a candidate's IQ, you also have to know their EQ. Many of us engineers are strong introverts without fantastic people skills. That's OK; we're not trying to hire a therapist. Still, a startup product development team is a service organization. We're there to serve customers directly, as well as all of the other functions of the company. This is impossible if our technologists consider the other types of people in the company idiots, and treat them that way. I have sometimes seen technical teams that have their own "cave" that others are afraid to enter. That makes cross-functional teamwork nearly impossible.
To test for empathy, I always make sure that engineers have one or two interviews with people of wildly different backgrounds, like a member of our production art department. If they can treat them with respect, it's that much less likely we'll wind up with a siloed organization.
- Fit. The last and most elusive quality is how well the candidate fits in with the team you're hiring them into. I hear a lot of talk about fit, but also a lot of misunderstandings. Fit can wind up being an excuse for homogeneity, which is lethal. When everyone in the room thinks the same way and has the same background, teams tend to drink the proverbial Kool-Aid. The best teams have just the right balance of common background and diverse opinions, something I have found true in my experience and seen repeatedly validated in social science research (you can read a decent summary in The Wisdom of Crowds).
This responsibility falls squarely to the hiring manager. You need to have a point of view about how to put together a coherent team, and how a potential candidate fits into that plan. Does the candidate have enough of a common language with the existing team (and with you) that you'll be able to learn from each other? Do they have a background that provides some novel approaches? Does their personality bring something new?
My technique is to structure a technical interview around an in-depth programming and problem-solving exercise. If it doesn't require a whiteboard, it doesn't count. You can use a new question each time, but I prefer to stick with a small number of questions that you can really get to know well. Over time, it becomes easier to calibrate a good answer if you've seen many people attempt it.
For the past couple of years I've used a question that I once was asked in an interview, in which you have the candidate produce an algorithm for drawing a circle on a pixel grid. As they optimize their solution, they eventually wind up deriving Bresenham's circle algorithm. I don't mind revealing that this is the question I ask, because knowing that ahead of time, or knowing the algorithm itself, confers no advantage to potential candidates.
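For the curious, here's roughly where the exercise ends up: a minimal, integer-only midpoint circle sketch of my own. The `plot` helper is a made-up stand-in for whatever display the hypothetical device has; the point of the interview is the path to something like this, not the code itself.

```cpp
#include <cstdio>

// Stand-in for whatever display the hypothetical device has.
static void plot(int x, int y) {
    std::printf("(%d, %d)\n", x, y);
}

// Integer-only midpoint circle: walk one octant and mirror each point
// into the other seven. No floating point, no square roots.
void draw_circle(int cx, int cy, int r) {
    int x = r, y = 0;
    int err = 1 - r;                 // decision variable for the midpoint test
    while (x >= y) {
        plot(cx + x, cy + y); plot(cx - x, cy + y);
        plot(cx + x, cy - y); plot(cx - x, cy - y);
        plot(cx + y, cy + x); plot(cx - y, cy + x);
        plot(cx + y, cy - x); plot(cx - y, cy - x);
        ++y;
        if (err < 0) {
            err += 2 * y + 1;        // midpoint inside the circle: keep x
        } else {
            --x;
            err += 2 * (y - x) + 1;  // midpoint outside: step x inward
        }
    }
}
```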
Knowing the answer ahead of time confers no advantage because I'm not interviewing for the right answer to the questions I ask. Instead, I want to see how the candidate thinks on their feet, and whether they can engage in collaborative problem solving with me. So I always frame interview questions as if we were solving a real-life problem, even if the rules are a little far-fetched. For circle-drawing, I'll sometimes ask candidates to imagine that we are building a portable circle-drawing device with a black-and-white screen and a low-power CPU. Then I'll act as their "product manager," able to answer questions about what customers think, and also as their combined compiler, interactive debugger, and QA tester.
You learn a lot from how interested a candidate is in why they are being asked to solve a particular problem. How do they know when they're done? What kind of solution is good enough? Do they get regular feedback as they go, or do they prefer to think, think, think and then dazzle with the big reveal?
My experience is that candidates who "know" the right answer do substantially worse than candidates who know nothing of the field. That's because they spend so much time trying to remember the final solution, instead of working on the problem together. Those candidates have a tendency to tell others that they know the answer when they only suspect that they do. In a real-world situation, they tend to wind up without credibility or forced to resort to bullying.
No matter what question you're asking, make sure it has sufficient depth that you can ask a lot of follow-ups, but that it has a first iteration that's very simple. An amazing number of candidates cannot follow the instruction to Do the Simplest Thing That Could Possibly Work. Some questions have a natural escalation path (like working through the standard operations on a linked-list) and others require some more creativity.
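To make "natural escalation path" concrete, here's a hypothetical progression for the linked-list version: start with the simplest thing that could possibly work, then raise the stakes one operation at a time.

```cpp
// A bare-bones singly linked list for the whiteboard.
struct Node {
    int value;
    Node* next;
};

// Step 1: the simplest thing that could possibly work - prepend a node.
Node* push_front(Node* head, int value) {
    return new Node{value, head};
}

// Step 2: a natural escalation - reverse the list in place.
Node* reverse(Node* head) {
    Node* prev = nullptr;
    while (head != nullptr) {
        Node* next = head->next;
        head->next = prev;
        prev = head;
        head = next;
    }
    return prev;
}
```

From there you can keep going: deletion, cycle detection, doing it without extra memory, and so on.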
As an example of the more creative kind, I would often ask a candidate to explain to me how the C code they are writing on the whiteboard would be rendered into assembly by the compiler. There is almost no earthly reason that someone should already know this, so candidates answer in a wide variety of ways: some have no idea; others make something up; some have the insight to ask questions like "what kind of processor does this run on?" or "what compiler are we using?"; and some just write the assembly down as if it's a perfectly normal question. Any of these answers can work, and depending on what they choose, it usually makes sense to keep probing along these lines: which operations are the most expensive? What happens if we have a pipelined architecture?
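To illustrate the kind of translation I'm fishing for: the C function below is a hypothetical whiteboard fragment, and the commented assembly is a hand-written, unoptimized x86-64 rendering of my own, not the output of any particular compiler.

```cpp
// Hypothetical whiteboard fragment.
int sum(const int* a, int n) {
    int total = 0;
    for (int i = 0; i < n; ++i)
        total += a[i];
    return total;
}

// A rough, unoptimized x86-64 rendering (System V convention:
// a arrives in rdi, n in esi, the result goes back in eax):
//
//       xor   eax, eax              ; total = 0
//       xor   ecx, ecx              ; i = 0
// loop: cmp   ecx, esi              ; i < n ?
//       jge   done
//       add   eax, [rdi + rcx*4]    ; total += a[i]  (memory load + add)
//       inc   ecx
//       jmp   loop
// done: ret
```

The memory load and the branch are usually where the follow-up questions about cost and pipelining end up.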
Eventually, either the candidate just doesn't know, or they wind up teaching you something new. Either way, you'll learn something important. There are varying degrees of not-knowing, too.
- Doesn't know, but can figure it out. When you start to probe the edges of someone's real skills, they will start to say "I don't know" and then proceed to reason out the answer, if you give them time. This is usually what you get when you ask about big-O notation, for instance. They learned about it some time ago, don't remember all the specifics, but have a decent intuition that n-squared is worse than log-n.
- Doesn't know, but can deduce it given the key principles. Most people, for example, don't know exactly how your typical C++ compiler lays out objects in memory. But that's usually because most people don't know anything about how compilers work, or how objects work in C++. If you fill them in on the basic rules (see the sketch after this list), can they reason with them? Can those insights change the code you're trying to get them to write?
- Doesn't understand the question. Most questions require a surprising amount of context to answer. It doesn't do you any good to beat someone up by forcing them through terrain that's too far afield from their actual area of expertise. For example, I would often work the circle-drawing question with candidates who had only ever programmed in a web-based scripting language like PHP. Some of them could roll with the punches and still figure out the algorithmic aspects of the answer. But it was normally useless to probe into the inner workings of the CPU, because it wasn't something they knew about, and it can't really be taught in less than a few hours. You might decide that this knowledge is critical for the job you're hiring for, and that's fine. But it's disrespectful and inefficient to waste the candidate's time. Move on.
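Here's the kind of "basic rules" sketch I have in mind for the object-layout question above. The sizes assume a typical 64-bit compiler with a vtable-based implementation, which the C++ standard doesn't actually mandate; the classes themselves are invented for illustration.

```cpp
#include <cstdio>

struct Plain {            // no virtual functions: just members plus padding
    char tag;             // 1 byte, then 3 bytes of padding so count stays aligned
    int  count;           // typically lands at offset 4
};

struct Shape {            // a virtual function adds a hidden vtable pointer
    virtual double area() const { return 0.0; }
    int id;
};

struct Circle : Shape {   // a derived class appends its members after the base
    double radius;
    double area() const override { return 3.14159 * radius * radius; }
};

int main() {
    // Typical (implementation-defined!) sizes on a 64-bit compiler:
    // Plain = 8, Shape = 16 (vptr + int + padding), Circle = 24.
    std::printf("%zu %zu %zu\n", sizeof(Plain), sizeof(Shape), sizeof(Circle));
    return 0;
}
```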
Let me return to my topic at the top of the post: using the interview to emphasize values as well as to evaluate. The best interviews involve both the interviewer and the candidate learning something they didn't know before. Making clear that your startup doesn't have all the answers, but that your whole team pushes their abilities to their limits to find them, is a pretty compelling pitch. Best of all, it's something you just can't fake. If you go into an interview with the intention of lording your knowledge over a candidate, showing them how smart you are, they can tell. And if you ask questions but don't really listen to the answers, it's all too obvious. Instead, dive deep into a problem and, together, wrestle the solution to the ground.