How to Run a Tech Interview Without the Hazing

by Arjun Chandar, March 29th, 2022

Too Long; Didn't Read

Jason Truesdell, Principal Software Architect at IndustrialML, has conducted countless interviews and believes there is a way to fix tech interviews for good. Success or failure in the interview is often not well-correlated to success in the actual role, he says. Curiosity, tenacity, and persistence are better predictors of success in most teams he's worked in, rather than whiteboard coding exercises, credentials, or confidence. Interviews are often more capricious than they appear at first glance and can easily filter out excellent engineers.


Tech job interviews are awful. We all know it, but many of us don’t want to admit it. They’re complex beasts because of the many facets that need to be considered to find the perfect candidate: coding exercises, take-home projects, and technical questions are all commonplace. However, we are making them worse than they need to be.

Candidates often have to fight their way through the process, facing questions and problems that they have either never encountered before or are unlikely to ever see in a real-life situation. Given that 61% of HR professionals report that finding qualified developers is their biggest recruitment challenge, why are we stacking the interview against our candidates?

Jason Truesdell, Principal Software Architect at IndustrialML, has conducted countless interviews and believes there is a way to fix tech interviews for good. That way, companies can attract better candidates without making them jump through unnecessary hoops. 

I conducted my own interview with him to find out more:

You aren't a big fan of the typical tech interview process. Why not?

This is mainly because success or failure in the interview is often not well-correlated to success in the actual role. Objective-sounding criteria are often more capricious than they appear at first glance and can easily filter out excellent engineers, indicating who did the best interview prep more than anything else.

How does the conversation in these tech interviews normally go in your experience? 

In many mature industries, job interviews are business conversations, a mutual discussion to discover the ways the employer's needs and the interviewee's experience and goals converge. When I've done software consulting, the conversations that lead to contracts are usually like that too. 

However, when it comes to full-time and agency temp software jobs, interviews are more like hazing rituals: alienating experiences that vaguely feel like alpha-geek peacocking displays, built on an implicit assumption that a bunch of people are faking their way through the industry and must be exposed. Curiosity, tenacity, and persistence have been better predictors of success in most teams I've worked in or hired for than whiteboard coding exercises, credentials, or confidence.

So I have optimized my interview process to identify those attributes. I also try to steer the conversation to the actual work we're doing and our business objectives, not a bunch of abstract problems. A good candidate will recognize the overlap between things they've worked on in the past and the work at hand, and talk about how that context could help them make sense of the work in your organization.

What are you trying to find out with your interview questions?

I try to be as open-ended as possible while trying to get a sense of what makes a person tick. I mostly want candidates to talk about work they're proud of, especially when it wasn't a straight line to get it done. 

Talking about both the challenges and the easy bits, the missteps and the flashes of insight, the hallway conversations with coworkers, the research and technical spike tangents they might have taken, will give me a good sense of both their approach to problems and their experience. 

Most importantly, it also demonstrates communication skills. 

A person who can explain how something works will almost certainly have the capacity to do it. I can ask probing questions to improve my understanding of their solution, and how they answer those demonstrates their proficiency.

What kind of questions should one ask?

In terms of technical questions, I might ask:

"Walk me through the approach you took to solve a challenging technical problem."

"What's your biggest frustration about your favorite programming language or framework?"

"Tell me about a time that you had to build out a feature whose requirements were still ambiguous or incompletely specified."

"Is there a skill you'd like to improve right now? How would you go about learning and developing it?"

Then to find out about their approach to the everyday negotiations that are part of team software development:

"Tell me about a time you had to convince your colleagues to go a particular direction they were resistant to."

"What did you do to convince them?"

"Did it work?"

"What would you do differently if faced with the same situation again?"

Do you have any interesting tricks to uncover a candidate’s knowledge on a particular topic?

I've got experience in a lot of different programming languages and frameworks, so occasionally I'll ask a candidate if they've ever experienced a problem that I've encountered in the past with the tooling they're most comfortable in, and then I'll probe them about their approach to solving those issues. I'm not going to ask trick questions or anything; in the average case we can commiserate a bit, and sometimes I might even learn something.

You brought up the idea of the “everyday negotiations” of a software team. What traits are you trying to identify with those questions?

In the most effective teams I've worked in, people develop a theory of mind for their coworkers and try to explain things in terms of what that person thinks is crucial. It's pretty important to me to find people who are comfortable making a case for consequences rather than preferences. 

I need people to think in terms of the project as it exists and what its goals are, rooted in an understanding of the team's dynamics. It's hard to have a real discussion with people who mostly appeal to external authorities or lean on hand-wavy phrases like "best practices".

It’s not foolproof, but asking questions about how someone has tried to sell their team on an idea, or even asking them to sell me on a specific architectural or tooling change, can shine a light on their approach to collaboration.

How do you assess someone’s software design skills?

I’m basically trying to gauge their ability to take an abstract problem and turn it into a defensible software design. To do that, I might ask a question about a domain closely related to the things I am actually working on.

When I worked on a product with shipping logistics as a subdomain, I gave candidates a scenario: how would they model a system for retrieving schedules and pricing for air couriers, ground-based freight, and private couriers? I led with examples of some of the cases the system would need to handle.

What I'm really assessing in questions like this is the ability to listen, reflect on the problem, maybe quickly sketch a defensible solution, and then explain it. They could do that either by relying on the language of commonly known design patterns or by painting a reasonably clear picture even if they didn't know the names of those patterns. I want them to ask questions about aspects of the problem statement they find ambiguous and hopefully demonstrate at least an adequate degree of curiosity for the problems my team has to work on.
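
To make that courier scenario a little more concrete, here is a rough sketch of the kind of starting point a candidate might describe. The class names and fields below are illustrative assumptions only, not the design of any system Jason or his team actually built:

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum
from typing import Protocol


class CarrierType(Enum):
    AIR_COURIER = "air_courier"
    GROUND_FREIGHT = "ground_freight"
    PRIVATE_COURIER = "private_courier"


@dataclass
class Shipment:
    origin: str
    destination: str
    weight_kg: float
    ready_at: datetime


@dataclass
class Quote:
    carrier_type: CarrierType
    carrier_name: str
    pickup: datetime
    delivery_estimate: datetime
    price_usd: float


class CarrierGateway(Protocol):
    """Each carrier integration (air, ground freight, private courier)
    implements the same narrow interface, so the scheduling and pricing
    logic never depends on any specific carrier's API."""

    def quotes_for(self, shipment: Shipment) -> list[Quote]:
        ...


class QuoteService:
    """Fans a shipment out to every registered carrier gateway and
    returns all available quotes, cheapest first."""

    def __init__(self, gateways: list[CarrierGateway]):
        self.gateways = gateways

    def quote(self, shipment: Shipment) -> list[Quote]:
        quotes: list[Quote] = []
        for gateway in self.gateways:
            quotes.extend(gateway.quotes_for(shipment))
        return sorted(quotes, key=lambda q: q.price_usd)
```

The specific classes matter less than the shape of the answer: a narrow carrier interface like this lets the candidate explain why adding a new courier type shouldn't disturb the rest of the scheduling and pricing logic.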

Do you have any examples of your own experience of being the interviewee and what it uncovered?

On one occasion, when being interviewed, I was asked how I'd design a system that had to communicate with external systems that limited direct network access but needed to transfer large amounts of data on demand. I asked follow-up questions about the nature of the network boundaries and described two options: a "they call us" push model or a "we regularly check in with them" pull model. The interviewer was basically looking for an awareness of what the options were; as we drilled down on the details, it turned out they mapped neatly to two subsystems that the company actually uses. Although I didn't know all of the constraints they had to solve, I presented a defensible set of options that demonstrated I was listening and had some idea of what the realistic options were.
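
As a rough illustration of those two options, a push-style receiver and a pull-style fetcher might be sketched like this; the interfaces are hypothetical and deliberately simplified, not the subsystems that company built:

```python
from typing import Callable, Iterable, Protocol


class DataSink(Protocol):
    """Wherever the transferred data ultimately lands (storage, queue, etc.)."""

    def receive(self, chunk: bytes) -> None:
        ...


class PushReceiver:
    """'They call us': the external system initiates the transfer whenever
    it has data ready, and we only expose an endpoint to accept it."""

    def __init__(self, sink: DataSink):
        self.sink = sink

    def handle_incoming(self, chunk: bytes) -> None:
        # Invoked by whatever server layer terminates the external call.
        self.sink.receive(chunk)


class PullFetcher:
    """'We regularly check in with them': we poll the external system on a
    schedule and fetch whatever has accumulated since the last check."""

    def __init__(self, fetch_ready_chunks: Callable[[], Iterable[bytes]], sink: DataSink):
        self.fetch_ready_chunks = fetch_ready_chunks
        self.sink = sink

    def poll_once(self) -> None:
        for chunk in self.fetch_ready_chunks():
            self.sink.receive(chunk)
```

The usual trade-off applies: the push model means exposing and securing an inbound endpoint, while the pull model accepts some polling latency in exchange for keeping the connection direction under your own control.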

Listening and communication skills are sufficiently predictive of application design skills in my experience that I don't need to ask candidates to write code. Ultimately, software design is an act of communication. I might write code on my own, but it's rarely an act done in isolation, as someone almost always needs to integrate with it.