In the not too distant past, I found myself hiring some of the core engineers who would drive a fledgling telehealth startup towards what I hoped would be a glorious and successful future. Since the founding team of a company is so critical to its long-term success, I based my approach on the collective wisdom of the Internet (what could go wrong?) and, of course, looked towards the FAANG interview process, especially Google's. I was the first full-time engineer hired at the video interviewing pioneer HireVue, and I have a decade of experience in the HR and interviewing technology space. That experience made me a firm believer in the structured interview: line up a set of candidates, give them the same questions in the same structured fashion, and look for genius in their answers to routine questions. What could be fairer and more organized than evaluating candidates one by one, question by question, rating each, and simply hiring the highest rated?

I hired a senior-level engineer who absolutely aced the canned online coding challenges we gave him, along with every other stage of our highly designed and structured interview process. I was excited and convinced that he would be a solid contributor to our team, given his extensive technical knowledge and demonstrated ability.

The first signs of trouble came when he had issues just checking out our codebase. He then spent a week tailoring his IDE and development tools before he could start working on the project. As a fan of highly tailored vim and shell configurations and environments, I chalked this up to quirky perfectionism.

The real revelation came when I went to pair-program with him on his first story: a fairly basic time zone problem. That's when I noticed it seemed to take him ages to type out a single line of code. He would get lost navigating around the project and was confounded by reading other people's code. After just twenty minutes of sitting side by side with him, it was clear that while he was extremely savvy with the academic subject of programming, he was not savvy with the act of building software. The skills I had tested for might have made him a perfect fit for some kind of deep engineering research project, but it was clear he wasn't the right fit for us. I lament that decision to this day: given our early stage, I was forced to let him go, and I feel my poor vetting had real consequences for him.

Even after following all of the best recommendations and practices of the time, how could the result end up so far off my intended mark? Deep inside I had always suspected coding interviews were in large part bullshit; haven't we all?

How could I have designed the interview better to discern practical skills rather than purely academic ones? It's something I asked myself a lot in the following years, and something I have only recently begun to understand by interviewing entirely remotely through the pandemic using CoScreen.

Developer interviews: the status quo

Much has been written on the current state of developer interviews. When you think of software engineering interviews, many things come to mind: the much-maligned FizzBuzz coding challenge (which determines whether a candidate knows the modulo operator exists), obscure pot-shot domain-specific questions asked during a classic whiteboard interview (bipartite graphs were a favorite at HireVue), and implementations of basic data structures like red-black trees. Everything now seems to boil down to coding challenges, where developers are asked to work through an array of canned HackerRank-style exercises and produce some code to solve an eclectic walled-garden problem. Either that, or it's a battery of in-person whiteboard interviews where candidates answer a barrage of fanciful questions, often the same questions over and over, which seem to be more a test of basic human endurance, willpower, and mental stability than of any ability in the actual field of software engineering. It has gotten to the point where some in the industry have designed multi-stage interviews that last for weeks.

Some of these interviews are more useful to interviewees as a way of detecting internal dysfunction within the organizations that purvey these gauntlets as a means of hiring; I once left a job interview after a project manager flat-out said he hated his job, hated the interview process, didn't respect his boss, wanted to quit, and that the organization he worked for was completely dysfunctional. When you make your candidates run these gauntlets, you're making your employees enforce, guard, promote, and purvey this human misery. Eventually, it starts to affect them too.
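
For anyone who hasn't seen it, FizzBuzz really is that small. A minimal sketch in Python, just to show how little the exercise actually probes beyond the modulo operator:

```python
# FizzBuzz: print 1..100, replacing multiples of 3 with "Fizz",
# multiples of 5 with "Buzz", and multiples of both with "FizzBuzz".
for i in range(1, 101):
    if i % 15 == 0:
        print("FizzBuzz")
    elif i % 3 == 0:
        print("Fizz")
    elif i % 5 == 0:
        print("Buzz")
    else:
        print(i)
```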

Everyone seems to have an opinion about the one thing that will identify a developer who is going to be a solid contributor to the organization, versus one who won't pull their weight or, in the worst case, will actively slow the team down.

Throughout my career, this has led engineering managers to devise increasingly bizarre programming challenges and interview methods. Everyone who has done coding interviews knows the ultimate truth about the matter: you really can't divine much from one of these standard programming interviews.
Even if a developer aces the company's curated HackerRank challenges, they could be unmotivated, slow, or lacking skills in architecture, research, debugging, deployment, or reading other people's code (to name just a few), and simply well primed and trained on interview-style challenges.

I have seen incredibly talented, hard-working engineers fall apart in whiteboard interviews because the skills needed to reason about a brain-teaser on a whiteboard don't exactly align with the skills they have used for years to write actual software and solve real-world problems. I have also seen developers who absolutely ace these canned interview questions turn out to be lazy, unmotivated, unable to work on real-world problems, and ultimately incompetent.

I would say the rate of finding a good engineer versus a bad one using traditional interview techniques has been no better than a coin flip, and I would wager most technical interviewers would tell you the same thing. Given the sheer amount of money and time being spent on this, it's a pretty sad state of affairs. The truth is that doing the job itself is the only thing that actually proves whether your hiring intuition was correct or not.

What are bad hires costing my company?

How much money is your organization wasting year after year on bad software engineering hires? How many capable engineers who would have been stellar contributors is your process inappropriately weeding out?

Engineering interviews themselves take up time your developers could be spending improving or maintaining your product, a major hidden cost sink (~$22k per hire) for the interview process alone. The truth is that there is an enormous long-term cost to onboarding the wrong people into your organization.
If you onboard a single bad software engineer, the effects of the chaos they introduce into the system will often take years to fully work their way through your codebase.
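
To make that order of magnitude concrete, here is an illustrative back-of-envelope calculation. Every number in it is an assumption chosen for the sake of the example, not the source of the ~$22k figure, but it shows how quickly engineer-hours add up before recruiter time, scheduling, and take-home review are even counted:

```python
# Back-of-envelope estimate of engineering time spent per accepted hire.
# All figures are illustrative assumptions, not measured data.
candidates_per_hire = 10        # candidates reaching technical rounds per accepted offer
interviewers_per_candidate = 5  # engineers involved in each candidate's loop
hours_per_interviewer = 3       # prep + interview + debrief, per interviewer
loaded_hourly_cost = 150        # fully loaded cost of one engineer-hour, in dollars

engineer_hours = candidates_per_hire * interviewers_per_candidate * hours_per_interviewer
cost_per_hire = engineer_hours * loaded_hourly_cost
print(f"{engineer_hours} engineer-hours, roughly ${cost_per_hire:,} per hire")
# -> 150 engineer-hours, roughly $22,500 per hire
```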

Another unfortunate truth is that bad software hires can sit far on the left side of the Dunning-Kruger curve, in that they will make quick and bold architecture decisions. Sometimes this looks like fast progress, especially to those who can only see the tip of the technical iceberg. They will often produce quick, thoughtless changes to the existing codebase, spend a lot of time criticizing code and components they haven't taken the time to understand, and make large, showy contributions that look great from the outside but are prototype quality, inefficient, poorly architected, or poorly integrated with the rest of the codebase.

The truth is that the veneer of this progress often doesn't immediately start peeling off or falling apart. These systems frequently have maintainers and long-term senior engineers who put in the extra effort to refine and integrate these components behind the scenes and silently patch over the issues. In other scenarios, some of these engineers may instead recognize their inability to contribute and try to "fly under the radar," making minimal or easy contributions where possible while avoiding any of the actually challenging work.
Ultimately, an organization that doesn't deal with this situation risks the core members of its engineering team deciding it just isn't worth working for a place that won't address the issues that continually force them to work harder and clean up pointless messes.

It's a problem that can eventually erode an engineering organization to the point where the only people who remain are these mediocre contributors. Price's Law roughly states that 50% of the work is done by the square root of the total number of people participating in that work. The eventual effect is that if an organization doesn't fight to keep its high performers, it will be left with only low performers.
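
To see why this compounds as a team grows, here is a quick sketch of the square-root rule from Price's Law:

```python
import math

# Price's Law: roughly half the work is done by the square root of the
# number of participants. As the team grows, that productive core becomes
# a smaller and smaller fraction of the whole organization.
for n in (10, 100, 1000):
    core = math.sqrt(n)
    print(f"{n:>4} engineers: ~{core:.0f} do half the work ({100 * core / n:.0f}% of the team)")
```

With 100 engineers, around 10 people are doing half the work; at 1,000, it's only about 32. Losing even a few of those core contributors hurts far more than the headcount suggests.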

How do I interview more effectively?

Now that the costs of continuing with the status quo are obvious, how do you find top-quality engineers and make your interview process produce better results than a coin flip? I think it comes down to getting to the moment I described above with my own bad hire sooner rather than later: pair programming with the candidate on a real problem, using their tools, in their environment, in their comfort zone.

Probably one of the best recent techniques, and one that at least moves in this general direction, has been to ask candidates to complete an actual project of some kind.

Why is this a good technique?

It comes down to several serious advantages:

  • It is holistic, in that it actually verifies the developer can create and build a piece of software from scratch.
  • It tests that the developer can appropriately architect a project.
  • It gives the developer license to be creative and solve the problem in their own way, using their own tools.
  • If the problem is well chosen, it can test a multitude of actual development skills and abilities, from repository management to API research and usage to algorithms, and it can even call for a demonstration of something directly relevant to the software the company writes.
  • It allows the developer to work on the problem in their own time, in their own way.

Why are these things important? Most of the best software engineers are gainfully employed. Are you filtering out your best potential hires by making the interview unappealing and disrespectful to them? Dedicating time to coming to an interview, or even brushing up on tangential coding-challenge skills just so they can perform as well as a fresh-out-of-boot-camp engineer on your algorithmic challenge, is a considerable time-suck.

Doing a project is a time-suck too if you make it too demanding, but at least candidates aren't wasting their time brushing up on skills that don't help them at all in their actual day-to-day work. Through a whole project, they can demonstrate their competence in a multitude of areas, not just their ability to solve brain-teasers or show a high-school-computer-science level of understanding of basic data structures. Accomplished developers are very proficient with their own tools. You wouldn't hire a mechanic based on the brand of their wrench or toolset, so why are we testing developers almost exclusively in these bullshit environments, on bullshit problems, with bullshit tools? Why are we testing them on a whiteboard with an eraser, or in some web-based approximation of an IDE?
Interviewing with a project lets them solve the problem using the host of knowledge and tools they have assembled over their career, and it allows you to focus on what matters: the code.

You can have a competent developer on your staff look over their project and their code, profile it, read their documentation, and run their tests, and arrive at a much better than coin-toss prediction of whether the engineer who produced that work can contribute to your company. Do this instead of wasting your best developers' time evaluating canned coding challenges that don't even help you make good hiring decisions. In many ways, it also eliminates the implicit bias of some of these interviews and makes the process entirely about the actual performance of the core job function.

So what are the downsides of these types of interview challenges?

Ultimately, they can be easily faked or gamed. Your staff member will accurately assess the ability of the engineer who produced the project, but maybe that person isn't the person you're actually hiring. Candidates can get help from more talented peers, or have them write the project entirely for them. They can mutate and forge their git commit history to foster the impression of a competent workflow, or make it seem as if a simple problem that took them days of research took only minutes. A project also can't capture the gut intuition a developer gets from working alongside another experienced engineer, an intuition that can't be quantified but is extremely effective at determining competence.

It comes down to a problem of trust, and ultimately there will always need to be a live component to test for soft skills, team fit, and other abilities that aren't coding-related. Many tools exist for running live, in-person interviews, but very few exist for doing live coding interviews. Most are based around online IDEs and thus offer few of the benefits of letting developers use their own tools and solve problems with their core abilities in their own environment. What would really cut to the core of the issue is if you could fast-forward to that moment in real time, immediately, with the actual engineer, on their home court.

This is why CoScreen is so valuable for running an effective developer interview. It is immediately available just by sharing a link with a candidate. It doesn't ask your candidate to share anything that might introduce implicit bias or violate their privacy: you won't see their desktop background, compromising desktop icons, or photos of them, their friends, partner, or family. You can join a session and share notes, web pages, a presentation, and problems directly with them, using any piece of software you see fit. They can then share their IDE, tools, terminal, and documentation as they research.

They can give you a live, custom presentation of their actual workflow in real time. You can driver/navigator pair-program a small, real project with them and immediately answer the core questions: are they proficient at writing software? Can they actually work with your engineers?

There are no more whiteboards or awkward web-based IDEs with minimal language support, just actual real-time software engineering in an interview. And since it is live, it gives you a one hundred percent genuine experience that can't be faked, passed off to a friend, or forged in any way.

You can let your Emacs experts regale you with deft navigation of a codebase using LSP mode or Helm.
I'm not just speculating here; this is actually how we hire our engineers at CoScreen, and I'm describing a specific engineer we interviewed and hired. There has been little friction in our hiring process because we get straight down to coding, pairing, and working together.

We have attracted some of the best technical talent I have worked with in my entire career. With everything moving remote, this all should have become harder, but in reality CoScreen is better than in person for coding interviews. The closest in-person analog would be three or four people interviewing someone while sitting on the same side of the table, looking over a single screen with a single keyboard and mouse.

They can show you all of their fancy zsh plugins and navigate a sample problem's git repository. You can share a browser window with instructions on how to check out a git project and then watch their actual workflow as they do it. Cutting through all of the nonsense and focusing immediately on the developer and the code brings you straight to that moment of sitting side by side as they work and knowing for sure whether they can do the job or not.

The concurrent real-time sharing abilities of CoScreen eliminate screen sharing as a single mutex. It's no longer "Can you let me share this?" or "Can I control that?" Instead, anyone shares what they want, when they want. They control what they want, when they want to control it. They move their windows wherever they want, whenever they want to move them. This is vital for a remote interview that flows naturally, rather than a coached one-way presentation.

Ultimately, the speed and effectiveness of my interviewing process have improved dramatically during the pandemic because I have had access to the right tools for the job and have known that what matters is getting straight to the business of coding. Honestly, having been on both sides of the software interviewing table, I have long been desperate for any improvement to this status quo.

In my humble opinion, the descent towards toy programming challenges has been a slide in the wrong direction. Organizations moving towards projects have taken a fundamental step in the right direction. And organizations that start to adopt real-time projects, with live pair programming alongside some of their trusted engineers through tools like CoScreen, will benefit from happier candidates and employees, and ultimately from better hiring decisions.

But don't take it from us, take it from our customers:

SetApp on their use of CoScreen for technical interviewing

Try CoScreen for free!