Last year, the image below was making the rounds on social media. It’s an excellent visualization of how even a simple solid object can be seen from unique perspectives. Getting two parties to agree on whether this is a cube or a cylinder would be nearly impossible.
Candidates are complex.
When I think about today’s interview process to phone screen and assess talent, to determine if someone’s “a culture fit” and qualified for a given role – it makes my head spin. We’re acting out the telephone game, with a process chock full of bias and misinterpretation.
The candidate screening process begins with recruiters, who spend 20-30 minutes on the phone interviewing candidates. This conversation is full of incredible candidate insights, going much deeper than the text of a resume, but it takes place in a vacuum. What do we get? Scribbled interview notes to save in our ATS and share with a hiring manager. We’re taking a rich, interactive, multi-dimensional experience – and reducing it back to text.
Let’s look at the misinterpretation and bias introduced in just the first step of today’s interview experience:
- What a candidate said versus what a ‘non-technical’ recruiter heard.
- What a recruiter heard versus what a recruiter scribbled down during the call.
- The scribbled interview notes versus the summary (write-up) sent to the hiring manager.
- The summary that was sent versus how a hiring manager interprets what they read.
How do we accurately and efficiently assess talent with so much bias in the first step of the interview process? It’s no wonder today’s process requires one, two, even three more phone interviews with the same candidate before they get to visit the office and meet the team.
It’s 2016 – isn’t it time we went beyond scribbled interview notes? We finally have the tools to capture and share actual, factual, real-time candidate data – to accelerate the interview experience, remove misinterpretation and bias, and make collaborative hiring decisions.