When Ty landed an introductory phone interview with a finance and banking company last month, they assumed it would be a quick chat with a recruiter. And when they got on the phone, Ty assumed the recruiter, who introduced himself as Jaime, was human. But things got robotic.

“The voice sounds similar to Siri,” said Ty, who is 29 and lives in the DC metro area. “It was creepy.”

Ty realized they weren’t speaking to a living, breathing person. Their interviewer was an AI system, and one with a rather rude habit. Jaime asked Ty all the right questions – what’s your management style? are you a good fit for this role? – but he wouldn’t let Ty fully answer them.

“After cutting me off, the AI would respond, ‘Great! Sounds good! Perfect!’ and move on to the next question,” Ty said. “After the third or fourth question, the AI just stopped after a short pause and told me that the interview was completed and someone from the team would reach out later.” (Ty asked that their last name not be used because their current employer doesn’t know they’re looking for a job.)

A survey from Resume Builder released last summer found that by 2024, four in 10 companies would use AI to “talk with” candidates in interviews. Of those companies, 15% said hiring decisions would be made with no input from a human at all.

As Laura Michelle Davis wrote for CNET: “Today, it’s not uncommon for applicants to be rejected by a robot before they’re connected with an actual human in human resources.” To make the grueling process of getting hired even more demoralizing, many people are afraid that generative AI – which uses data sets to create text, video, audio, images and yes, robo-recruiters – will take our jobs altogether. But could AI help us find a new gig in the meantime?

The human element is lost

Ask an employer who uses AI in their hiring process about the choice, and they’ll share a common refrain: these systems are used to cross quotidian tasks off a recruiter’s daily schedule; AI helps weed through the top 1,000 applicants, but once you get to the top 10, it’s a strictly human process.

In 2019, ZipRecruiter noted that AI systems could streamline boring tasks – like writing job descriptions or scanning résumés – and give hiring professionals “back time to spend on more strategic tasks”. (Imagine how it feels to be on the other side, applying to dozens of jobs a day.) A survey conducted in 2020, sponsored by the AI interview system Sapia.ai, found that 55% of companies were increasing investment in automated recruiting measures.

“I don’t use AI to write job descriptions, but I know many employers who do,” said Julia Pollak, chief economist at ZipRecruiter. Pollak said that a manager might also ask an AI program to give them a list of reasons why they should, or should not, hire a final-round candidate. “Employers use it as a coach, as a guide, as a friend to bounce ideas off,” she added.

Adele Walton was recently interviewed by AI. Photograph: Courtesy Adele Walton

But, as applicants like Ty now know, employers’ use of AI goes beyond it being a sounding board.

Experts advise applicants to act as though they’re speaking to a human during AI-led interviews, although that’s easier said than done. Adele Walton, a 24-year-old journalist and content creator from Brighton, England, recently sat through an AI interview that felt extremely unnatural. “I expected a person or a panel,” she said. “When I clicked on the call, I was surprised to enter a chat room with just myself.”

Questions flashed on a screen that also showed her own face. Walton had 60 seconds to answer. “I was looking at how my face was moving, looking at how I looked on screen,” she said. “As someone who’s struggling with body dysmorphia, I found that my face was an unnecessary distraction in the interview process. I know I would have done better if there was another person there.”

Walton did not get another interview. “In an in-person meeting, you get more social prompts from the other person,” she said. “In this case, I was just talking to myself – or an AI system – with no measure of how well I was doing. I couldn’t read anyone’s face, body language, or see them nod yes. That small type of human reassurance that you get in a real interview is completely lost when companies outsource interviews to AI.”

Applicants are gaming the system

If employers use AI to make their hiring process easier, why shouldn’t applicants? That’s Fanta-Marie Touré’s mindset. The 24-year-old, who lives in Atlanta and works in cybersecurity, has used AI tools to tailor her résumé, write cover letters and even auto-apply to jobs. She does this through a program called Massive.

“It’s very expensive to hire someone to help you with your résumé,” Touré said. “A lot of people charge $150 an hour to do résumé reviews. That’s a lot, so why not use a tool that costs me maybe $30 a month?”

Touré maintains that it’s still important to “personalize” application materials – adding relevant anecdotes, for example. “Otherwise, everybody who uses AI is getting the same result,” she said. “You have to tweak.”

Sometimes, supposed hacks on how to game the AI recruitment process hit social media. A few years ago, Touré heard one trick: a TikTok creator advised applicants to copy and paste a job description on to their résumés, and then change the font of that description to a tiny size that matched the résumé’s white background. A human wouldn’t be able to see it, but AI would scan it, recognize the text verbatim and send it to the front of the pack. Or so the theory went.

A 2019 survey found that by 2024, four in 10 companies would use AI to ‘talk with’ candidates in interviews. Photograph: AsiaVision/Getty Images

“I never got any hits from that,” Touré said. “That was a couple of years ago, and I bet the systems are smarter now.”

It’s a move that’s familiar to Pollak, the ZipRecruiter economist. “That tactic has been used so widely that most job site algorithms now know to penalize it. Don’t try to be too smart with your résumé: if a match is too close, it will be kicked out.”


According to Pollak, employers are more suspicious of AI-generated résumés and cover letters these days. “I’m hearing that employers are now discounting a lot of the information they receive that’s in written form, and want to get to a face-to-face conversation as quickly as possible with the candidates so they can properly vet them,” she explained.

Even on video interviews – with humans or robots – applicants can still call on AI programs to assist. Michael G is the founder of Final Round AI, an “interview co-pilot” that listens to recruiters’ questions and prompts personalized answers in real time, based on the résumé and cover letter uploaded by the interviewee. (Michael asked that his last name not be used because Final Round AI is still in “stealth mode”, a startup’s temporary state of secretiveness to quell competition.)

“Users can quickly glance through the AI’s response, and then develop their own response to the question,” Michael said. “It’s not like they need to read the text verbatim off the screen. It’s natural: you have celebrities, TV hosts looking at teleprompters all the time. Why can’t ordinary people use a teleprompter?”

According to Michael, Final Round AI has over half a million users, some of whom send him job offers they credit in part to the service. (That’s tough to factcheck – Final Round AI has less than 2,000 followers across its social platforms.) Michael added that about 40% of users come from the tech industry, and 30% work in the banking or finance fields. The cost to use it can range from $0 to $100 a month. Michael says all personal data is deleted at the end of the call, but generally, AI systems learn from the data they are fed over time.

Is it cheating? “I believe that because of AI, there are new boundaries and it’s hard to determine whether someone is cheating or not,” Michael said, somewhat terrifyingly. “And, if I were an employer, I would prefer candidates who know how to use AI, because they bring value and a productivity boost to the company.”

Michael said he used Final Round AI at points while answering questions for this interview; yes, you could tell.

Who’s hit the hardest by AI hiring?

AI systems encode bias. You don’t need to be a wildly imaginative person to consider how that might affect job-seekers – Amazon reportedly scrapped an in-house hiring algorithm, trained on data submitted by applicants, that favored men and penalized résumés that included the word “women”. Will automated hiring processes also reject people who are not white, male and able-bodied?

“The current wave of AI uses algorithms that are probabilistic models, which means they simply rely on patterns in past data to make likely predictions,” said Rory Mir, associate director of community organizing at the Electronic Frontier Foundation. “The problem is, patterns in past data include the patterns which emerge from systemic bias.”

Racial bias is present when humans do the hiring: studies have found that hiring managers are likely to pass over names that are perceived as Black when reviewing résumés. Pollak said that to avoid such bias, ZipRecruiter strips “any kind of identifiable information” like names and zip codes from résumés before putting them through an AI system. But if a résumé still includes education, AI systems could theoretically learn to favor applicants who went to Ivy League schools. Pollak defended the practice of retaining education because “employers do want to know something like [that]”.

Higher-ups in the corporate world rarely apply for jobs – at a certain pay grade, you network instead. Mir believes that early-stage or low-level applicants will “bear the brunt” of AI hiring: “AI won’t pick a CEO. It might filter applicants for management, and it may have the final say on hiring a gig worker,” they said. “Automated hiring will have an outsized impact on marginalized folks who rely on more difficult lines of work.”

Case in point: 404 Media recently reported on a company called Paradox.ai, used by FedEx, McDonald’s and the company that operates Olive Garden, which recruited people in customer and food service jobs with a “long and bizarre personality quiz” illustrated with “blue humanoid aliens”. The goal was to discover how candidates “rank in terms of ‘agreeableness’ and ‘emotional stability’”.

Ty never heard back from anyone at the company after speaking with its AI recruiter; they assume they won’t. They aren’t surprised that the interview process is becoming more artificial.

“I’ve been in interviews where human recruiters asked me directly how I felt about using apps like ChatGPT for creative work, and if I would feel comfortable doing that at my job,” they said. “So I guess it was only a matter of time before I got straight-up interviewed by a robot.”