Jobseekers and recruiters are all using AI. It’s chaos.

When Josh Holbrook, a software engineer in Alaska, was laid off in January, he didn’t expect to spend too much time looking for a new job. He certainly didn’t think he’d need to relearn the job-hunt process.

A few weeks into his search, however, Holbrook found himself out of his depth. Instead of speaking with a human recruiter at a local healthcare organization, he was screened by an AI chatbot. His résumé, created nearly a decade ago in a technical format popular among academics, was incompatible with new automated recruitment platforms. He signed up for a professional service to update it in an AI-friendly format.

“The experience was completely novel,” Holbrook told me. “I’ve never seen that before.”

Over the past couple of years, job seekers have been forced to contend with incessant layoffs, a brutal recruitment market, and days of unpaid assignments. They can now add AI recruiting systems to that pile. In 2022, the Society for Human Resource Management found that about 40% of the large employers it surveyed were already deploying AI in HR-related activities like recruitment. Rik Mistry, who consults on large-scale corporate recruitment, told Business Insider that AI is now used to write job descriptions, judge applicants' skills, power recruiting chatbots, and rate candidates' responses. Ian Siegel, the CEO of ZipRecruiter, estimated in 2022 that nearly three-fourths of all résumés were never seen by humans.

Some job hunters have decided to fight fire with fire, turning to programs that use AI to optimize their résumés and apply to hundreds of jobs at a time. But the emerging AI-versus-AI recruitment battle is bad news for everyone. It turns hiring into a depersonalized process, it inundates hiring managers, and it reinforces weaknesses in the system it’s designed to improve. And it only seems to be getting worse.

Automation in recruitment isn’t new: After job sites like Monster and LinkedIn made it easy for people to apply for jobs in the early 2010s, companies adopted applicant-tracking systems to manage the deluge of online applications. Now most résumés are first seen by software designed to evaluate a person’s experience and education and rank them accordingly.

The automation has helped ease the burden on overstretched recruiters — but not by much. As the stacks of digital résumés have grown amid frequent changes to remote-work policies, the recruitment hamster wheel has spun ever faster.

AI is supposed to fix this mess, saving companies time and money by outsourcing even more of the hiring process to machine-learning algorithms. In late 2019, Unilever said it had saved 100,000 hours and about $1 million in recruitment costs with the help of automated video interviews. Platforms like LinkedIn and ZipRecruiter have started using generative AI to offer candidates personalized job recommendations and let recruiters generate listings in seconds. The Google-backed recruitment-tech startup Moonhub has an AI bot that scours the internet, gathering data from places like LinkedIn and GitHub, to find suitable candidates. On HireVue, employers can let a bot with a set questionnaire conduct video assessments to analyze candidates' personalities. Newer startups combine these abilities in a centralized service, allowing firms to put "hiring on autopilot."

But hiring experts Business Insider spoke with weren’t convinced it’s all for the best. Many fear that over time AI will make an already frustrating system worse and spawn fresh issues like ghost hires, where companies are misled into recruiting a bot masquerading as a person.

Several seasoned recruiters told me they hadn’t incorporated AI into their workflow beyond auto-generating job descriptions and summarizing candidate calls. Tatiana Becker, who specializes in tech recruiting, said software that claims to match résumés with jobs lacked the nuance to do more than keyword matching — it couldn’t, for instance, identify desirable candidates who came from top schools or had a history of earning strong promotions. Chatbots that Becker’s boutique agency experimented with would frequently mismatch prospects and roles, ultimately pushing the prospects away.

This résumé matching “might work for applicants of more entry-level jobs,” Becker told BI, “but I would worry about using it for anything else at this point.”

Despite the problems, many companies are marching forward. "We know AI isn't perfect, but we have to use it as there's pressure from the higher-ups," said a recruiter at a Fortune 500 firm who spoke on the condition of anonymity to candidly discuss his company's hiring process.

Pallavi Sinha, the vice president of growth at Humanly, a startup that offers a conversational-AI hiring platform to companies like Microsoft, said that “AI in recruiting, similar to other industries, is very much at a nascent stage.” But she predicted it would continue to be incorporated into hiring.

“AI isn’t here to replace human interactions,” she said, “but to make our jobs and lives easier — something that we’ll see more and more of over time.” Sinha declined to share how many applications Humanly had processed but said its chatbot service had over a “million conversations last year alone.”

For candidates, though, AI has been a nightmare. Kerry McInerney, an AI researcher at the University of Cambridge, said AI increases the amount of labor for applicants, forcing them to complete puzzles and attend automated interviews just to get to the selection stage. She argued that it makes a “depersonalized process even more alienating.”

Holbrook, the software engineer, wrote on LinkedIn about his frustration. “AI is too stupid to recognize transferrable skills among tech applicants,” he said, adding, “I had a resume bounced because I don’t have c# listed as a fluent language, even though I’ve dealt with c# in my jobs and have worked with plenty of languages that are 90% the same as c#.”

Danielle Caldwell, a user-experience strategist in Portland, Oregon, was confused when an AI chatbot texted her to start a conversation about a role she had applied for. At first, she thought it was spam. After the exchange, she was left with more questions.

“There was no way to ask questions with the bot — it was a one-way experience,” Caldwell said.

Researchers at the University of Sussex have found that AI video interviews can be disorienting for job seekers, who behave much less naturally in the absence of a reassuring human presence. Plus, McInerney said, assessing a candidate's personality based on their body language and appearance is not only "reminiscent of 19th- and 20th-century racial pseudoscience but simply doesn't work." Her research has demonstrated that even things like wearing a headscarf or having a bookshelf in the background can change a personality score.

A cottage industry of tools has sprung up to help candidates game AI systems. One called LazyApply, for example, can apply to thousands of jobs online on your behalf for $250. “Anyone who has had to review over 50 résumés in one sitting wants to put a toothpick in their eyes,” said Peter Laughter, who’s been in recruitment for about three decades. With AI, he said, “you are only accelerating the brokenness of recruiting.”

Bonnie Dilber, a recruiting manager at Zapier, said these services contributed to problematic behaviors on both sides of hiring. When candidates submit hundreds of applications, recruiting teams struggle to keep up and respond, which in turn makes applicants feel they need to submit even more applications to stand a chance.

A more pressing issue, Dilber added, is that often these bots submit poor applications. Dilber and other recruiters told BI that some cover letters say only, “This has been submitted by [AI tool], please contact [person’s email] with questions.” When Aki Ito, a BI correspondent, tried using AI to apply for jobs, the system got her race wrong, made up that she spoke Spanish, and submitted an outdated cover letter.

“We had some people accidentally include ChatGPT’s entire response,” Hailley Griffis, the head of communications and content at Buffer, told BI. Griffis, who said she reviewed an average of 500 applications per open role, added, “There is a very distinct tone with a lot of AI content that hasn’t been edited, and we saw a lot of that.”

Recruiters and researchers also worry about AI's tendency to reinforce many of the recruitment industry's existing biases. Researchers from the Berkeley Haas Center for Equity, Gender and Leadership reported in 2021 that of the 133 biased AI systems they analyzed, about 44% exhibited gender bias. Other recent studies have found that AI systems are prone to screening out applicants with disabilities and de-ranking résumés with names associated with Black Americans.

Unlike a human, an algorithm will never look at past hiring decisions and rectify its mistakes, said Sandra Wachter, a tech and regulation professor at the University of Oxford. It will always do as it has been taught.

Wachter, whose team developed a bias test that’s been adopted by companies like IBM and Amazon, believes it’s possible to use AI to make fairer decisions — but for that to happen, employers need to address systemic issues with more inclusive data and regular checks. Moonhub, for example, has a human recruiter who vets the AI’s recommendations and speaks with candidates who prefer a human over a chatbot.

But until AI improves enough, humans will remain the most effective hiring systems. Becker, the tech recruiter, said humans were still critical for “getting to the heart of the candidate’s decision-making process and helping them overcome apprehensions if they’ve gotten cold feet,” as well as for “dealing with counteroffers.”

David Francis, a vice president at Talent Tech Labs, a recruitment research and advisory firm, said that while “recruitment is still and will always be a very human-focused process,” AI “is a tool and can be used effectively or poorly.”

But when it’s used poorly, both candidates and recruiters suffer. After four months of job searching, Josh Holbrook has yet to find a job.

Shubham Agarwal is a freelance technology journalist from Ahmedabad, India, whose work has appeared in Wired, The Verge, Fast Company, and more.
