Choose your bot wisely.
Job seekers desperate for a new gig now have to contend with a thoroughly modern hurdle: applicant tracking systems that automatically give preference to candidates who used AI to write their resumes.
A recent study found that AI-powered applicant tracking systems, or ATS, not only prefer AI-written resumes over those composed by humans, but are also more likely to shortlist candidates who used the same large language model the company itself employs.
“LLMs, when used as evaluators, systematically prefer resumes they generated themselves over equivalent resumes written by humans,” according to the study, “AI Self-preferencing in Algorithmic Hiring: Empirical Evidence and Insights.”
The bias discovered by researchers Jiannan Xu of the University of Maryland, Gujie Li of the National University of Singapore and Jane Jiang of Ohio State University means strong candidates could get left in the dust.
“Left unaddressed, this bias can distort hiring outcomes by systematically advantaging candidates who use the same LLM as employers, while disadvantaging equally qualified applicants who do not,” they wrote in the August study, which was published on the research-sharing platform arxiv.org in February.
Companies and job hunters have already been using AI in the hiring process, but the study reveals a new wrinkle, said Boston University Professor Emma Wiles, who studies information systems and AI’s impact on the labor market.
“Instead of AI tools being used to find the applicant’s true abilities, you’re gonna find applicants that the AI thinks sounds like itself,” she said.
The researchers used 2,245 human-written resumes in their study, then created “multiple counterfactual versions using a range of state-of-the-art LLMs,” including GPT-4o and DeepSeek-V3.1.
They then simulated hiring pipelines across 24 occupations and found the AI evaluators were 23% to 60% more likely to select candidates who had used the same LLM.
The problem was “most severe” in accounting, sales and finance, the research found.
“Such dynamics raise fairness concerns for job seekers and pose risks for employers, who may inadvertently overlook strong candidates,” they added, calling it “a novel form of bias.”
“If left unchecked, self-preference could subtly distort evaluative processes across hiring, education, publishing, and more.”
More than 300,000 job cuts were announced between January and April 2026, many of them in the tech sector, research firm Challenger, Gray & Christmas found.
Job seekers should opt for tools that improve human writing rather than replace it, Wiles suggested.
“It assists, but it doesn’t take the place of your effortful writing. It’s improving what you’ve already written and it’s not crowding out your true self,” she said. “Your self is going to be visible in what you’ve written.”