A person in front of their laptop while using AI on their mobile device.
Khyati Sundaram says there are subtle signs that an applicant used AI to write their résumé or test answers.
  • Khyati Sundaram, the CEO of Applied, uses similarity detection to identify AI-written applications.
  • Sundaram says small formatting and language choices can strongly indicate that AI was used.
  • She says that sometimes using AI to complete an application isn't a big deal.

This as-told-to essay is based on a conversation with Khyati Sundaram, the 39-year-old CEO of Applied, who is based in London. It's been edited for length and clarity.

I'm the CEO of Applied, a recruitment platform that currently oversees the hiring processes of over 200 customers. Our hiring approach relies on skills tests rather than traditional résumés.

Since joining Applied in 2018, I've reviewed thousands of applicants' skills-test answers and noticed several subtle signs that AI was used. AI can be a powerful tool for improving efficiency and formatting, but AI-written answers often sound generic and follow the same structure.

Beyond that, using AI can also strip an application of personality and diminish an applicant's ability to set themselves apart from potentially thousands of other applicants. Here's how we spot AI and how it can affect the decision to hire an applicant.

Small details can show that an applicant has used AI

I've noticed many candidates are now trying to strike a balance between using AI for inspiration and writing their test answers themselves. In those cases, it's unlikely I, or anyone else, could spot the use of AI when looking at a piece of text in isolation.

Also, I don't know of any AI detection models that can distinguish human from AI-generated content with 100% accuracy. In fact, a recent study showed AI detectors falsely classifying texts like the Bible and the US Constitution as AI-generated. That leaves us with no foolproof way of knowing whether AI was used.

Therefore, at Applied, we've begun using a similarity detector: a human from our team compares several answers to the same question and flags any recurring similarities in syntax, paragraph structure, and language.

We've found it much easier to see patterns emerge when examining 20 to 30 applications simultaneously rather than looking at one in isolation. After countless rounds of similarity detection, we've found that it's often small details, like formatting and punctuation, that give away an applicant's use of AI.

Dead giveaways that AI was used

We've found that text is more likely to be AI-generated if every word in a sentence is capitalized or if there are unnecessary capital letters in phrases.

AI also tends to generate extra punctuation that doesn't read the way people speak. And when examining multiple applications for a specific role, I've noticed that a lot of AI-assisted résumés have the exact same number of headings, paragraphs, and bullet points.

Sometimes using AI to complete an application isn't a big deal

We give the human reviewer the power to decide whether each application is worth flagging for AI use, and even then, we won't automatically reject someone just because one person believes AI was used. It's a lot more nuanced.

I believe in the power of the crowd, so if someone flags a skills-test answer, we'll have other team members review it and come to a collective decision that considers the candidate's entire application and skill set. We allow for a lot of control and debate within each hiring team.

It's important to mention that many organizations operate in different ways. They may not mind the use of AI, or they might discard an applicant entirely if there is even suspicion that AI was used.

AI isn't going anywhere, so use it to your advantage

I believe that AI will continue to be used in applications. I've had conversations with several people on the job hunt who've told me they use AI to even out the power dynamic between themselves and employers, many of whom use AI in their own hiring processes. I think that's fair. But, if you're going to use it, use it well.

Large language models like GPT are great at synthesizing information, writing concisely, and creating structure, which can be incredibly useful for candidates whose native language isn't English or who aren't fluent writers. People applying to software engineering and data science roles might be asked to use AI during practical tests. In these cases, don't be afraid to put your digital skills on show.

Overall, rather than use AI to complete job applications, I advise candidates to use AI to help identify the key skills and attributes that hiring managers will be looking for. Run the job description through ChatGPT and ask it to summarize the relevant skills and share examples of how they might be used in practice. That information can then inform your own responses to application and interview questions.

If you're an employer who would like to share your experience with AI in the hiring process, email Tess Martinelli at tmartinelli@businessinsider.com.
