High-income students may be less likely to use AI in their college essays because they have more access to other writing supports.
Admissions offices have been contending with essays generated by artificial intelligence for more than three and a half years now; according to a 2024 survey, about half of college applicants use AI to brainstorm their college essays and one in five use it to create a first draft.
A recent study by researchers at Cornell and Carnegie Mellon Universities dug deeper into which students are using AI in their essays and how it impacts the content and effectiveness of those essays.
The study analyzed tens of thousands of essays submitted to an unnamed selective institution over four years, starting before the introduction of generative AI tools. The researchers found that lower-income students—represented in this study by those who received an application fee waiver—were more likely to use AI in their essays, as were students who were ultimately rejected from the college.
Jinsook Lee, the lead author on the study and a Ph.D. candidate at Cornell, said she has long been interested in who uses AI and how that use correlates with socioeconomic status.
She had hypothesized that lower-income applicants would be more likely to use large language models to help with their college essays because they have less access to other resources for help. Even among the cohort of students who had used AI, lower-income students were more likely to be rejected than higher-income students, the researchers found. This may be because higher-income students can afford better versions of AI tools and might be working with counselors or essay coaches who understand how to use AI most effectively, she said.
“High-income students have a lot of different resources; they have counselors, they have teachers, they have more support on top of ChatGPT,” Lee said. On the other hand, lower-income students “might only be able to use the free tier instead of the $200-per-month [version of] Claude, and the quality of the outcome of what free-tier ChatGPT gives us is really poor.”
Impersonal Personal Essays
The study also evaluated the homogenization of language in the essays, or how similar they are to one another. Lee and her co-authors found that homogenization increased significantly after the launch of AI platforms, with the most convergence among lower-income students and students who were rejected from the college.
Though this report did not investigate exactly which linguistic features have become more common in college essays in the AI age, AJ Alvero, a professor in Cornell’s sociology department and a co-author on the paper, said that it’s concerning to think that admissions essays are becoming less personal.
“The essay is designed to give applicants an opportunity to highlight … the idiosyncrasies of their life, how they became who they are, these highly individualistic kinds of experiences and narratives,” he said. “If it’s pushing all these applicants towards the same type of essay, the same template … it could be that students are inadvertently losing that opportunity.”
Previous research, also out of Cornell, has shown that application essays written by AI are generic, easy to spot and don’t sound like a real person’s writing.
The new report’s conclusion argues that, as AI usage becomes increasingly prevalent, college admissions offices should consider wealth disparities when evaluating essays.
“Viewed through the digital divide framework, our results suggest a shift from inequalities in access to inequalities in returns, underscoring the need for institutions to reassess how essay-based evidence is interpreted as AI-assisted writing becomes common,” the authors wrote. “Future research should combine experimental, qualitative, and multi-institutional approaches to identify how AI tools interact with existing systems of educational stratification and to inform more equitable evaluation practices.”
In future research, Alvero and Lee said, they hope to investigate which linguistic and topic choices are most common in AI-generated college admissions essays. Alvero noted that the linguistic characteristics most common in higher-income students' essays are also the ones that AI appears to mimic. Lee, meanwhile, has observed that LLMs tend to insert irrelevant information about a student's identity, such as opening a sentence in the essay of an applicant who identifies as Asian with "as an Asian woman"—even if the next clause has nothing to do with being an Asian woman.