AI tools deepening divides in graduate outcomes (opinion) – Inside Higher Ed

Since OpenAI first released ChatGPT in November 2022, early adopters have been informing the public that artificial intelligence will shake up the world of work, with everything from recruitment to retirement left unrecognizable. Ever more cautious than the private sector, higher ed has been slow to respond to AI technologies. Such caution has opened a divide within the academy, with the debate often positioned as AI optimism versus pessimism—a narrow aperture that leaves little room for realistic discussion about how AI is shaping student experience.

In relation to graduate outcomes (simply put, where students end up after completing their degrees, with a general focus on careers and employability), universities are about to grapple with the initial wave of graduates seriously impacted by AI. The Class of 2025 will be the first to have widespread access to large language models (LLMs) for the majority of their student lives. If, as we have been repeatedly told, we believe that AI will be the “great leveler” for students by transforming their access to learning, then it follows that graduate outcomes will be significantly impacted. Most importantly, we should expect to see more students entering careers that meaningfully engage with their studies.

The reality on the ground is starkly different. Many professionals working in career advice and guidance are struggling with the opposite effect: Rather than acting as the great leveler, AI tools are only deepening existing divides.

  1. Trust Issues: Student Overreliance on AI Tools

Much has been said about educators’ ability to trust student work in a post-LLM landscape. Yet, when it comes to student outcomes, a more pressing concern is students’ trust in AI tools. As international studies show, a broad range of sectors is already placing too much faith in AI, failing to put proper checks and balances in place. If businesses beholden to regulatory bodies and investors are left vulnerable, then time-poor students seeking out quick-fix solutions are faring worse.

This is reflected in what we are seeing on the ground. We were both schoolteachers when ChatGPT launched and both now work in student employability. As is common, the issues we first witnessed in the school system are now being borne out in higher ed: Students often implicitly trust that AI will perform tasks better than they are able to. This means graduates are using AI to write CVs, cover letters and other digital documentation without first understanding why such documentation is needed. Although we are seeing a generally higher (albeit more generic) caliber of writing, when students are pressed to expand upon their answers, they struggle to do so. Overreliance on AI tools is deskilling students by preventing them from understanding the purpose of their writing, thereby creating a split between what a candidate looks like on paper and how they present in real life. Students can only mask a lack of skills for so long.

  2. The Post-Pandemic Social Skills Deficit

The generation of students now arriving at university were in their early teens when the pandemic hit. This long-term disruption to schooling had a profound impact on social and emotional skills, and, crucially, the resulting learning loss hit students from disadvantaged backgrounds at a much higher rate. With these students now moving into college, many are turning to AI to try to ameliorate feelings of being underprepared.

Such a skills gap is tangible when working with students. Those who already present high levels of critical thinking and independence can use AI tools in an agile manner, writing more effective prompts before tailoring and enhancing answers. Conversely, those who struggle with literacy are often unable to properly evaluate how appropriate the answers provided by AI are.

What we are seeing is high-performing students using AI to generate more effective results, outpacing their peers and further entrenching the divide. Without intervention, the schoolchildren who couldn’t answer comprehension questions such as “What does this word mean?” about their own AI-generated homework are set to become the graduates left marooned at interviews, where they can no longer hide behind writing. The pandemic has already drawn economic battle lines for students in terms of learning loss, attainment and the very awarding of student grades—if we are not vigilant, inequitable AI use is set to become a further barrier to entry for those from disadvantaged backgrounds.

  3. Business Pivots, Higher Ed Deliberates

Current graduates are entering a tough job market. Reports have shown both that graduate-level job postings are down and that employers are fatigued by high volumes of AI-written job applications. At the same time, employers are increasingly turning to AI to transform hiring processes. Students are keenly attuned to this, with many reporting low morale because their “dream role” is now one that AI will fulfill, or one they can see being replaced by AI in the near future.

Across many institutions, higher education career advice and guidance is poorly equipped to deal with such changes. It often remains rooted in an outdated model focused on traditional job markets and the presumption that students will follow a “one degree, one career” trajectory, when in reality most students do not follow a linear career progression. Without swift and effective changes that respond to how AI is disrupting students’ career journeys, we cannot make targeted interventions that reflect the job market and therefore have a meaningful impact.

Nonetheless, such changes are where higher education career advice and guidance services can make the greatest impact. If we hope to continue leveling the playing field for students who face barriers to entry, we must tackle AI head-on by teaching students to use tools responsibly and critically, not in a general sense, but specifically to improve their career readiness.

Equally, career plans could be forward-thinking and linked to the careers created by AI, using market data to focus on which industries will grow. By evaluating student need on our campuses and responding to the movements of the current job market, we can create tailored training that allows students to successfully transition from higher education into a graduate-level career.

If we fail to achieve this and blindly accept platitudes around AI improving equity, we risk deepening structural imbalances among students that uphold long-standing issues in graduate outcomes.

Sean Richardson is a former educator and now the employability resources manager at London South Bank University.

Paul Redford is a former teacher, now working to equip young people with employability skills in television and media.