When schools first became aware that new versions of generative artificial intelligence tools could churn out surprisingly sophisticated essays or lab reports, their first and biggest fear was obvious: cheating.
Initially, some educators even responded by going back to doing things the old-fashioned way, asking students to complete assignments with pencil and paper.
But Michael Rubin, the principal of Uxbridge High School in Massachusetts, doesn't think that approach will prepare his students to function in a world where the use of AI is expanding in nearly all sectors of the economy.
"We've been trying to teach students how to operate knowing that the technology is there," Rubin said during a recent Education Week K-12 Essentials Forum about big AI questions for schools. "You might be given a car that has the capacity of going 150 miles an hour, but you don't really drive 150 miles an hour. It's not about the risk of getting caught, it's about knowing how to use the technology appropriately."
While students shouldn't use writing crafted by AI tools like ChatGPT or Gemini and pass it off as their own, generative AI can act as a brainstorming partner or tutor for students, particularly those who don't have other help in completing their assignments, he said.
Rubin recalled that his daughter recently needed his assistance with a history assignment. "She has me to go to, and some kids don't," he said. "We do believe that the AI chatbots can sometimes be that great equalizer in terms of academic equity."

But he added, "I did not do the work for my kid. So I want to make sure the AI chatbot isn't doing the work for anybody else's either."
Rubin's school uses a tool that helps teachers get a sense of how students composed a document they later turned in for an assignment. It allows teachers to see, for example, if a student did a lot of cutting and pasting, which could indicate that they took chunks of AI writing wholesale and passed it off as their own work.
If a teacher at Rubin's school suspects one of their students plagiarized content from an AI tool, the teacher doesn't launch into an accusatory diatribe, he said.

Instead, they'll use it as a "learning opportunity" to talk about appropriate uses of AI, and perhaps allow the student to redo the assignment.

"It's not just about giving a zero and moving on," Rubin said.
Never assume AI-detection tools are right about plagiarism
Those conversations are important, particularly when a teacher suspects a student of cheating because an AI detection tool has flagged work as potentially plagiarized, said Amelia Vance, the president of the Public Interest Privacy Center, a nonprofit organization that aims to help educators safeguard student privacy. Vance was also speaking during the Education Week K-12 Essentials Forum on AI.
Most AI detection tools are wildly inaccurate, she noted. Studies have found that commercially available detection tools tend to erroneously flag the work of students of color, and of students whose first language is not English, as AI-crafted.
Programs that look at whether a student copied and pasted huge swaths of text, like the one Rubin's school uses, offer a more nuanced picture for educators seeking to detect AI-assisted cheating, Vance said. But even they shouldn't be taken as the final word on whether a student plagiarized.
"Unfortunately, at this point, there isn't an AI tool that sufficiently, accurately detects when writing is crafted by generative AI," Vance said. "We know that there have been several examples of companies that say, 'We do this!' or even experts in education who have said, 'This is available as an option to deal with this cheating thing.' And it doesn't work."

The kind of technology that Uxbridge High School relies on gives educators "a better narrative" to work with than other types of detection tools, Vance added. "It's not just, 'Is this student cheating or not?' It's, 'How is this student interacting with the document?'"
That's why Uxbridge's practice of talking to students directly when AI cheating is suspected is an important first step.

If a student admits to cheating using AI in those conversations, "you need to make it clear to the student that is not acceptable," Vance said. But teachers should never take the word of an AI detector, or even the type of product Rubin described, as gospel.

"Avoid ever assuming the machine is right," Vance said.