UK students flock to AI to help them cheat
No need to plagiarize if you can have AI do it for you
A series of Freedom of Information requests shows that students in British universities are increasingly getting busted for using AI to cheat.
After getting responses from 131 universities, The Guardian found 7,000 cases of students caught using AI to cheat in the 2023-24 academic year - 5.1 for every 1,000 students, up from just 1.6 cases per 1,000 students the previous year.
When it comes to academic fraud, plagiarism is still the most common offense, the data showed. But since the introduction of easily available AI tools, plagiarism rates have fallen dramatically, and are expected to halve again during this current academic year.
Surprisingly, over a quarter of those halls of higher learning reported that they didn't collect stats on AI cheating as a separate category in 2023–24. Other forms of academic misconduct have remained broadly flat, even as confirmed AI-cheating cases are projected to climb to around 7.5 per 1,000 students this academic year.
AI companies are well aware that students are increasingly using their tools, and are actively courting them with special offers. Last month, OpenAI was tempting students with a .edu email address with two months' free subscription to get them using the system. Microsoft, meanwhile, is offering students three months' free use of Copilot and a 50 percent discount on future subscriptions.
Google has gone one step further, giving a year's free access to Gemini 2.5 Pro and the Veo 2 video creator. It's also throwing in 2TB of free storage to keep their creations in. Anthropic, too, is gunning hard for the academic market and has signed up the London School of Economics for students to use the Claude bot.
Last year, Perplexity was doing something similar and offered 45 selected universities free access to its Pro tool for a year. And Reclaim.ai is offering students a 50 percent discount for up to 12 months.
AI companies aren't doing this out of the goodness of their own hearts. As any brand marketer knows, getting a target market used to your product when they're younger means they are much more likely to remain with that product in later years.
A global issue for different cultures
It's not just a British problem. Pew Research in January found that 26 percent of US teens (ages 13–17) have used ChatGPT for schoolwork, double the usage level of the previous year. Interestingly, the majority of those surveyed thought it was wrong to use AI to actually write their essays for them, or at least that's what they told the interviewers.
Last year, there was an interesting case in which parents sued their son's school for penalizing him for using AI to help with an assignment. The court eventually dismissed their case, but not before the boy was reinstated into the National Honor Society.
It's an increasing problem for teachers trying to manage the use of such tools in the classroom. Some teachers are sanguine about it, likening the software to using a calculator instead of mental arithmetic. But some have been cracking down hard - not always correctly since the tools to identify AI-generated content are still fairly primitive.
One solution is to go old-school, and sales of the traditional US blue books used for handwritten exams are on the rise. For example, UC Berkeley's campus store saw an 80 percent jump in blue-book purchases over the past two academic years, as professors revert to handwritten, in-class essays to guard against AI-generated submissions.
China takes it even further. During the gaokao National College Entrance Examination, an annual event that can mean the difference between success and failure in life for students, DeepSeek and ByteDance both reportedly shut down access to their services during exam hours to reduce cheating.
Indeed, state media reports that AI is used to keep an eye on students and proctors during the examinations. Personal devices like phones are banned from the classroom, and in some cases, radio signals are completely blocked in the exam halls.
While Western practices are unlikely to reach that level of discipline, it's clear that students are perfectly willing to use AI tools, and there's a case that they should be able to - given that they are likely to be using them daily once they graduate. But the core knowledge learned at school is also vital for spotting when AI makes mistakes - or hallucinations, as tech companies prefer to call them. ®