At the moment, it seems like AI (artificial intelligence) is everywhere. It's on our devices, in our infrastructure, and now it's making its way into schools too. The problem? Students can't seem to tell when it's telling the truth and when it's not.
According to a new study from Oxford University Press (OUP), nearly 1 in 3 pupils (32%) aged 13 to 18 say they can't tell whether AI-generated content is true, and 1 in 5 (21%) say they are regularly unsure.
Overall, fewer than half of pupils (47%) say they feel confident when it comes to spotting misinformation created by AI.
The trouble is that the same study found students use AI a lot, with 80% turning to it for homework and other schoolwork. But if they can't accurately spot misinformation, what does that mean for their learning?
AI Hallucinations
The idea that AI doesn't always tell the truth isn't new. These 'AI hallucinations', as they have been named, occur when large language models like ChatGPT or Gemini present false information as true.
In an OpenAI report, the company found that its o3 and o4-mini models hallucinated more than previous ones: o3 hallucinated 33% of the time, while o4-mini did so 48% of the time. In other words, o4-mini hallucinated nearly half the time.
And whilst companies like Google and OpenAI are working on reducing this, the fact that hallucinations occur so regularly and are so hard to spot raises important questions about reliance on AI.
Students Use AI As A Shortcut
For many students across the UK, AI has become a way to access more personalised learning and get a helping hand with revision.
The problem is, many are using it to do their work, not just help them with it.
“Some are using it really effectively and are getting that extra help,” said Dan Williams, Assistant Headteacher at Bishop Vesey’s Grammar School. “But many are copying and pasting from the AI. They don’t yet have the knowledge to test whether something is correct.”
Williams, who serves as the school's AI lead, admitted that even he finds it hard to work out when videos are generated by AI. His worry is that students are becoming dependent on AI without the skills to work out whether it is pulling in accurate information.
Dr Alexandra Tomescu from Oxford University Press, however, took a slightly different view. “We hear a lot about how AI is all doom and gloom and how it’s going to make young people very dependent on it,” she said. “But when asked, actually nine out of ten students have said that they have benefitted from AI, especially in skill development.”
The Importance Of AI Literacy
The OUP study also found that, despite struggling to spot AI's mistakes, students genuinely want better AI literacy.
48% said they wanted more support from their teachers to help them identify which AI content they can trust, whilst 51% said they wanted more clarity around when they could use it to help them with homework.
Because the rise and adoption of LLMs has been so quick, especially amongst young people, it can be hard to know just how much students are allowed to use them without being penalised. After all, most students will likely end up using AI as part of their future work, so building those skills early is important.
The problem? Teachers don't know either. In fact, a third of students said their teachers lack confidence using AI themselves, and almost half (47%) think teachers can't spot when AI is being used for homework and other assignments.
AI: Help Or Hindrance?
When it comes to whether AI has been a help or a hindrance, it seems students are a little divided. 6 in 10 said AI has hurt their skills in some way: 26% said it makes their schoolwork too easy, while others said it limits creative thinking (12%) or hinders problem-solving (8%).
However, it's not all bad. 9 in 10 students said that AI has helped them in some way, pointing to problem-solving (18%), creating new ideas (15%) and exam preparation (13%).
And whilst some level of critical thinking will always be important, the fact that AI can provide more personalised tuition is a positive – as long as it is not hallucinating in the process.
ChatGPT: Classmate or Cheat?
The truth is that many students will grow up in a world where AI is all around them. Banning it is not a smart option, but teaching them how to use it responsibly definitely is.
As Erika Galea, Director of the Educational Neuroscience Hub Europe and co-author of the report, puts it: “The true challenge ahead is not mastering technology but safeguarding the depth of human thought in an age of…artificial intelligence.”