- A professor warned that AI is making students dependent on Big Tech's algorithms for knowledge.
- Anthropic data shows students already use AI to write, edit, and even solve assignments directly.
- Kimberley Hardcastle of Northumbria University says education risks losing to algorithmic authority.
Generative AI isn't just changing how students learn — it's changing who controls knowledge itself.
That's the warning from Kimberley Hardcastle, a business and marketing professor at the UK's Northumbria University.
She told Business Insider that the rise of ChatGPT, Claude, and Gemini in classrooms is shifting education's foundations in ways few institutions are prepared to confront.
While schools and universities focus on plagiarism, grading, and AI literacy, Hardcastle said the real risk lies deeper: in students and educators outsourcing judgment to algorithms built by Big Tech.
Students are outsourcing the thinking process
Data from Anthropic, the company behind Claude, shows just how deeply AI has entered the classroom.
After analyzing about one million student conversations in April, the company found that 39.3% involved creating or polishing educational content, while 33.5% asked the chatbot to solve assignments directly.
However, Hardcastle said this isn't just a case of students "not doing the work." She said it's also about how knowledge itself is constructed.
"When we bypass the cognitive journey of synthesis and critical evaluation, we're not just losing skills," she said. "We're changing our epistemological relationship with knowledge itself."
In other words, students are beginning to rely on AI not just to find answers but also to decide what counts as a good answer.
"This affects job prospects not through reduced ability, but through a shifted cognitive framework where validation and creation of knowledge increasingly depend on AI mediation rather than human judgment," she said.
The 'atrophy of epistemic vigilance'
Hardcastle said her biggest concern is what she called the "atrophy of epistemic vigilance" — the gradual loss of the ability to independently verify, challenge, and construct knowledge without the help of algorithms.
As AI becomes more embedded in learning, she said, students risk losing the instinct to question sources, test assumptions, or think critically.
"We're witnessing the first experimental cohort encountering AI mid-stream in their cognitive development, making them AI-displaced rather than AI-native learners," she said.
"We're witnessing a transformation in cognitive practices," she added.
That loss could ripple beyond classrooms. If people stop practicing independent evaluation, society risks becoming dependent on algorithms as the arbiters of truth.
Big Tech's growing control over knowledge
Hardcastle warned that the deeper danger isn't just cognitive but structural.
If AI systems become the primary mediators of knowledge, she said, the Big Tech companies that build them effectively decide what counts as valid knowledge.
"The issue isn't dramatic control but subtle epistemic drift: when we consistently defer to AI-generated summaries and analyses, we inadvertently allow commercial training data and optimization metrics to shape what questions get asked and which methodologies appear valid," she said.
That drift, she said, risks entrenching corporate influence over how knowledge is created and validated — and quietly shifting authority from human judgment to algorithmic logic.
The stakes for education
Hardcastle said the question isn't whether education will "fight back" against AI, but whether it will consciously shape AI integration to preserve human epistemic agency — the capacity to think, reason, and judge independently.
That requires educators to move beyond compliance and operational fixes, and to start asking fundamental questions about knowledge authority in an AI-mediated world, she said.
"I'm less concerned about cohorts being 'worse off' than about education missing this critical inflection point," she said.
Unless universities act deliberately, she said, AI could erode independent thought — while Big Tech profits from controlling how knowledge itself is created.