The Quiet Erosion: How AI Is Quietly Weakening Our Critical Thinking

Something is happening to our capacity to think. It’s not dramatic. There’s no single event to point to. It’s quiet, cumulative, and it’s already here.

A survey of 9,000 secondary school teachers in England, reported in The Guardian on April 2, 2026, found something that should concern everyone: two-thirds of teachers have observed a decline in critical thinking among pupils who use AI. Two-thirds. Not a small sample. Not a marginal effect.

“Students are losing core skills — thinking, creativity, writing, even how to have a conversation,” one teacher told the National Education Union. Another was more blunt: “AI is destroying what ‘learning’ — problem-solving, critical thinking and collaborative effort — is.”

The Students See It Too

Education Week reported in March 2026 that nearly 7 in 10 middle and high school students say they’re concerned that using AI for schoolwork is eroding their critical thinking skills. And here’s the kicker: that concern is rising, not fading.

They’re not alone in noticing. Research published in a SAGE journal on generative AI’s impact on critical thinking noted that the integration of tools like ChatGPT has raised concerns that students may become dependent on AI-generated solutions, potentially stifling the development of the very skills needed to evaluate those solutions.

It’s a vicious circle: we need critical thinking to evaluate AI, but we’re using AI in ways that atrophy critical thinking.

What’s Actually Disappearing

The Guardian report highlighted specific skills on the decline:

Basic reasoning — When AI handles the “thinking” part, the muscle that does the thinking weakens. Like any skill, critical thinking requires practice. If you outsource every problem, you lose the ability to solve them.

Writing — Not the typing, the thinking that precedes it. The structuring of ideas, the selection of words, the editing process that sharpens thought. When AI generates the first draft, the human goes straight to submission.

Spelling — One teacher noted students “no longer feel the need to spell because of voice-to-text technology.” That’s a small symptom of a larger disease: if AI handles it, why learn it? But spelling is to writing what a foundation is to a building — skip it and everything above becomes unstable.

Conversation — Real-time exchange of ideas, the back-and-forth that sharpens thinking. When we replace dialogue with prompts and responses, we lose the training ground for reasoning with others.

The Teacher’s Dilemma

Here’s what makes this complicated: 76% of teachers now use AI for their day-to-day work (up from 53% last year). They’re using it to create resources (61%), plan lessons (41%), and do admin (38%). Only 7% use it for marking.

So the people responsible for teaching critical thinking are themselves becoming dependent on AI. Not necessarily a bad thing — but certainly a complicated one.

And 49% of schools lack any policy governing AI use by staff or students, while 66% have no policy specifically for students. There’s no framework for how much is too much, or for when AI helps and when it harms.

The Real Problem

The issue isn’t AI itself. It’s what AI makes easy.

When you can get an essay in seconds, you stop learning how to write one. When you can get an answer instantly, you stop learning how to find one. When you can get a solution on demand, you stop learning how to solve.

Critical thinking isn’t a passive skill — it’s active muscle. It requires:

  • Questioning assumptions
  • Evaluating evidence
  • Considering alternatives
  • Recognising bias
  • Forming independent judgments

All of these require effort. And effort is exactly what AI makes optional.

What We Lose

Education exists to develop minds, not just fill them. The point of learning isn’t the information — it’s the thinking that happens while acquiring it.

When we skip the thinking because AI handles the content, we lose:

  • The ability to tell good information from bad
  • The confidence to form our own conclusions
  • The practice of rigorous reasoning
  • The capacity to question what we’re told

That’s not about exams or grades. That’s about functioning as an informed citizen in a world where information is abundant but truth is contested.

The Counter-Argument

AI proponents will say: we’re not replacing thinking, we’re freeing minds for higher-order work. Students can use AI for the mechanical parts and focus on the creative, analytical parts.

That’s true, in principle. In practice? The Guardian survey shows teachers aren’t seeing that. They’re seeing atrophy, not elevation.

The same SAGE research noted that Bloom’s Taxonomy — the framework long used to design educational goals — fails to account for the cognitive changes AI introduces. The framework assumes the student does the cognitive work. When AI does it instead, the taxonomy breaks down.

The Path Forward

This isn’t an argument against AI. It’s an argument for boundaries.

The teachers who responded to the NEU survey were far from evenly split: 49% opposed the government’s plan for AI tutors, just 14% agreed, and the rest were uncertain.

Their concerns weren’t about technology — they were about what replaces the human in the learning relationship. “Students who need tutors often need more than academic support. AI will not give them that.”

The question isn’t whether students use AI. It’s what they’re doing while AI does the rest.


We don’t stop thinking because we have answers. We stop thinking because we stop asking questions. That’s what AI makes too easy.