AI tools are entering classrooms faster than education systems can adapt. Most of the public conversation focuses on whether AI will lift test scores, ease teacher workload, or widen access. These are real questions. They are not the most important ones.
A growing body of research suggests that uncritical adoption of AI could undermine the very thing education is meant to develop: learner agency, the ability of students to make intentional choices about what, how, and why they learn. Without careful design, AI risks producing the opposite of what it promises.
What the research is showing
A recent integrative framework on AI in education identified several distinct risks once these tools become routine in classrooms: over-reliance on AI responses, diminished critical thinking, emotional disengagement, and a narrowing of cognitive skills when AI is used as a shortcut rather than a scaffold. The authors argue that AI without careful design can reduce learner agency and critical thinking, and even undermine trust in the education systems where it is deployed. Their conclusion is that responsible implementation requires human-centric governance, not just better tools.
Student voices echo this tension. Surveys of learners show that while AI tools are widely used, many students report that the tools encourage passivity, reducing their opportunities to develop the deep thinking and creative problem-solving they want to leave education with.
Three principles for design
So how do we reconcile AI's potential with the need to protect learner autonomy? The evidence points to three core principles for educators, school leaders, and policymakers.
Design for agency, not dependency. AI tools should support self-regulation rather than replace it. That means systems that prompt students to reflect on AI feedback before accepting it, that show their working, and that make it possible for learners to disagree with the model.
Integrate AI literacy into curricula. Teaching learners how to prompt, interpret, and critically evaluate AI outputs is now a core literacy. Action research from Oxford has shown that critical AI use enhances learning-to-learn skills, increases autonomy, and safeguards against misuse.
Prioritise human-AI collaboration over substitution. AI should amplify human capacities such as curiosity, judgment, and purpose, rather than replace them. The goal is not to outsource thinking to machines, but to free students to do the thinking that matters more.
The future of education is not about AI or autonomy. It is about AI that strengthens autonomy.
The brain rot question
There is a wider concern sitting alongside all of this. When AI systems are optimised for speed, efficiency, and instant response, they reinforce the same logic that drives short-form content, a logic that has surfaced in recent global concern over "brain rot": shallow engagement patterns, shortened attention spans, and a diminished tolerance for ambiguity or complexity.
The real response to brain rot is not more tools. It is more time to think out loud, more time to sit with confusion, and more classrooms where finishing a thought matters more than producing a quick answer. Schools can respond by designing learning that rewards patience over speed, longer projects, fewer interruptions, and a culture in which complexity is embraced rather than abbreviated.
The bigger picture
The OECD's Trends Shaping Education work identifies four major forces shaping the next decade: AI and emerging technology, inequality and polarisation, changing skill demands, and the relationship between people and the environment. Education systems are being asked to prepare learners not just for jobs, but for participation in society and resilience in the face of rapid change.
That is the real bar. The question is not whether AI can boost test scores, but whether it helps students think, decide, and grow on their own terms. The design choices we make now will shape what an entire generation can do, and choose, with technology in their lives. As practitioners and leaders, the task is to ensure that technology empowers learners rather than diminishes them. Innovation alone is not enough.