Students Aren't Obsessing About AI
Teachers Can't Stop
All summer, I’ve been reading, writing, and talking about AI in schools. I’ve connected with educators, strangers, friends, journalists, colleagues, and anyone who’d listen. Tomorrow I step back into a classroom of students who, thankfully, haven’t been thinking about AI these past few months. They’ve been working summer jobs, going to camps, traveling, pushing boundaries, maybe even making a few bad choices - all the things teenagers are supposed to be doing. None of them spent August parsing the latest LLM update or dreaming up ways to outsmart their teachers. That gap between their indifference and our obsession is what teachers need to reckon with this fall because once school starts, student AI use will surge again.1 For them, it’s a non-issue because, like TikTok, texting, or surfing the internet, AI is just a part of their daily reality.
Why does this matter? Because 99 percent of school should have nothing to do with AI. Like insurance and umpires, the AI policies I’m trying to develop will work best if we don’t have to think about them. They should be mostly invisible - in place for guidance, decisive when something goes wrong.
The best outcome this year is if AI policies serve as helpful guardrails, coming up only when needed. I don’t want to spend valuable class time policing ChatGPT or running an academic-dishonesty boot camp. If we reach January without spinning our wheels again over AI shortcuts, that will be a win.
But the only way to get there is to convince students that completely outsourcing work to AI hurts their long-term growth. Many will return to find teachers have restructured classes to emphasize in-person writing. If most students are going to use AI outside of class, it shouldn’t surprise them that teachers require them to produce more authentic thinking in a live setting.
That does not mean there are no classroom uses for AI. Any use needs to be intentional and tied to clear learning goals. Using AI just to use AI is a bad bet, especially with younger students. This is why I keep pressing for teacher training and AI literacy. The more comfortable adults are with the tools, the better they can spot where student use veers off track.
This also means meeting students where they are and letting them lead parts of the conversation, starting from the premise that most want to learn. If we engage them early with meaningful work and give consistent feedback, institutions have a chance to course correct from the past year. To re-cement the educational contract, we have to persuade students that relying solely on AI can impede their development. The only honest way to do that is to give them better alternatives and reduce the temptation.
I realize this sounds Pollyannaish but, at least at the HS level, I’m holding out hope that adults still have some sway over student opinions. They are still forming their values and work habits and, I have to believe, it is possible to impart how much we care about their academic development. Will it be perfect? I doubt it. But I’m confident that honest and real conversations with students about AI, which include sharing both our fears and our hopes, may reach some of them who know deep down how much AI off-loading is robbing them of valuable skills.
None of us know how the broader AI story in education will unfold this year. Will professors and teachers share success stories of how AI helped students learn - either by advancing an idea, enhancing an activity, or broadening a project? Or will the headlines keep circling back to cheating scandals, hallucinations, and broken promises?
Most unsettling are recent stories about the dire consequences when vulnerable students form relationships with AI chatbots. That trend is already forcing tech companies to respond faster than concerns about plagiarism ever did.
I wish I had a crystal ball.
Three Thoughts That Frame the Year Ahead
The past week of AI reading offered a few lines that stuck with me. Together, they offer a glimpse of where we might be headed:
“The question is not whether AI can do things that experts cannot do on their own - it can. Yet expert humans often bring something that today’s AI models cannot: situational context, tacit knowledge, ethical intuition, emotional intelligence, and the ability to weigh consequences that fall outside the data. Putting the two together typically amplifies human expertise.” A Better Way to Think About AI, The Atlantic, August 24, 2025
“AI stands for amplified intelligence. It can only extend and empower that expertise you already have. If you haven’t yet built any, chances are that AI will deceive you into thinking you’ve got it when you don’t, rather than helping you build it.” Josh Brake
You might think our undergraduate studies were a poor preparation for the fast-approaching world of the Internet. But they were not bad, because at root Oxford taught us … fungible skills:
1. To read copious amounts (five books, five articles a week).
2. To think (I learned that long walks helped with that).
3. To write (one or two handwritten essays a week).
4. To debate (in tutorials, often one-on-one with a “don”).
5. To remain clear-headed under stress (our final exams consisted of ten three-hour papers within a week).
The Cloister and the Starship, Niall Ferguson (referring to his education in the 1980s)
What these quotes make clear is that AI is not a substitute for human expertise. Used as a shortcut, it produces poor results and erodes the competencies students most need: the ability to read and write clearly and independently, and the discipline and flexibility to carry a project from an idea to a final product. These are not optional skills. They have always been foundational, and, as Ferguson suggests, will prepare students for any future, including one reshaped by AI.
Perhaps even more important are the so-called “soft skills.” Gen Z has been called out for lacking them, with good reason. The ability to empathize, listen, and be present already distinguishes today’s students fortunate enough to have cultivated it. Walk down the street in any American city without staring at your phone or hiding behind headphones and you’ll see how rare that focus has become. In a world where we constantly create our own private digital cocoons, the ability to offer sustained and undivided attention to another person might be the rarest trait of all.
The path forward is, at least for me as a high school teacher, starting to come into focus: recognize what AI can do, but keep the emphasis on the skills and habits that have always made students engaged, competent, and capable citizens. That means eliminating shallow assignments in favor of work that privileges reading with discernment, writing under pressure, sustained and personal projects, and authentic oral communication.
If that sounds like a return to basics, so be it. A wise colleague once told me the pendulum always swings back. The history of education suggests we know what works - inspirational and relevant tasks thoughtfully scaffolded on skills of increasing sophistication and difficulty, done together, in real time2 - and the pressures of modern schooling have eroded those tendencies. AI can be an acceleration of the downward spiral of the current model or an opportunity to drill down on what matters most.
That doesn’t mean schools shouldn’t adapt to the realities of AI’s enormous potential. But the most meaningful experiences in school involve human connection, and maintaining that connection in the spirit of inquiry and guided assistance is the most precious thing we have.
Teachers won’t get to choose how AI impacts the outside world our students will inhabit. What we do have control over is how and whether to use the AI “crisis” as a means to rethink our practice and strip down our curriculum to emphasize the fundamental skills AI will never replace.
In many ways, especially for someone teaching as long as I have, it’s a truly exciting time to be in the classroom. Despite the challenges, I’ve been re-energized by what AI is forcing us to confront. I’m still hopeful that advances in AI will lead to more learning, more abundance, and more opportunities long term. But what all my reading and reflecting this summer has done the most is remind me to center students, not AI, as the reason we do what we do.
Clay Shirky does an excellent job tracing the history of the modern university back to the Middle Ages which, perhaps unsurprisingly, did not involve much student writing.


