Discussion about this post

Bette A. Ludwig, PhD 🌱:

Well, with Microsoft making its play and Google already in some school districts, it does seem like that might end up happening. It's going to be a hot mess. But that's what happens when you implement huge systems too fast. Mass chaos.

Roi Ezra:

This captures what I've seen in corporate AI adoption: the same pattern of mistaking the tool for the goal.

In my work helping engineering teams adopt AI, the breakthrough comes when we flip the question. Instead of 'How do we integrate AI?' we ask 'What kind of thinking do we want to preserve and amplify?'

Your Alpha School example is fascinating: 2 hours of AI-driven core subjects feels like optimizing for the wrong metric. Speed isn't learning. What if those extra 6 hours weren't just 'practical skills' but space for the kind of messy, unoptimizable thinking that AI can't do?

The rubric example hits on something: when we expect 'flawless' output, we're training students to optimize for AI's strengths rather than develop their own. That's not augmentation; that's replacement.

What would curriculum look like if we designed it around what humans uniquely bring (curiosity, ambiguity tolerance, meaning-making) and used AI to handle the friction that prevents deeper work?
