What I Want Parents to Know About AI
Schools Still Don't Know What To Do With It.
You’ve heard it in carpools, on the sidelines, and in late-night kitchen conversations: “Kids are using ChatGPT to finish homework in five minutes.” “Our school just banned AI completely.” “My daughter says her professors can’t even agree on whether it’s allowed to organize her notes.” Student use of AI has taken off over the past year, and schools haven’t kept pace. We’re nearly three years removed from the release of ChatGPT, and most parents still don’t have straight answers. What’s the policy? What’s allowed? How should parents monitor, support, or discourage their children’s use of AI to complete their schoolwork?
This is the third in a series of Open Letters about the impact of AI in schools. The first was to students, the second to teachers, and now to you, parents. As both a 30-year classroom teacher and the parent of three (two recent college graduates and a rising sixth grader), I see the confusion from every angle.
Parents have a front-row seat to what’s happening behind the scenes - the shortcuts, contradictions, and experiments. As this school year begins, here’s what to know and what to ask for.
Dear Parents,
It’s the last week of August. College students are settling into dorms, many K–12 districts are already back, and the rest return right after Labor Day. I saved this letter for now because, as the 2025–26 school year begins, questions about AI will likely be top of mind for many families.
If you have a child under 18 or one enrolled in college, then generative AI will play a role in their education, whether intentionally or by default. If you don’t know that already, you should.
For parents of middle and high schoolers, the worry is how kids will adapt to a classroom where AI tools are always available. Writing is the obvious concern, especially in the years when students are still learning how to organize and communicate their ideas. If schools keep assigning high-stakes written work outside class with little guidance, AI can easily short-circuit that process.
At the college level, the anxiety shifts to the future. Parents hear predictions about the collapse of entry-level jobs, the demand for entirely new AI skill sets, and doubts about whether a traditional degree still holds its value. The educational and career paths we once trusted now feel less certain, and fear mixed with hype drives much of the conversation.
Add to that weekly headlines about students using ChatGPT to cheat, universities divided over how to respond, and teenagers turning to AI “companions,” and it’s no wonder parents are all over the map. Some are alarmed, others are curious, and most are unsure what to think.
While many of these stories are designed to drive clicks, the reality is a mess, with no clear path forward. That confusion is shared by teachers and administrators as well, but given that powerful large language models have been in students’ hands for nearly three years, it’s fair to ask: why don’t schools have more clarity?
Where’s the Plan?
AI usage and awareness vary dramatically across states, districts, schools, and even individual classrooms. Depending on where you live and who is in charge, you’re likely to see very different responses to how AI is handled or even talked about with respect to your child’s education.
The good news is that more states are at least starting to establish policies around AI. Some are experimenting with direct integration, making tools available through platforms like Google Workspace for Education, MagicSchool, or Flint. Others are moving more cautiously, acknowledging AI’s impact without embedding it into the curriculum. And some have gone in the opposite direction, blocking the technology altogether for both students and teachers. Yet as of July 1, 2025, nearly half of U.S. states still had no public guidance on AI at all.1
The bigger concern is that many of these decisions are being made with little consultation from a key constituency - parents. Families often find themselves guessing how their schools are approaching AI, because policies - if they exist at all - are rarely communicated in plain language, and when they are, they’re often too vague to be useful. Diligent parents may end up discovering an individual teacher’s specific rules for an assignment, an email from an administrator about “responsible use,” or, if they’re lucky, a policy buried deep on a school website.
As Mike Whitaker, a parent in Colorado, puts it: “Can your school leaders explain their AI policy in 30 seconds or less?” That doesn’t seem like an unreasonable request.
For parents of college students, the communication gap is even wider. I’ve heard countless stories of families dropping their kids off and asking about AI policies, only to be met with blank stares or half-hearted reassurances. The most detailed information often comes secondhand from student guides, upperclassmen, or rushed orientation sessions. Universities may be scrambling behind the scenes to manage plagiarism and policy, but what parents and students actually hear is contradictory, confusing, and rarely reflective of a clear institutional position.
This lack of communication leaves families piecing together their own understanding of how AI is being addressed. And when institutions avoid transparency, it’s not merely frustrating. It leaves students unsure of what’s expected and parents unable to help. If colleges are still this muddled nearly three years in, it’s no wonder K–12 schools are struggling too. Simply stating “we’re working on it” isn’t going to cut it this year.
Parents notice these gaps and inconsistencies, which frequently pop up in online discussions:
“My child’s essay was flagged by an AI detector, but even the teacher admitted those tools aren’t reliable. How are we supposed to trust the process?”
“Our school’s AI policy is literally one sentence: ‘Use best judgment.’ What does that even mean?”
“I just found out my kid’s paper was graded by AI. No one told us beforehand.”
“Teachers don’t even agree. One bans ChatGPT outright, while the classroom next door encourages it.”
They are understandably frustrated. Given that powerful AI tools are freely available almost everywhere, parents deserve more specifics. What’s allowed? What isn’t? Who’s making these decisions, on what basis, and how often are they being revisited?
First Up: Get Informed
AI tools have changed since 2023. LLMs don’t just draft text. They can code, generate audio and video, turn notes into slide decks, and even carry out multi‑step tasks on their own. Most students aren’t using all these features, but they will, and many apps your kids already use now come with AI built in.
Where does that leave parents? Some may already recognize these trends from their own workplaces, where AI is reshaping everything from marketing to medicine. Others may be encountering it mainly through their children. Either way, it’s worth getting oriented.
The first step is to get informed - not necessarily about the models themselves, but about the issues. There are plenty of resources. Some of the best are right here on Substack, where educators and researchers are debating not just the latest AI tools but the broader cultural and educational shifts they bring. I recently pulled together a list of voices worth following, all deeply engaged in the AI and education debates. The goal isn’t to become an expert in every tool but to get a clearer sense of where things stand today.
If you’re more interested in the bigger picture, the legacy media has been paying close attention. The New York Times, The Atlantic, The New Yorker, and a host of other sources have all published in-depth pieces on AI in education over the past several months, and new stories appear almost every week. These articles don’t capture the granular back-and-forth you’ll find on Substack, but they do show how the issue has broken through into the mainstream conversation.
Next: Talk to Your Kids
Getting informed on the AI landscape, even if you just scratch the surface, is a start. Then listen to your child. The more you understand the basics of what AI can (and can’t) do, the better questions you’ll be able to ask them. It’s not about shaming or catching them (or their friends), but about comparing what you’ve read to what they’re actually experiencing.
Ask about what they’ve seen, how they’re using AI, and what they’ve heard from teachers and peers. Some younger students may only think of it as a “cheating tool,” because that’s how the media and their teachers often frame it. Older ones are grappling with more serious questions about AI - the environmental impacts, economic implications, and existential risks. That context will help you ask for detailed information and more clearly express your concerns to school leaders.
What Parents Should Be Asking This Fall
As your children head back to school, the best way to learn about your school’s AI policies is to ask. District leaders, principals, and teachers should be able to answer concrete questions about what’s being done to establish rules around AI that reflect the values of your institution. If you’ve taken time to get informed, you’ll be prepared to dig a little deeper. But at the very least, you should come away with a solid understanding of your school’s overall AI philosophy.
The point isn’t to demand final answers. We all know this is a unique moment in education and even experts are trying to figure out best practices. But you are entitled to expect schools to take these issues seriously and not simply kick the can down the road. These questions aren’t exhaustive, but they’re a solid place to begin, whether at the next PTA coffee, board meeting, or back-to-school night:
1. Ask About Policy and Transparency
Does the district or school have a written AI policy that parents can access?
Are students allowed to use AI? If so, in which subjects and under what conditions?
Is AI being used for grading? If so, which tools and how are they monitored?
2. Ask About Teaching and Training
Are students being taught how to use AI responsibly? Are teachers receiving training as well?
How are assignments designed to assess student learning rather than simply produce AI‑vulnerable written output?
Is the school helping students understand the difference between “using AI to learn” and “letting AI do the work”?
If AI is banned, how will that be enforced consistently across classrooms?
3. Ask About Equity and Literacy
Are students being taught how AI actually works, including how models are trained, what data they rely on, and where they can go wrong?
Are schools helping students think critically about AI’s wider impacts, such as privacy and copyright concerns, and the environmental costs of training and running these systems?
What safeguards are in place to keep students from becoming over-reliant on AI for their schoolwork?
How are students with special needs - including English language learners and those with IEPs - being supported?
If student work is flagged as inappropriately “AI-generated,” what is the process for investigating, communicating, and appealing before discipline is imposed?
What does a good answer to one of these questions sound like? It doesn’t have to be perfect or permanent. It could be as simple as: “Here’s what we’re allowing, here’s what we’re not, here’s why, and here’s when we’ll revisit this policy.” Clarity and honesty beat silence every time.
You should add your own questions based on the unique circumstances in your community. Maybe your school is further ahead or further behind. Talk to other parents and pool information. Above all, be firm, direct, and kind. We all want the best for our children, and we will need a wide range of voices on such a complex issue.
Parents Understand the Challenges - And Are Empathetic
If you feel uneasy about all of this, you’re not alone. Parents recognize that the arrival of generative AI blindsided many educators and disrupted aspects of schooling most of us once took for granted. The questions above should start a conversation, not a confrontation. Schools need to know parents are partners rather than adversaries. As partners, we want to see human judgment, creativity, and care remain at the center of instructional practice regardless of whether AI has a place or not.
Parents also understand this isn’t going to be solved overnight. Sorting out the role of AI in schools is likely to take years, not months. The technology is already powerful enough to upend much of what students have long been required to do, and it will only grow more sophisticated. Sitting on the sidelines isn’t an option. Families want schools that are proactive, transparent, and willing to keep adjusting, not institutions waiting for the storm to pass.
It Can Be About More Than Just AI
You don’t need to master every new AI tool to advocate for clear school guidelines. Start with questions. Push for clarity.
AI may even be the spark for a real opportunity. It is forcing educators to confront questions about how students really learn, what skills matter most, and how schools can adapt without sacrificing the human relationships and aptitudes at their core. Those are discussions that should have begun well before AI entered the classroom, and parents have a vital role in making sure they aren’t postponed again.
As both a teacher and a parent, I know these conversations aren’t easy. But with or without AI driving them, they are long overdue. And if this moment draws parents, teachers, and school leaders into a more honest and productive dialogue about pedagogy, then perhaps AI will have done us one unexpected favor.
1. This figure is based on State AI Guidance for K12 schools.