Last week I had lunch with one of my graduating seniors before he headed off to college. I asked him how colleges and universities should handle AI and whether he’d be willing to share his thoughts in a guest post. I’m glad I asked. His answer is honest and insightful, offering a window into how the next generation is already thinking about this massive “generational” shift.
Here’s what he wrote:
I started high school before ChatGPT arrived and before AI chatbots were widely used in schools. Yet by the time I graduated this spring, AI tools were ubiquitous and essential, and most of us couldn’t imagine living without them.
As I head into my first year of college, I am both intrigued by and concerned about how AI will continue to evolve and shape our education. I’m worried that most universities, including mine, don’t yet have the infrastructure in place to take advantage of this generational technology. And I worry that without real instruction on how to think with AI, students will become dependent on it in ways that hurt their learning and development in the long run.
Universities need to strike a delicate balance: protect against blind dependence on AI by preserving students’ ability to think for themselves, retain information, and write coherently, independent of AI, AND leverage AI in deliberate and structured ways to enhance their learning.
Protecting against AI “brain drain”
There may be no such thing as a fully “AI-proof” assignment, but professors can design assessments that make it much harder to cheat. That means assignments that require original claims, abstract thinking, and creativity - the things AI still struggles to do well. These higher-order thinking tasks not only expose shallow AI-generated work but also promote real learning. Minimizing the number of times students use AI to think for them will reduce AI brain drain - the potential for AI to erode human cognitive abilities through increased reliance.
Also, while not perfect, AI detectors like GPTZero and Sapling do exist, and they can catch students lazy enough to copy and paste large blocks of chatbot text into their essays. Professors should make this clear to students to discourage the behavior.
Leveraging AI to enhance the academic experience
I want to believe that most students genuinely want to learn and grow intellectually in college. But many have no clue how to use AI properly to enhance their thinking. Additionally, I know students who are afraid even to explore AI because of the stigma around using it in school - this is also unhealthy, because they are missing out on everything AI has to offer.
I would argue that students would use AI more effectively if they perceived it as a social norm and understood how it can help learning. I also believe that destigmatizing the use of AI and incorporating it carefully into curricula would reduce irresponsible use.
We know this from the college drinking debate: when drinking is banned on campuses, people pretend they don’t drink, and because it is rarely talked about, incidents of irresponsible drinking increase significantly. The same thing will happen with AI unless it is acknowledged and taught. Responsible and effective AI use is a skill, and skills need to be taught.
Therefore, faculty, especially in college, should be trained to incorporate AI into classroom assignments. Examples might include using AI to critique students’ own writing and identify bias, or even to provide one-on-one tutoring.
In fact, Anthropic recently announced new “learning modes” for its Claude AI assistant that are designed to discourage dependence on AI-generated answers. Similar tools include Study Mode for ChatGPT and Guided Learning for Google’s Gemini. I would certainly like to see how tools like these can improve my learning in a systematic way.
What should be done at an institution-wide level?
As a high school student, I appreciated when teachers allowed the use of AI but set firm boundaries, so we all knew where the line was. However, while those guidelines detailed when we could and couldn’t use AI, they didn’t cover HOW to use AI effectively.
This is why I believe that each year, college and university students and faculty should be required to complete a university-specific training module, up to two hours long, on how to use AI effectively to enhance learning, informed by research. Topics covered could include:
Dependency/reliance on AI
Intellectual skill atrophy (a.k.a. AI brain drain)
Implications and examples of successful human-AI collaboration
The learning process and importance of critical thinking
Academic integrity
Bias and potential inaccuracy of AI output
Data privacy concerns with respect to AI
This training should also outline the university’s general attitude towards AI use (though I believe the specific policies ought to differ among academic departments), and make clear what using AI to cheat looks like.
This raises the question of how students should be allowed to use AI on writing assignments. Most students would probably be comfortable using AI for idea generation, idea honing, outlining, and refining. I would say that using AI for idea generation should be discouraged, but that idea honing, outlining, and refining may be permissible. However, there needs to be a way to ensure that students are actually learning from the AI feedback. If they are simply inputting a prompt and then turning off their brains, that does them no good. Perhaps the training could include exercises where students identify how and where the AI improved their writing or idea, and why the proposed edit was an improvement.
Finally, universities ought to put in place a standardized framework for citing AI, or other acceptable ways of acknowledging its use, when a student uses it for assistance. This would normalize the use of AI in specific situations.
What about AI-use for studying, organization, and class preparation?
College students are adults. A recent federal lawsuit revealed that over 80% of Fortune 500 companies use ChatGPT, and it is widely used in the workforce to increase productivity and streamline workflows.
So there should be little regulation of AI use in non-graded scenarios; if students want to use AI to deepen their knowledge or to aid pursuits outside the classroom, more power to them. They have to learn to use it responsibly, as adults do with any other tool. After all, the college experience happens both in and out of the classroom, and colleges have a responsibility to prepare students for the workforce. What we don’t want are severe disparities in comfort with AI - students entering the workforce with no idea how to use AI effectively to increase productivity while peers around them have been taught how to use it.
So what might responsible use of AI look like? In education, it looks like using AI to understand the “how” and “why” questions. I’ve been using ChatGPT for over two years. My chat history is all over the place, from asking how to maximize the crispiness of a pizza crust to having it speculate on why some animals synthesize ascorbic acid while others don’t. AI is immensely useful for learning concepts efficiently, and for finding clear, thorough explanations of specific questions that may be difficult to Google.
Sometimes I treat ChatGPT as my research partner as I parse through literature. Sometimes I treat it as my thesaurus, sometimes as my linguistics professor, and sometimes as my philosophy pal. And I can honestly say I have learned so much from it. I essentially have an expert on all things at my disposal, one that understands what I’m trying to say even when I can’t properly put it into words. It has made me more intellectually curious and eager to explore the nooks and crannies of knowledge and creativity.
Concluding thoughts
Ultimately, AI is going to be a part of our lives forever, and there’s no point trying to completely shield students from it.
The whole point of college is for students to grow their ability to think and to become knowledgeable in their chosen field; to uphold that purpose, AI use should be regulated and taught well.
But with the advent of AI tools like Cluely (and dozens of others), which can essentially tell you exactly what to say, and when, and how, in almost any situation, I feel as though trying to “protect” students from the full extent of what this technology can do is short-sighted.
As long as we shy away from the mentality of “AI will make it such that I will never have to think again” and align more closely with “AI will teach me how to think and learn better than ever before,” we should be ok. And I believe it’s up to colleges and universities to help us get to that point.
Reviewing Vivek’s reflection reminded me just how far ahead many students already are when it comes to AI. It’s impossible to read his ideas and conclude that this is a student who wants to “cheat” with AI. He reinforces what many of us already know - most students have moved past the decision of whether to use AI to the more important question of how to use it. And while some, like Vivek, are clearly learning from it, most desperately want more guidance from the adults in charge of their academic programs, and want those adults to engage and reckon with that reality. His recommendations directly address the faculty confusion and student-professor disconnect that recent reporting confirms is widespread across American universities.
I want to thank Vivek for his willingness to share such a thoughtful perspective on a topic that generates intense debate in educational circles. His insights reflect the reality many students are navigating, often without institutional support. While this represents one student’s experience, it aligns with broader trends documented in recent reporting, including yesterday’s piece in The Atlantic, “College Students Have Already Changed Forever.”
SRF
As someone who teaches high school juniors and seniors, your student identified everything I have had in mind for the past two years. I think we need to give ourselves a break in that we are not likely to get this "right" -- the balance between AI usage and creative thinking -- for quite some time. Continued experimentation and learning are needed, as well as the sharing of many different ideas and perspectives. Thanks for being a part of this process.
This is brilliant! Bravo to this student for being so thoughtful, and thank you, @Stephen, for putting this out. The described two-hour training would be very worthwhile.