Ten Things on My Mind About AI and Education
Integrate it or reject it, but it is untenable not to adapt in response to generative AI
Thank you for being here. Please consider supporting my work by sharing my writing with a friend or taking out a paid subscription.

The last week or so has been pretty busy, and with the Thanksgiving holiday quickly approaching, I haven’t had much time to write. This week, I’m experimenting with a slightly different format, jotting down a few macro-level observations I’ve been pondering related to the state of AI and education as we look back on three years since the release of ChatGPT. I would be curious to hear what you think, so leave a comment if something strikes a chord!
There are good reasons both to reject and to integrate AI in your teaching. But whatever you do, we need to teach with an awareness of AI and the way it is shaping the entire educational ecosystem. The only untenable decision is to stick our heads in the sand and pretend that it’s business as usual.
Sometimes, what you stop is more important than what you start. Inertia and the status quo often drive our decisions more than intentional and thoughtful design. It’s not a bad thing that generative AI is breaking some of the things we do in school. If generative AI gives us leverage to pivot and stop doing things the same way as we’ve been doing them, we should take that advantage.
Generative AI poses the biggest challenge for assignments that put too much emphasis on product as a proxy for process. Redesigning our assignments to scaffold them into smaller chunks of work will not only help our students avoid the temptation to outsource the work to generative AI (e.g., by minimizing the number of heavily weighted assignments that are most likely to foster procrastination and tempt students to outsource their learning) but also help us more clearly articulate the habits of mind and practices we are trying to help our students build.
There’s never been a better time to rethink how you are assessing the work in your classes. I’ve written several times about how my own experiments with specifications grading have completely reshaped the way I think about assessment in my class. Trying out something new in your classes doesn’t mean that you need to completely jettison your current grading system. But it does mean that you ought to take the opportunity afforded by AI to consider how you might better align the extrinsic rewards you’re giving students through their grades with the learning you hope those grades represent.
Motivation matters more than ever. I am investing more and more time helping students examine their desires and reasons for taking a particular course, and helping them build motivation grounded in an understanding of why the work we are doing matters and how it is helping them build a flourishing life.
The most important part of feedback is that it is given by another human being. This is not to say that the feedback itself is unimportant. Good feedback on how to make our work better is important for its own sake. But no matter how many times Claude or ChatGPT tells you how brilliant and well-written your work is, it can never be as significant and meaningful as hearing a word of affirmation (or critique) from a human being you trust and respect. Education is a fundamentally human endeavor.
The environment in which students do their work matters as much as the work itself. In a world where generative AI is always only a few taps or clicks away, we need to think carefully about where we are asking students to learn. This might mean devoting class time to work that we’ve traditionally asked students to complete at home. We should do this not in an effort to surveil them and enforce compliance with a particular policy, but to help them to be their best selves by reminding them that the integrity of their work is about more than just themselves.
The most significant opportunities for AI in education right now have nothing to do with students directly interacting with AI. They have everything to do with educators using AI to create more effective and engaging learning activities. Some examples might be using generative AI to summarize and code written feedback from students on areas where they are struggling (e.g., by analyzing students’ muddiest-point or one-minute papers). Generative AI can also enable educators to create custom simulations or animations to help students visualize ideas more clearly. What would have taken a week to code a few years ago may only take a few minutes now with the help of AI coding tools like Claude Code.
Trying to detect AI use in your students’ work is a fraught endeavor. Even if we set aside the technical challenges of detecting AI use (and there are many), the much bigger issue in my mind is the impact that this is likely to have on your relationship with your students. I’m not arguing that you shouldn’t keep your eye out for tell-tale signs that indicate your students are short-circuiting their learning. This is not the same game as automatically detecting blatant plagiarism (a problem that can be solved, at least in part, by doing a direct comparison with material from other sources).
AI can help us learn more effectively. But it can also undermine our ability to learn, or worse, delude us into thinking we’re learning when we’re not. The best way to think about AI is as an amplifier. It is a tool that can extend our abilities. But we need to have raw abilities in the first place for it to extend. Even if you believe that AI will play a significant role in your future craft (e.g., for programmers), for much of a student’s early educational years, we should focus on building the raw abilities that the amplifier can later extend. Students will have no problem figuring out how to use AI, but if we don’t teach them the fundamentals, the amplification will be of limited use.
Got a thought? Leave a comment below.
Reading Recommendations
This essay from
in Virtues & Vocations, titled “Creating a Culture of Virtuous Leisure in a World of Total Work,” was the best thing I read this week.

So here’s my pitch: I think professors who genuinely care about the educational mission of their university should commit themselves to building “virtuous cultures of leisure,” and that administrators should do everything in their power to enable this work. In characterizing such cultures, I note striking overlap with practices that contribute to such cultures and practices associated with “leisure” in the classical sense. By more clearly articulating this connection, I aim to demonstrate how investment in the community of the university, in building up a culture of leisure and contemplative learning, promotes the learning goals we should all care about most deeply.
A short and sweet one from
telling you to lighten your grip.

Pope Leo XIV is a solid follow on X and is also writing and speaking a lot about AI. This article from the WSJ, “AI Is a Tool, Not a Soul,” helps shed light on some of the core elements of Pope Leo’s arguments.
The Book Nook
Not sure it’s a good idea to watch it right before bed, but I can guarantee you that the first episode of Pluribus will have you on the edge of your seat.
The Professor Is In
I was glad for the opportunity to visit Vancouver and speak to the K-12 faculty at Vancouver College last week. This was one of the first times I’d directly engaged with folks wrestling with AI from the early elementary grades all the way through high school, and I was encouraged by how thoughtfully they’ve been approaching it. I hope they took as much away from our time together as I did!
Leisure Line
Making pizza this week reminded me how valuable it is to make the dough several days in advance. Can’t beat the long proof.
Still Life
Had some unique dusk lighting one evening this week. The recent rain has brought a bit of variety to our normal SoCal sunniness.

Good thoughts, Josh!
I agree with your assessment reframing. For my course next semester, I've thought a lot about AI & assessment strats.
My assignments:
1 - a short paper
2 - a final 10 page paper
3 - a practical exercise which culminates in a class presentation & discussion
4 - a pen and paper mid-term
5 - a final oral exam
The last 3 are impossible to use AI for.
I will forbid the use of generative AI for writing. It's possible students could use it and successfully hide it. But the mid-term and final could include questions that relate to these papers, so a lack of learning will become obvious.
I've always believed in oral examination as a good assessment technique, especially in the context of a seminary.
But writing (and thinking through writing) must not be bypassed due to the threat of AI, nor should it be outsourced with the help of AI.
Hi Josh! Love this, especially ‘Redesigning our assignments to scaffold them into smaller chunks of work will not only help our students avoid the temptation to outsource the work to generative AI (e.g., by minimizing the number of heavily weighted assignments that are most likely to foster procrastination and tempt students to outsource their learning) but also help us more clearly articulate the habits of mind and practices we are trying to help our students build.’
I write about humanizing the future of learning. I’m developing Somagraphic Learning to make education more inclusive, especially with AI. 🌸
I’d love your insights on my latest piece: What if Probability could be Learned in 3 Shapes?
https://substack.com/@devikatoprani/note/p-180126474