Imagine summoning a teaching assistant by snapping your fingers. That’s the dream, right? One moment you’re staring at a blank lesson plan, the next there’s someone beside you with a stack of worksheets ready to go. The magic trick is impressive—until they also start offering to write your end-of-term reports, mark your essays, and, disturbingly, draft your wedding vows.
Artificial Intelligence has recently taken on the role of an over-eager teaching assistant. You know the type: they appear at your elbow with astonishing speed, sometimes handing you exactly the materials you need, sometimes presenting an essay so suspiciously polished you wonder whose work it really is. At their best, they can make your life easier; at their worst, they generate three pages of nonsense and then ask if you’d like them to staple it.
Of course, anyone who’s ever worked with a real teaching assistant knows the secret: it’s not about whether you’ve got one, but how you manage them. Give them clear instructions, set boundaries, and you’ll wonder how you ever coped without. Let them run wild, and suddenly they’re redesigning your lesson plan from scratch and colour-coding your entire filing cabinet.
The issue isn’t “AI: friend or foe?” so much as “AI: are you letting it help you, or is it already rearranging your desk while you weren’t looking?” Used well, it can give instant feedback, create fresh practice opportunities, and even coach teachers through reflection. Used badly, it reinforces stereotypes, fabricates citations, and convinces you it wrote your star student’s homework.
It’s tempting to frame AI as something unprecedented, a once-in-a-generation upheaval. Step back, though, and a familiar pattern emerges: every new aid is first branded a threat before it becomes standard practice. Calculators were going to kill mathematics, SparkNotes were going to destroy reading, Wikipedia was going to flatten research. Yet none of them erased the need for learning; they shifted the baseline, and students are now expected to know more, do more, and do it faster. AI sits squarely in that lineage, as threatening as the calculator once seemed and just as likely to become ordinary. It is simply the newest “forbidden cheat sheet,” and banning it won’t change the fact that it exists. What matters is teaching ourselves, and our students, how to wield it critically.
While it is easy to joke about cheat sheets, the truth is they have never really lightened the load; they have simply changed its shape. For students, that means higher expectations. For teachers, it means new demands disguised as “support.” The pattern is clear: each tool makes some tasks easier, but the overall workload never shrinks. If anything, it grows, because someone, somewhere, decides that efficiency should mean squeezing more out of us. This is where the joke stops being funny, because the punchline is bureaucracy. And that is where capitalism sidles in.
Teachers have long been buried under box-ticking rituals and paperwork theatre: endless forms, reports filed away unread, data-entry that keeps popping up like weeds. Hours that could be spent with students are instead swallowed by paperwork designed to “prove impact” to someone who has never set foot in the classroom. In that context, an over-eager assistant is genuinely welcome.
The trouble starts when that same assistant is told to boost your productivity rather than lighten your load. It’s the difference between having them file a few worksheets and watching them re-alphabetise your entire library, then proudly announce you can now cover twice as many classes. AI could help us claw back time—time to focus on learners, or even (radical thought) to rest. But instead, there’s the danger it becomes just another mechanism to put more pressure on an already overloaded profession. A tool built to relieve us is too easily repurposed into one more set of expectations, leaving the teacher with less breathing space than before and yet another demand to “do more with less.”
Ultimately, AI is a mirror of the values we build into it. If we train it only for speed and dominance, we risk narrowing our definition of intelligence. If we train it for care, patience, and creativity, it can help nurture those very qualities. The question isn’t whether the teaching assistant is in the classroom—it already is. The question is: what jobs do we assign it, and what kind of classroom culture do we want it to serve?
In this issue of the Think Tank, our contributors take up this challenge, and we’ve paired their pieces with videos that show the conversation in motion. You’ll hear about AI as tutor, reflective coach, and creative partner, but also about the risks of shortcuts that flatten complex reading into SparkNotes summaries, and of detection tools that wrongly accuse diligent students. You’ll see how cultural values shape what “intelligence” even means, and why ethics can’t be left to experts in distant boardrooms. The articles that follow unpack these themes in much greater detail, from the ethics of data use to the effects on teacher perception and well-being. Taken together, they remind us that while we may not have chosen this new assistant, we can decide how to coach it, when to trust it, and when to gently but firmly send it back to the staffroom for a quiet timeout.
Things to Keep in Mind When Talking AI
Nicky De Proost is a teacher with a love for stories and a fascination with the tools that shape them, always seeking ways to bring curiosity and critical thinking into her classroom.
