If you’ve found yourself tinkering with ChatGPT to help write a cover letter or polish a paragraph, you’re not alone. But if you’ve worried that doing so might slowly rot your skills, you’re in good company too. That’s the worry that Angela Duckworth – the researcher best known for her work on ‘grit’ – and her co-authors set out to test in a recent, tightly designed experimental study titled Coach Not Crutch: AI assistance can enhance rather than hinder skill development. It’s a paper with implications far beyond cover letters – and one that makes a strong case for rethinking what learning with AI might really mean.
Published in its final pre-print form at the end of September 2025 and co-authored by Maria I. Lira, Todd Rogers, Daniel L. Goldstein, Laura M. Ungar, and Angela L. Duckworth, the paper brings together researchers from the University of Pennsylvania, Harvard University, Microsoft Research, and Teachers College at Columbia University. Its findings are stark, surprising, and – in this educational moment – deeply consequential.
The Experiment in Brief
The researchers posed a now-familiar question: Does using AI to assist with writing help or hinder skill development? Their answer, however, cuts through much of the noise of community conversation.
In a multi-part experiment involving five separate studies, participants first completed a baseline writing task (revising a poor cover letter), then read a short lesson on effective writing strategies. They were randomly assigned to one of several practice conditions:
- with an AI writing assistant
- without AI
- with feedback from professional editors
- or with access to Google Search to find tips and examples.
Finally, all participants were asked to write a new cover letter without any assistance—first immediately, and then again one day later.
Let’s get some of the findings out there!
The detailed research paper found:
- Those who practised writing with AI showed significantly greater improvement than those who practised without it, and…
- The benefits of this AI-supported practice were retained by participants a day later.
Even more strikingly,
- AI-assisted writers improved more despite putting in less effort.
The very first line of the findings section says it all:
“Contrary to predictions by lay forecasters, participants randomly assigned to practice writing with access to an AI tool improved more on a writing test one day later compared to writers assigned to practice without AI – despite exerting less effort”.
Beyond the Expert
Within the five experimental studies covered by the paper, the researchers compared AI-supported practice with feedback from professional editors and Google-based help.
Again, the AI group outperformed the others.
In fact, the AI support beat the professionals.
“Practicing with AI increased skill more than feedback from human editors (or simply googling examples and tips)”.
This in itself is provocative – but the most arresting finding may be the next one. I find it challenging, and it seems worthy of further exploration in future research: just seeing an AI-generated example, without any practice at all, was as effective as practising with the tool. According to this paper, participants who simply read an AI-written letter improved as much as those who practised rewriting with AI support.
“Writers merely shown an AI-generated cover letter (without the opportunity to practice) performed as well as writers who practiced writing with the original AI tool”.
Why? The authors argue that current-generation AI systems may be functioning as exemplars – offering tailored, timely, high-quality demonstrations of abstract principles.
“The current generation of AI tools may teach by example, offering high-quality and personally tailored demonstrations of abstract principles that are otherwise difficult to grasp”.
This is more than tech novelty – it speaks to the heart of what good learning looks like. The best AI tools may, as the authors suggest, outperform textbooks, pre-set tutoring systems, and even subject domain experts when it comes to offering just-in-time, customised examples of what ‘good’ looks like.
“AI tools may provide higher quality, just-in-time examples exquisitely tailored to a learner’s specific needs… [and] might surpass even those provided by domain experts”.
The Bigger Question: What Is Learning With AI?
All else being equal, the more attention and effort a task demands, the better the learning tends to be. (Note this doesn’t mean something as simplistic as saying that “learning should be hard”.) But here, in this paper, AI support both reduced effort and improved learning.
This is challenging. Interestingly, the findings point to the need for tasks to be meaningful and accessible.
The authors propose a mechanism to explain the paradox: they point to the utility of the practice task itself.
In other words, when the task is higher quality – even if less effortful – it can still lead to deeper learning. That’s a massive insight.
“Why did practicing with AI improve writing skill more than practicing alone, even while reducing both the quantity and quality of effort?… We propose a third mechanism that offsets decrements in human effort: the utility of the practice task itself”.
Put another way: learning is not just about how hard you work – it’s about how good the learning task is.
And AI, if used well, can make the learning task more focused, clearer, more targeted – in other words, more accessible and more meaningful. That’s a radically hopeful idea for anyone rethinking how we structure learning – especially in classrooms.
Deliberate Practice, Grit, and AI
The authors draw a clear link between their findings and the deliberate practice literature. Anders Ericsson and Robert Pool’s Peak famously argues that the best learners refine their skills through focused, feedback-rich, and mentally demanding practice. (The ‘10,000-hour rule’ is a bit of an over-simplification.) A central piece of this is forming detailed mental representations of excellent performance.
That’s what AI seems to provide here – high-quality models that help learners internalise what ‘excellent’ looks like. In this light, AI isn’t just an assistant; it can be a scaffold for cognition.
Angela Duckworth’s own work on grit — the perseverance and passion for long-term goals — is relevant here too. Learning isn’t just about working harder. It’s about working smarter. And this study shows that AI can help learners do exactly that.
Against the Hype and the Fear
At a time when educators (rightly) worry that AI might short-circuit skill development or encourage corner-cutting, this paper might land like a dose of cold water – and calm insight.
“Findings should temper widespread concern that AI tools invariably boost momentary productivity at the expense of long-term skill development… AI writing tools reduced the effort users invested in practicing, [but] nevertheless accelerated their skill development”.
This is not a call for uncritical adoption.
It’s a caution against blanket bans, or the assumption that ‘more human effort’ is always better.
AI doesn’t have to be a crutch.
Used well, it can be a coach – and a pretty good one.
This paper offers insight into how we might start using AI more effectively in our teaching.
The final sentence of the paper pulls the lens wide:
“Generative AI now makes it possible to learn from personalised, just-in-time demonstrations tailored to any domain. In doing so, AI has the potential not only to boost productivity but also to democratise opportunities for building human capital at scale”.

