Imagine a University of Virginia classroom with wood paneling, tall windows letting in afternoon light, students on laptops, and someone’s coffee cooling on a desk. Piers Gelly, the English teacher at the front, is quietly making his students uncomfortable. He wants to know whether they still need him. Not as a provocation, but as an actual, graded inquiry. They will cast their votes at the end of the semester.
That experiment, which involved 72 students across four sections of his English class during the 2024–2025 academic year, might be the most honest thing a teacher has done with the AI question since ChatGPT arrived and upended every preconceived notion about writing, learning, and the true purpose of a classroom. Rather than banning the tool or ignoring it, Gelly made it the focal point of his course. He described it as posing the “central, existential question” to his students: if a machine can write in a matter of seconds, is learning to write still worthwhile? He told them it was their decision. He would present the evidence. They would then decide.
| Item | Details |
|---|---|
| Experiment Subject | Whether an AI chatbot could replace a human English teacher in a university classroom |
| Institution | University of Virginia (UVA), large public university |
| Instructor | Piers Gelly — English teacher, writer, and educator |
| Experiment Duration | Full academic year: 2024–2025 |
| Students Involved | 72 student writers across 4 class sections |
| Core Question Posed | Is it still necessary or valuable to learn to write? |
| Student Vote at End | Students voted on whether AI could replace their human teacher |
| AI Tool Used | ChatGPT (referred to by students as “Chat”) |
| High School Context | Ashanty Rosario, senior at a NYC public high school, wrote about AI’s effect on peer learning |
| London School Pilot | David Game College — introduced AI tools for 15-year-olds before standardized exams |
| Co-principal Quote Source | John Dalton, David Game College |
| LA Unified AI Chatbot | AllHere — financially collapsed despite millions in public funding |
| Key Critic Quote | “We are not seeing anything that could replace a quality educator” — Hadida Grabow, educational consultant |
| Bill Gates Position | AI tutors will be “like a great high school teacher” — but stopped short of endorsing full replacement |
It’s worth stopping to consider how peculiar that is. Teachers do not ordinarily audition for their own jobs. The social contract of education has always presumed, perhaps too conveniently, that the person at the front of the class is essential. Gelly rejected that presumption not out of nihilism but out of what appears to be sincere intellectual curiosity about where the line truly lies. OpenAI’s Sam Altman has called ChatGPT “a calculator for words,” a framing that is either comforting or deeply unsettling, depending on how you feel about language. Math class survived the calculator, the argument goes; English class will survive this. To his credit, Gelly chose not to simply assert that. He chose to investigate.
In the meantime, Ashanty Rosario, a high school senior in New York, was watching an uncontrolled situation unfold all around her. In September 2025, she wrote in The Atlantic about classmates dropping whole Frederick Douglass chapters into ChatGPT during class discussions, submitting machine commentary on human suffering as if that were a reasonable thing to do, and using the generated annotations as participation credit. Before the teacher had finished explaining the assignment, a peer in Algebra II photographed a homework worksheet and had the AI complete it. These weren’t isolated occurrences.
They were the new normal, and Rosario was genuinely alarmed by the passive inevitability with which they spread through the school. What got her wasn’t precisely the cheating. It was the disappearance of the 11:59 p.m. deadline as a social and academic ritual, the loss of the shared frantic energy of students genuinely trying to finish something on their own. “That’s gone now,” she wrote. And with it went something harder to name.

The most direct criticism has been aimed at David Game College in London, where co-principal John Dalton started a pilot project in 2024 permitting fifteen-year-olds to use AI tools ahead of their standardized exams. The stated objective was personalized learning: letting students proceed at their own pace rather than being bound to a classroom rhythm that hardly ever works for everyone. On paper, the idea is tenable. In practice, it didn’t land well. Critics pointed to AllHere, an AI chatbot adopted by the Los Angeles Unified School District with millions of dollars in public funding, which collapsed financially and left behind a product that frequently advised students to “ask your teacher.” The irony was overt. “We are not seeing anything that could replace a quality educator,” said educational consultant Hadida Grabow. “The technology just isn’t there yet.”
Observing all of this from a distance, I get the impression that the debate has been poorly framed from the beginning. The question isn’t whether AI can take the place of teachers; in most cases it can’t, at least not in a way that produces learning worth having. The more pertinent question is what happens to students who use it as a default, reaching for the chatbot before trying the idea themselves. The issue is not that AI does the work. It’s that students stop developing the mental habits that make hard work possible in the first place. Rosario observed this in her classmates: a tendency toward softening, an aversion to sitting with difficulty, a preference for the polished machine response over the unsure human effort. Whether that can be reversed is genuinely unclear.
In 2023, Bill Gates argued that AI tutors would eventually be patient, individualized, and always available, like a great high school teacher. He took care not to suggest that they ought to replace human teachers. That distinction takes real effort to maintain. In schools with long-term teacher shortages, in districts under financial strain, in colleges where English instructors are already overburdened with sections, the pressure to let the machine do more does not originate from educational philosophy. It comes from budget spreadsheets. And budget spreadsheets don’t ask whether students are genuinely developing their critical thinking skills.
There hasn’t been much in-depth coverage of what Gelly’s students voted on at the end of his semester, which may be significant or merely a reflection of how these stories are covered. However, the experiment itself—the willingness to leave the question open and allow students to sit with genuine doubt about the purpose of their education—may be worth more than the results. Because it is precisely this type of discomfort—the pressure to make a decision without the machine making it for you—that is vanishing from classrooms. Furthermore, it’s still unclear if anyone in authority has realized how much that costs.
