Bring Back the Computer Lab
Kids need to learn about computers; they don’t need computers to learn
Americans of a certain age remember the Computer Lab. It was that magical room in the school with enough computers to accommodate every student in a class, so that all could experience together the wonders of Oregon Trail, Number Munchers, and Mavis Beacon. In the Computer Lab, we learned to use the PC: to maneuver through a file system, create a document or spreadsheet, and then later to send an email or construct a search query: “pluto” AND “planet” NOT “disney”. Pluto was still a planet back then.
Crucially, the existence of the Computer Lab reflected the importance of learning how to use a computer, not the importance of using a computer to learn anything else. “Computers” did not teach; they were a subject to be taught. Computers did not help you to read or do math, to understand the Roman Empire or Newton’s Laws. Time on a computer was still an expensive resource that could not be distributed willy-nilly to all classrooms for all purposes.
As the cost of computing fell, educators and technologists began to envision a personal device for every student. Low-cost laptops running open-source software made this feasible. COVID made it reality. And once every child could have a computer, and then did have a computer, the irresistible logic of technophilia determined that every child needed a computer. Who would deprive our children of such a resource? How else will we compete in the global economy of tomorrow?
To paraphrase Dr. Ian Malcolm’s comment regarding the ill-advised over-deployment of technology for purposes of cloning dinosaurs, our schools were so preoccupied with whether or not they could, they didn’t stop to think if they should. The imperative of learning how to use a computer segued seamlessly into the imperative of using a computer to learn, so subtly that few seemed even to notice that these are entirely different premises. The continuing conflation has become especially obvious as discussions now proceed about the use of AI models. The argument goes that (1) all students need to know how to use AI, therefore (2) AI should be integrated throughout the educational experience. This does not follow.
Children learn when they have a relationship with a teacher, when they are engaged, when they interact, when they figure things out. Perhaps that sounds idealized—all well and good for the child fortunate enough to be learning from such a teacher in such a classroom, but not reflective of the typical experience. Maybe. But those same limitations apply equally to the deployment of technology, which can indeed be a valuable tool when used well, but compounds rather than compensates for an ineffective environment.
One can find literature reviews of controlled experiments showing greater, lesser, or even negative effects of technology in the classroom, depending on the type of technology and the context of its deployment. But I would suggest that the better evidence comes from the behavior of the technologists in the wealthy communities who design the products but want them out of their own children’s classrooms. And ultimately, the proof is in the aggregate result: the absence of educational gains achieved in the period of intensive investment in digitizing the classroom. How else could all that money have been spent?
Adoption of AI, I would humbly suggest, will rapidly worsen all these problems. It will offer counterproductive shortcuts not only for unengaged students but also for their teachers. We will undoubtedly be hearing soon about the extraordinary opportunities to enhance individualized learning at lower cost by sitting low-income students in front of AI “tutors,” while those developing the AI systems and hawking the demos shield their own children from such reliance.
Engaged and knowledgeable parents will force their children to learn how to think through an argument and compose an essay. Disengaged and time-constrained parents will once again feel powerless, this time as their children hand in AI-generated assignments. Economists like Tyler Cowen will be there to declare the situation unavoidable, celebrate it as more efficient, and cheer the “great news”—which sounds eerily similar to the pitch for globalization. “This will work out great for everyone,” the elite opinion will coalesce, even as it works out only for the elite, and only to the extent they avoid its effects.
If I may, I’d like to make two suggestions: one for technologists and one for schools.
To the artificial intelligence companies especially: Try being the good guys. The Silicon Valley culture of “go after the kids” that suffuses the strategies for profit maximization at the social media companies seems to have spread to the AI companies as well. But why?
At least, in some amoral sense, the social media companies trying to hook kids are behaving rationally. They have a product that people often hate using and wish did not exist, one that relies on a combination of habit formation and network-effect entrenchment. But addiction is not the AI business model. If these companies and their large language models are going to succeed, they will succeed because they actually offer something valuable that people want to use in productive activities.
A social media company rationally fears that if it behaves responsibly, it will lose to the unscrupulous competitor. But what if OpenAI were to do the right thing and prevent its models from being used to do kids’ schoolwork for them? So they’d lose the “kids who like cheating” market? What exactly is that worth? Much less than they’d gain in goodwill from demonstrating that they can control their own technology. Schools with the option could allow access only to models demonstrating such controls. Competitors would follow suit.
Can AI models be controlled this way? So I’ve been told. The cutting-edge safety research at frontier labs is focused on preventing highly sophisticated actors from using the tools in nefarious ways. But telling the models not to help kids with their homework… that’s beyond the capabilities of our Masters of the Tech Universe? Come on. I’d highly recommend they each try to demonstrate that they are the company that can do this, not one that can’t.
And to schools, my suggestion, perhaps more of a plea, is simple: Get the computers out of the classrooms. Are there things those computers could do usefully? Sure. Are there great teachers who use them well? Undoubtedly. But I recently heard preeminent professors from two different fields at two of our most prestigious universities note at the same event that they ban laptops in their own classrooms. If top students at top universities do not need the technology, and cannot resist using it in unproductive ways when it is available to them, I promise your fourth graders will not do better.
Teach kids that learning is not a technological function but a habit of mind; that reading is done by closely studying the words on a page, not answering multiple choice questions to proceed more quickly through gamified levels; that science is about making sense of the natural world through experimentation, not being entertained by YouTube videos.
And when they need to learn how to use digital technology, and how AI might someday, maybe, make them more productive in completing tasks that they have already learned to actually do themselves? Walk them down the hall to the Computer Lab. That’s the room for the small fraction of education that is about computers.
- Oren
Bring back penmanship and the blue book. They are inexpensive tools that truly test knowledge. You can’t fake it or use AI on a short- and/or long-answer blue book test.
I'm not sure you're correct about the lack of incentives for AI companies to hook children on their products. It might not be accurate to call it addiction, but a kid who learns to use AI as a crutch when faced with any mentally demanding task will probably turn into an adult who continues to do so. From what I can tell, the "kids who like cheating" market is actually pretty substantial (at least, it's widely used. Not sure if the kids 'like' using it to cheat).
Among adults I know, the amount of use varies, but there are definitely heavy users out there who seem to use it constantly. Right now it does kind of remind me of the distribution of users for other vices. Some just dabble; others have trouble regulating their use. I'm not sure if that's the right way to look at it, but from that angle it makes perfect sense to start children using AI as soon as possible. You want to increase the number of people who will rely on it for life.