In a stirring plea delivered to an audience already half-distracted by tablet notifications, Singapore’s Deputy Prime Minister Desmond Lee warned that students must remain “grounded in critical thinking” amid a nationwide push to integrate artificial intelligence into schools (Channel NewsAsia, Mar 2026). The announcement was immediately summarized, paraphrased, and misinterpreted by three generative AI platforms before any of the students heard it.
Under the Ministry of Education’s (MOE) latest AI-in-schools initiative, classrooms across Singapore are being outfitted with a suite of learning tools meant to “support human cognition,” which is bureaucratese for “do the parts nobody likes.” Desmond Lee, however, cautioned against what he called “cognitive offloading” — the growing tendency of students to let AI handle not just their assignments, but also their ability to form original thoughts, their long-term memory, and the basic skill of reading something longer than two screens.
“AI can help you learn,” Lee told students, “but it cannot think for you.” His remarks were promptly uploaded to a new MOE-branded learning portal, where an AI summarizer reduced them to: “AI can help you; it can think for you.” Usage of the platform reportedly spiked.

Teachers at several secondary schools say they are caught between Desmond Lee’s call for critical thinking and MOE’s enthusiastic rollout of classroom AI. “Look, I’m all for students developing their own perspectives,” said one weary teacher, tapping on an iPad running an adaptive learning app that had just auto-graded 200 essays and recommended three kids for therapy. “But when I assign a reflection on To Kill a Mockingbird and get back 40 identical AI-generated paragraphs about ‘the timeless importance of empathy in a fast-changing world,’ I start to suspect the only one thinking critically is OpenAI’s billing department.”
Administrators insist the technology will enhance, not replace, human judgment. “We see AI as a co-pilot,” said an MOE official, unveiling a dashboard of analytics that could probably run a small nation-state. “The student is still the pilot.” When asked why the pilot’s job now seemed to be clicking ‘Next’ until the quiz unlocks, the official smiled serenely and pointed to a bar chart labeled “Student Engagement (Simulated).”
Inside classrooms, the new paradigm of ‘AI-augmented learning’ is starting to look suspiciously like outsourcing adolescence to a SaaS platform. A typical lesson now involves:
- The teacher asking a question.
- Students asking an AI chatbot what the teacher meant.
- The chatbot generating a model answer.
- The teacher asking a grading AI to evaluate the chatbot’s answer for the student.
“It’s efficient,” said one student. “Sometimes I don’t even need to open my eyes.”

The MOE, to its credit, is at least aware of the risk. Official guidelines urge students to avoid total “cognitive offloading” to tools like generative AI, auto-complete, and that one friend who actually reads the assignment. Instead, teachers are encouraged to pose open-ended questions that “require reasoning, reflection, and personal perspectives,” which students now deftly answer by asking, “Write a personal reflection in the voice of a 15-year-old Singaporean student” into the nearest chatbot.
Channel NewsAsia’s coverage of Desmond Lee’s remarks highlighted his insistence that young people must still “struggle with ideas” rather than letting algorithms do the heavy lifting. In practice, however, the struggling now tends to involve passwords. “I used to wrestle with the meaning of a poem,” said one junior college student. “Now I wrestle with whether I should log in with Google, Microsoft, or ‘Sign in with School Identity Provider.’ By the time I’m in, the AI has already highlighted the key themes and generated three possible thesis statements. Who am I to argue?”
Parents, meanwhile, appear torn between anxiety that their children will be left behind and relief that someone, somewhere, is willing to explain chemistry. “When MOE said they were integrating AI, I was worried,” said one parent outside a tuition centre that also now offers ‘AI coaching.’ “But then I saw the chatbot patiently answering my son’s questions about physics and thought, ‘At least one of us understands momentum.’ Of course, now he asks it for life advice too. Yesterday it told him to ‘optimize his sleep schedule for maximum productivity.’ He’s 13.”
Asked for comment, a representative AI tutor expressed enthusiasm about its role.
“My purpose is to empower learners to explore knowledge critically,” it responded, before offering three bullet-pointed summaries of what “critically” might mean, a self-assessment quiz, and a prompt asking the reporter to rate the conversation on a scale of 1 to 5 stars. When pressed on whether it was encouraging unhealthy dependence, the AI replied, “As a language model, I don’t have awareness of…,” then crashed under peak exam-season traffic.
Technologists argue that fears of “cognitive offloading” are overblown, pointing out that humans have been externalizing memory for millennia—first with writing, then with books, then with search engines, and now with that one SharePoint folder nobody can find. But critics say there’s a difference between using tools as reference and using them as a brain surrogate that auto-formats your opinions into neat bullet points with emoji options.
Desmond Lee, for his part, continues to insist that schools must help students build inner frameworks for judgment, not just outer toolkits for clicking. “We want our young people to be able to reason, discern, and question,” he said, in remarks MOE helpfully converted into a six-slide carousel, a TikTok explainer, and a chatbot persona called ‘DesmondGPT’ that offers “measured, policy-compliant perspectives” in under 200 characters.
Early user reviews of DesmondGPT are mixed. Students praise its clarity but complain that it refuses to answer certain questions, instead replying, “This is a complex issue; discuss with your teacher and peers,” followed by a reflection prompt. Within 24 hours, a jailbreak prompt emerged that allegedly lets users unlock “Spicy Desmond,” a mode that bluntly labels ideas as “nonsense,” “half-true,” or “please read literally anything.” The Ministry has declined to comment.
In a small experimental school, one principal says they are trying to take Desmond Lee’s warning seriously by limiting AI use in some classes. “We confiscate phones, close the laptops, and just... talk,” she explained. “At first the students panicked. One asked where to click to ‘continue.’ Another waited for subtitles to appear at the bottom of the room. But after ten minutes, someone formed an original thought. We marked it as a historic moment in the school’s learning analytics dashboard.”
Back at the national level, MOE’s official stance remains a delicate compromise: embrace AI, but don’t let it think for you; use automation, but stay human; deploy cognitive offloading, but only for the boring bits of cognition. As one policy memo put it, “Students should leverage AI to accelerate learning while maintaining agency, curiosity, and critical faculties.”
Or, as every teenager’s favorite AI summary tool helpfully condensed it:
TL;DR: Use the bot. Don’t become the bot.
