Education Minister Claude Meisch and neuroscientist Henning Beck told a conference on AI in education that schools should neither prohibit nor uncritically accept artificial intelligence.
Rather than banning AI or promoting it without questioning its impact, schools must serve as guides, Meisch said. Their responsibility is to help students navigate AI tools so they learn to apply them with intention, care, and awareness.
Neuroscientist and biochemist Henning Beck, the conference’s featured speaker, cautioned that people who avoid AI will face serious consequences. He suggested that conversations around the technology focus too heavily on potential harms while neglecting the risks of avoidance.
Refusing to engage with AI could mean missing vital innovations, overlooking new economic possibilities, and losing out on opportunities that might shape the future, Beck said. Students need to develop the ability to recognise when AI serves them well and when human thinking is irreplaceable—a distinction he considers critical for tomorrow’s world.
Like Learning to Ride a Bicycle
Beck repeatedly stressed the need for a questioning mindset when using AI. People must learn to challenge AI-generated content, check for inaccuracies or distortions, and spot attempts at manipulation.
Such skills can only emerge through hands-on experience—testing the technology, exploring its boundaries, and scrutinising its outputs. He drew a parallel to learning to cycle: “You don’t master a bike by following instructions step by step. You learn by getting on and sometimes falling off.” Schools should create safe spaces where students can explore, fail, and gain resilience through trial and error.
Bringing AI into classrooms marks a profound change for educators and parents alike, many of whom attended school decades ago in a completely different environment. Hurriedly copying assignments on the morning commute may be obsolete now that AI can generate answers instantly.
Yet Beck warned that using AI purely for speed without deeper engagement carries consequences. Students who let the technology draft essays or interpret literature without their own involvement risk learning almost nothing. Worse, they may become reliant on AI, more susceptible to its influence, and ultimately less autonomous.
Genuine autonomy, Beck argued, comes from understanding how to use one’s own mind, enhanced—not replaced—by AI. The technology cannot substitute for setting objectives, cultivating curiosity, challenging established norms, and creating new frameworks.
AI Compass
The way AI is applied determines whether it empowers or diminishes learning. Luxembourg’s approach includes AI Compass, a digital platform providing resources and advice for students, teachers, and administrators on responsible AI integration in schools.
One challenge remains unsettled, Meisch noted: how to rethink student evaluation and testing in this transformed landscape. Assessment practices may need reworking to account for AI’s growing presence in education.