As artificial intelligence continues to transform nearly every dimension of human life—from the classroom and the workplace to the most routine aspects of daily existence—one Oxford scholar is raising concerns that educational institutions may be imparting the wrong kinds of lessons to their students. Professor Rebecca Eynon, who holds a joint appointment at the Oxford Internet Institute and the Department of Education at the University of Oxford, warns that schools must move beyond a narrow focus on teaching pupils merely how to adapt to, or cope with, the technologies of the present. Instead, she argues, they should be guiding students toward actively influencing, designing, and shaping the technological systems that will define the future.
Eynon emphasizes that artificial intelligence should not be viewed merely as an external force to which society must respond, but as a human creation—something inherently dynamic and malleable—that individuals, especially young people, have the capacity and responsibility to shape in line with the educational values and social visions they wish to uphold. As she wrote in a recent reflection, AI “is not just something to react to, but something that people should actively shape in relation to the kinds of education, and indeed society, we want.” For her, this orientation requires a deliberate and forward-thinking approach: educators must embrace a proactive stance rather than remaining reactive or defensive as new technologies permeate classrooms.
Her findings, emerging from Oxford’s *Towards Equity-Focused EdTech* research initiative, illuminate a paradox at the heart of the so-called digital generation. Contrary to prevailing stereotypes that portray young people as inherently technologically adept, Eynon’s research reveals that many students lack even fundamental digital skills. Tasks such as organizing files, composing and sending professional emails, or critically interpreting online information often prove challenging. Teachers, too, frequently express uncertainty about how digital competencies should fit into the broader curriculum—whether as stand-alone courses or as integrated themes across subjects.
From Eynon’s perspective, preparing students for an AI-infused world cannot be reduced to teaching technical proficiency alone. True AI literacy, she contends, must encompass critical reflection, ethical consideration, inclusivity, and social responsibility. It is not enough to teach young people how to operate tools or code algorithms; they must also be encouraged to think deeply about the wider systems—political, cultural, and economic—that underpin these technologies.
**1. Teach Criticality, Not Only Coding**
Eynon proposes that digital education must transcend the mechanical aspects of identifying misinformation or safely deploying AI tools. To equip students for meaningful participation in civic and professional life, schools must help them grasp the complex web of social, political, and economic forces that influence the design and functioning of modern technologies. In her words, young people should not be treated as passive “end users” of prepackaged AI systems but as informed citizens capable of interrogating and reimagining how technology operates within society. This means cultivating the kind of critical awareness that allows students to analyze how algorithmic bias emerges, why certain groups are underrepresented in data sets, and how corporations monetize user information. Such insights empower learners to challenge dominant narratives surrounding technological inevitability and to develop the intellectual tools necessary to question, rather than blindly accept, the technological infrastructures shaping their futures.
**2. Design for Inclusion**
Equally central to Eynon’s argument is the concept of inclusive design within AI education. She emphasizes the importance of hands-on, creative engagement with technology—activities that do not simply teach coding syntax but encourage students to connect digital innovation to social realities. In her view, the act of design becomes a vital method for revealing hidden inequities and for making visible how technological affordances can either reinforce or counteract existing injustices. When students examine issues such as algorithmic discrimination or unequal access to digital tools, they begin to understand that technology is never neutral—it always embodies particular values and priorities. To foster this understanding, Eynon advocates for educational projects where students investigate bias in AI systems or conceptualize digital tools aimed at solving problems relevant to their own communities. Integrating these initiatives across multiple disciplines—not merely within computer science—expands participation and helps a broader range of students see themselves as legitimate contributors to the digital future, capable of imprinting their perspectives on an evolving technological landscape.
**3. Share Responsibility**
Finally, while Eynon underscores the importance of empowering youth to critique and influence AI, she cautions against placing the entire burden of reform on students themselves. Educating critical and socially conscious digital citizens does not absolve society’s established institutions of their collective duties. The responsibility for ensuring that AI development aligns with ethical, environmental, and legal standards cannot rest solely with young learners. Governments must implement thoughtful regulations, educators must design curricula that embed social awareness, and technology companies must uphold transparency and accountability in their products and practices. Eynon’s statement that “there is a societal responsibility that does not just fall on young people to find ways to better govern, regulate, and change AI” underscores a crucial point: progress requires collaboration among individuals, institutions, and industries alike.
Taken together, Eynon’s perspective offers a nuanced reimagining of what AI education should become. Instead of treating technology as an inevitable force to survive, she invites schools to become laboratories for democratic innovation—places where learners do not merely consume knowledge but co-create it; where future citizens are not passive recipients of an AI-driven world but active architects of one that reflects their collective aspirations for fairness, inclusion, and critical thought.
Source: https://www.businessinsider.com/oxford-professor-3-ways-schools-can-teach-students-shape-ai-2025-11