How Teachers and Administrators Can Contribute to AI Transparency
To help students understand and use AI tools, teachers need professional development that supports them in redesigning tried-and-true assignments with an eye to teaching critical thinking.
- By Douglas Fisher, Nancy Frey
- 10/08/25
Whether personalizing playlists or serving as 24/7 writing assistants, artificial intelligence tools have become woven into many facets of students' everyday lives. Amid this rapid change, educators face a critical responsibility. It's no longer sufficient to simply allow or restrict AI in schools. Students must understand how these tools work, what they are doing, and why that matters. Teaching AI transparency means equipping students to think critically about the systems shaping their digital world and to use those systems responsibly and ethically.
Making the Invisible Visible
AI is everywhere, yet often invisible. Students interact with algorithms daily through search engines, recommendation systems, and learning platforms without understanding how data shapes these outputs. Many younger learners even perceive AI as alive or capable of friendship. That's why we emphasize "teaching under the hood": giving students enough understanding to see what AI is doing behind the scenes without necessarily expecting them to become engineers.
You don't need to know how a fuel injector works to drive a car, but you do need to know how the turn signal operates. Just using AI is different from understanding how it works. At our school, we teach students about AI, we teach them skills for AI, and then we teach with AI.
The three-pillar framework outlined in Teaching Students to Use AI Ethically & Responsibly, a book we co-authored with Sal Khan, guides educators in this work. Elementary students learn to recognize when they are interacting with AI and grasp basic concepts, such as the idea that computers make guesses based on patterns. Middle schoolers examine how training data affects AI decisions and practice identifying potential bias in AI-generated content. High schoolers delve deeper into algorithmic decision-making, privacy implications, and ethical use in real-world contexts. With AI tools already implemented in hundreds of school districts, these principles are not just theories. They are practical classroom realities.
Designing Tasks that Build Critical Thinking
AI transparency also calls for rethinking classroom tasks. Traditional assignments are increasingly vulnerable to AI misuse and can result in missed learning opportunities. Instead of trying to make classrooms AI-proof, it's more effective to design activities that engage students in critical thinking while requiring them to show their reasoning.
One approach we recommend is the "reverse quest," in which students start with an incorrect claim and use AI as a research partner to work backward and uncover how someone might have arrived at that conclusion. This process cultivates curiosity, strengthens data literacy, and gives students practice in verifying information and making ethical choices.
What students are doing is exactly what we always ask them to do: Show us your reasoning. They're not just copying information — they're analyzing, questioning, and understanding the path to an answer.
This type of task also helps students navigate AI responsibly. AI cannot replace the critical thinking process; instead, it can augment students' reasoning skills when integrated thoughtfully. In fact, AI tools that provide transcripts of student interactions allow teachers to track thinking processes in real time, giving them insight into how students engage with learning and AI simultaneously.
Supporting Teachers to Lead with Confidence
For AI transparency education to succeed, teachers need support. One-off workshops won't suffice. Educators need time to experiment with AI in their own practice, seeing firsthand how it can enhance lesson design, generate success criteria, and model verification strategies.
If teachers find value in it, then they're more likely to say, "If it's helping me, could it also help students?"
Ongoing professional learning and collaborative problem-solving are critical. Lessons from past technology rollouts, such as 1:1 laptop initiatives, show that distributing devices alone is insufficient. Without pedagogical guidance, even well-intentioned technology investments fall short. AI offers an opportunity to do it differently: to integrate tools thoughtfully, give teachers a sandbox to explore, and foster communities where they can share insights, challenges, and successes.
Teachers also need to understand how their tried-and-true assignments might have to change. Traditional assignments are often vulnerable to AI misuse, but thoughtfully designed tasks can develop the curiosity, verification, and data literacy skills that students need.
Looking Ahead
AI is already shaping how students access, analyze, and create information. Its influence will only grow. Schools cannot shield students from it; the task is to equip them with the skills to navigate AI ethically and responsibly. That means embedding transparency, ethical reasoning, and data literacy into curricula across grade levels and giving teachers the time and tools to lead confidently.
K-12 education missed the opportunity to teach responsible use of social media. We can't afford to miss it with AI.
As AI integration deepens, students will need the ability to question outputs, detect bias, and make informed decisions about the tools they rely on. And just as importantly, teachers must be equipped to guide them. By making AI visible, designing tasks that build critical thinking, and supporting educators every step of the way, schools can prepare students not only to use AI but to engage with it wisely, ethically, and responsibly.
About the Authors
Douglas Fisher is a professor and chair of educational leadership at San Diego State University and a teacher leader at Health Sciences High and Middle College. Previously, Doug was an early intervention teacher and elementary school educator. He is a credentialed English teacher and administrator in California. In 2022, he was inducted into the Reading Hall of Fame by the Literacy Research Association. He has published numerous articles on reading and literacy, differentiated instruction, and curriculum design, as well as books such as The Teacher Clarity Playbook, PLC+, Artificial Intelligence Playbook, How Scaffolding Works, Teaching Reading, and Teaching Students to Drive Their Learning. He can be reached at [email protected].
Nancy Frey is a professor of educational leadership at San Diego State University and a teacher leader at Health Sciences High and Middle College. She is a credentialed special educator, reading specialist, and administrator in California. She is a member of the International Literacy Association's Literacy Research Panel. Her published titles include How Teams Work, Kids Come in All Languages, The Social-Emotional Learning Playbook, and How Feedback Works. She can be reached at [email protected].