Generative AI in Schools: A Closer Look and Future Predictions

Since ChatGPT's introduction last autumn, educators have been thinking through the potential impact of generative AI on education. While its full transformative potential remains uncertain, AI shows promise for teachers grappling with resource constraints and demanding workloads. Here's a closer look at the current AI landscape in schools, along with a prediction of what the future holds.

An Eruption of Single-Point Solutions

We've just entered the Wild West. While long-awaited applications, such as automated personalized tutoring, are entering the market and appear more promising than ever, educators and technologists must diligently address the intricacies of generative AI. This includes bias mitigation, adherence to pedagogical standards, fact-checking, and curbing information echo chambers that hinder diverse learning experiences.

In the meantime, teachers, students, and entrepreneurs are experimenting with dozens of different use cases where the convergence of traditional natural language processing (NLP), computer vision, and generative AI is already making an impact:

  • Teacher Practice Support: Tools like MagicSchool aid lesson planning, TeachFX offers instructional coaching, and SchoolJoy generates personalized feedback.

  • Classroom Materials: AI assists in crafting activity-specific handouts (FlintK12), customizing reading materials, and formulating student questions.

  • Evaluation and Feedback: AI tools deter cheating by generating unique summative assessments for each test-taker (Examind) and streamline essay grading.

  • Student Support: Students can now take notes with AI assistance (Notion AI) and receive precise assessments to address literacy needs (Amira Learning).

These are single-point solutions, where the technology addresses only one specific problem. In the coming years, the market will see a large number of these solutions across the preparation-instruction-assessment lifecycle, fostering a landscape of micro-entrepreneurship: teams of three to 10 building multimillion-dollar businesses without venture backing. While these tools enhance teachers' efficiency, they still require proper setup and attention. But the upside is undeniable: by freeing up lesson prep time, teachers gain more time for activities previously considered a bonus, such as differentiation and incorporating multimodal resources.

The Rise of AI-Native Learning Experiences

The next phase is AI-native experiences, where a student interacts with and learns directly from AI with no teacher intervention. Ministers, prominent investors, and tech optimists have suggested generative AI holds the key to solving the global teacher shortage, or better yet, Bloom's 2-Sigma Problem.

The current reality falls short. Teaching is a highly skilled profession with little margin for error, and generative AI presently falls short in terms of accuracy. Google's Minerva, a state-of-the-art STEM reasoning model, achieves a 72% accuracy rate on high school algebra questions, while ChatGPT struggles with 4th-grade multiplication. From GPT-3.5 to GPT-4, OpenAI's models made no gains on the AMC 10, a standardized math competition.

I offer a prediction: Generative AI applications might never be ready to teach students without human supervision of their outputs. Why? Because of the statistical nature of how large language models (LLMs) are built. They're designed to predict the next most likely word through a black-box process, not to index established truths or perform robust reasoning.
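To make that "next word" claim concrete, here is a minimal sketch of next-token prediction using the small open-source GPT-2 model via Hugging Face's transformers library (chosen purely for illustration; it is not the model behind ChatGPT, but the mechanism is the same). The model produces a probability distribution over candidate next tokens; nothing in the process checks those candidates against an established truth.

    # Minimal illustration of next-token prediction (requires torch and transformers).
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    prompt = "Seven times eight equals"
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids

    with torch.no_grad():
        logits = model(input_ids).logits[0, -1]   # scores for every possible next token
    probs = torch.softmax(logits, dim=-1)         # convert scores into probabilities

    # The model ranks candidate continuations by likelihood; it has no notion
    # of whether any of them is arithmetically correct.
    top = torch.topk(probs, k=5)
    for p, tok_id in zip(top.values, top.indices):
        print(f"{tokenizer.decode(int(tok_id))!r}  p={p.item():.3f}")

Whatever tokens rank highest are simply the most statistically likely continuations of the prompt, which is why unsupervised instruction remains risky.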

So far, high-profile ventures in the instruction realm, such as Kyron Learning, have fused teacher-produced, recorded content with LLM-powered conversational UX. The micro-learning tool Nolej references internet material when generating tasks and tests but keeps the language model close to the ground truth provided by teachers. Both are intriguing takes on re-imagining how to deliver core instruction while avoiding hallucinations (generated content that is nonsensical or factually wrong).
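As a rough illustration of that grounding idea (a hypothetical sketch, not Kyron's or Nolej's actual implementation), a tool can wrap teacher-approved source material in the prompt and instruct the model to decline anything the source does not cover:

    # Hypothetical sketch of grounding generation in teacher-provided material.
    # The names and text here are invented for illustration only.
    TEACHER_SOURCE = (
        "Photosynthesis converts light energy into chemical energy. "
        "It takes place in the chloroplasts of plant cells."
    )

    def build_grounded_prompt(source: str, task: str) -> str:
        # Pin the model to the supplied text and tell it to decline
        # rather than invent material that is not in the source.
        return (
            "Use ONLY the source material below. If the source does not cover "
            "something the task requires, say so instead of guessing.\n\n"
            f"SOURCE:\n{source}\n\nTASK: {task}"
        )

    prompt = build_grounded_prompt(TEACHER_SOURCE, "Write three quiz questions with an answer key.")
    # `prompt` would then be sent to whatever LLM the product uses.

Constraining the model this way does not eliminate hallucinations, but it narrows the space in which they can occur.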

Focusing on Teacher Enablement

While teacher replacement remains elusive, teacher enablement is not. LLMs can facilitate basic teaching activities, such as role-play games and round-robin discussions (OKO Labs). These AI-generated activities must align with teaching objectives, necessitating rigorous training with subject-specific knowledge.

Effective tools in this space share key traits:

  1. Take most of the guesswork out of prompting. Flint, for example, automatically suggests new prompts for teachers to explore after the first pass of a lesson plan is generated (e.g., it asks if you want to replace the worksheet with a quick quiz or change the group activity to a game where students compete).

  2. Reduce errors by engineering prompts to reference established educational standards such as the Common Core State Standards (CCSS) and Next Generation Science Standards (NGSS); see the sketch after this list. MagicSchool's toolkits take this approach, and the generated handouts and assessments need fewer corrections and manual updates.

  3. Have a strong underlying structure. Despite the simple UI, each tool asks just the right number of scaffolding questions to get teachers started and can generate highly usable results in student-friendly formats (flashcards, tables, differentiated printouts, etc.), as opposed to an unstructured wall of text from ChatGPT.
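As a concrete, hypothetical sketch of the standards-referencing approach from item 2, a tool might bake a standard's identifier and wording into the prompt template before anything reaches the model. The standard ID below is real (CCSS.MATH.CONTENT.3.OA.A.1), but its text is paraphrased and the function is invented for illustration; this is not MagicSchool's actual implementation.

    # Hypothetical standards-anchored prompt template (not any vendor's real code).
    CCSS_LIBRARY = {
        # Paraphrased for illustration; a real tool would load the official wording.
        "CCSS.MATH.CONTENT.3.OA.A.1": "Interpret products of whole numbers as objects in equal groups.",
    }

    def build_handout_prompt(standard_id: str, grade: str, topic: str) -> str:
        standard_text = CCSS_LIBRARY[standard_id]
        return (
            f"You are generating a {grade} handout on {topic}.\n"
            f"Align every item to {standard_id}: \"{standard_text}\"\n"
            "Use vocabulary appropriate for this grade level.\n"
            "Output five practice problems, each labeled with the standard it targets."
        )

    print(build_handout_prompt(
        "CCSS.MATH.CONTENT.3.OA.A.1", "3rd-grade", "multiplication as equal groups"
    ))

Anchoring the prompt to a named standard gives the model less room to drift, which is one reason these tools' outputs need fewer manual corrections.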

Although we are still in the early stages of development, these pioneering tools are illuminating the vast potential for AI to support educators through user-friendly interfaces and thoughtful prompting. Weeks into their official launch, these tools have garnered significant user adoption, highlighting the value of "getting teachers started with AI."

Facing Education's Trinity Test

Looking further ahead, we confront a pivotal moment: In our lifetime, we will likely reach Artificial General Intelligence (AGI), the pursuit of which is OpenAI's founding goal and many technologists' worst nightmare. Vanderbilt University's recent decision to disable Turnitin's AI detector, largely because it produces unavoidable false positives, already offers a glimpse of a future where human thought and AI input will be inseparable.

If superintelligent machines come to be, the foundational rules of modern schooling will be rewritten. For educators, it means adapting tried-and-true pedagogical practices for learners in a new era. For edtech, it means the LMS + tools ecosystem, which the industry spent the last 25 years actively building, would become obsolete.

So, what's the path forward? Both educators and technologists have work to do, so I offer a challenge to each. Educators: Let's commit to leveraging AI tools in classrooms and begin the process of redefining learning toward unprecedented equity and creativity. Edtech providers: Let's commit to welcoming more teacherpreneurs into the conversation about disruption and progress. And for everyone: let's keep in mind that our goal isn't simply to coexist with AI, but to use it as a mirror for reflecting on our uniquely human, self-determining, and creative selves. The learning world is changing, and there's no turning back. Let's change it together.
