NEPC Issues 'Red Flag Warning' on Personalized Learning

A new report from the National Education Policy Center has suggested that the concept of personalized learning has been productized by technology companies in ways that "can put important educational decisions in private hands and compromise the privacy of children and their teachers."

"Personalized Learning and the Digital Privatization of Curriculum and Teaching," developed by three researchers at the University of Colorado Boulder, reported that these programs are "proliferating in schools" across the country. This spread is being sparked by ample philanthropic funding, lobbying by the tech industry, marketing by vendors that want to get into the K-12 market and an education policy environment "that provides little guidance and few constraints."

The analysis has uncovered "questionable" assumptions about the efficacy of personalized learning in the most influential programs, alongside a lack of research support, "self-interested advocacy" from the tech industry and "serious threats" to data privacy.

Despite the "red flags," the authors noted, the pressure to adopt personalized learning programs continues to grow. For example, states are promoting implementation of digital curriculum even though there's little "oversight or accountability" for the use of that software.

A central problem with personalized learning, according to the report, is that the proprietary programs districts subscribe to place major education decisions, such as whether a student has achieved a specific competency, in "private hands." Not only does that put the privacy of student and educator data at risk, but it also emphasizes "data collection and analysis over other instructional considerations," the authors pointed out.

The researchers advocated pressing the pause button on personalized learning programs "until rigorous review, oversight, and enforcement mechanisms are established." They also pushed states to set up independent entities that can oversee the implementation of these programs in myriad ways, including:

  • Requiring external review and approval of curriculum, algorithms and pedagogical approaches used in personalized learning programs by independent education experts; and

  • Subjecting any organization that collects student, teacher and other data through personalized learning materials and programs "to a standard, legally binding, transparent privacy and data security agreement."

The report, "Personalized Learning and the Digital Privatization of Curriculum and Teaching," is openly available on the NEPC website.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
