For Ed Tech Leaders, Student Data Privacy Is Critical — But Missing from the Job Description

A new report from CoSN examines gaps and opportunities in student data privacy at schools and districts across the country. The 2025 National Student Data Privacy Report surveyed 400 ed tech leaders from 39 states and the District of Columbia about their districts' privacy practices, the tools, resources, and supports available to them, and barriers to improvement.

While nearly 90% of survey respondents said they oversee their district's student data privacy program, 75% of those ed tech leaders said it's not part of their job description. Seventeen percent of those with data privacy responsibilities said they have not received any training on student data privacy, and 25% paid out of pocket for their training. Still, 88% of respondents ranked data privacy as one of their top two priorities, and 46% said it was their number one priority.

"Despite a lack of employer-provided training or codification of the work of building and improving a student data privacy program as part of a formal job description, respondents clearly prioritized the work," the report noted.

Ed tech leaders' top student data privacy concerns were:

  • The inability to manage employee privacy practices (cited by 75% of respondents as "very" or "extremely" concerning);
  • The inability to contain the influx of free and low-cost classroom technologies brought in by teachers (69%);
  • The inability to enforce internal, employee-facing privacy policies (55%);
  • The inability to mandate employee privacy training (49%); and
  • The lack of sufficient district privacy policies to guide practices (41%).

Time and manpower were the biggest roadblocks to data privacy improvement for ed tech leaders, each cited by 60% of respondents. Other barriers included the need for guidance on implementing federal (47%) and state (46%) student data privacy laws, as well as a lack of privacy expertise (38%). Thirty-six percent of respondents cited financial resources as a barrier.

"Setting the stage to establish the importance of student data privacy work within a district does not require the purchase of an expensive tool or technology," the report asserted. "Instead, the first requirement is that, collectively, attention be paid in a way that emphasizes the importance of the work across the institution, and that supports ed tech leadership in effectuating the necessary organizational change." The report's final recommendations for district leadership:

  • Recognize the work of building and improving a student data privacy program as a leadership imperative.
  • Emphasize the importance of the work with all employees, and update job descriptions to reflect that importance for those responsible for building and implementing student data privacy programs.
  • Ensure that the student data privacy program is adequately resourced — including with appropriate policies — and provide the training necessary for ed tech leadership to succeed in the work.
  • Support ed tech leadership in breaking down institutional silos to more effectively implement student data privacy requirements across teams, including by mandating privacy and security training for all employees.
  • Ensure that all district employees adhere to district policies with consistent enforcement of those policies.

The full report is available on the CoSN site.

About the Author

Rhea Kelly is editor in chief for Campus Technology, THE Journal, and Spaces4Learning. She can be reached at [email protected].
