Bipartisan 'Protecting Student Privacy Act' Draft Legislation Released

New draft legislation out of the United States Senate would force school districts to release the names of every company that has access to their students' private data and would prohibit companies from using any personally identifiable information for advertising or marketing purposes.

The legislation, called the "Protecting Student Privacy Act," was released jointly today as a discussion draft by Senator Edward J. Markey (D-MA) and Senator Orrin Hatch (R-UT). Markey has been a proponent of student privacy rights for some time and in January began calling for a strengthening of FERPA to address the growing phenomenon of student data residing on the servers of private businesses.

Markey's comments back then raised the hackles of some in the education technology industry. However, the latest discussion draft is being received more warmly — if not with enthusiasm, at least with cautious neutrality. The Software and Information Industry Association (SIIA) today responded to the draft legislation, characterizing the proposals as "reasonable."

"This draft bill is consistent with the reasonable improvements in national student privacy that Senator Markey has been talking about for months," according to Mark MacCarthy, vice president of public policy for SIIA. "We will be reviewing it carefully and look forward to working with him and Senator Hatch in the coming months to make sure that any new national student privacy legislation provides for both full privacy protection and the ability to use student data to improve learning."

The legislation is designed to "ensure that students are better protected when data is shared with and held by third parties, and parents are able to control the sensitive information of their children," according to information released by Sen. Markey's office. Specifically, it proposes:

  • Prohibiting the use of personally identifiable student data for advertising or marketing purposes;
  • Requiring certain safeguards be in place to protect the integrity of data in the hands of private companies;
  • Giving parents the explicit right to view their children's data and make corrections to erroneous information;
  • Making available the names of every company that has access to a district's student data;
  • Limiting the personally identifiable information that can be transferred between companies and schools; and
  • Ensuring that "private companies cannot maintain dossiers on students in perpetuity by requiring the companies to later delete personally identifiable information."

The complete text of the discussion draft can be found at markey.senate.gov.

About the Author

David Nagel is the former editorial director of 1105 Media's Education Group and editor-in-chief of THE Journal, STEAM Universe, and Spaces4Learning. A 30-year publishing veteran, Nagel has led or contributed to dozens of technology, art, marketing, media, and business publications.

He can be reached at [email protected]. You can also connect with him on LinkedIn at https://www.linkedin.com/in/davidrnagel/.

