Privacy Concerns Dominate 2017 Student Data Legislation

During the 2017 legislative session, data privacy showed up in 93 of 183 state bills touching on education data. This is hardly a new trend. Since 2013, lawmakers in 49 states have introduced 503 bills, and 41 states have passed 94 laws regarding education data privacy and security, according to the latest accounting by the Data Quality Campaign, a nonprofit that monitors how information is used by schools, districts and states to improve education.

Each year, DQC issues a report on state legislation regarding education data and student privacy. On the heels of this year's report, the organization also published a "roadmap" to help state education agencies work with researchers to improve education outcomes.

DQC measures the impact of legislation based on a four-policy priority list — 1) measure what matters, 2) make data use possible, 3) be transparent & earn trust and 4) guarantee access & protect privacy — intended to evaluate whether the new laws support "effective data use and protection" or inhibit it. (The totals below don't add up to the numbers above because some bills and laws address more than one DQC policy.)

In measuring what matters, 36 states introduced 95 bills and passed 31 laws connected to the collection, linking and governance of education data. Among the measures focused on collecting the right information, Maryland passed a law revising how data "about seclusion and restraint practices" is reported to the Department of Education; reports must now include details about gender, race and age. On governance, Maine passed a law creating an "Educational and Attainment Research Navigation System" to keep and report data on education and workforce outcomes, overseen by an executive council.

On making data use possible, 26 states introduced 56 bills and passed 21 new laws. For example, Utah passed a law setting up an early warning system that districts can use to help identify high schoolers at risk of dropping out, so that educators can intervene with appropriate support.

Lawmakers offered 59 bills and passed 20 new laws related to being transparent and earning trust. In this category, Connecticut passed a law intended to help prospective students make better career choices by creating a new public report showing how students fare after graduating from different colleges and programs.

For guaranteeing access and protecting privacy, 38 states introduced 107 bills and passed 30 new laws. In one of those new laws, Virginia now requires education service providers to give parents access to an electronic copy of their student's personal information.

Many states have also followed California's example by proposing or instituting legislation governing how third-party education tech companies can use the student data they collect. California's Student Online Personal Information Protection Act (SOPIPA) was passed in 2014; in 2017, 16 of the 32 bills addressing third-party data use were based on that legislation, and five of them passed.

"States have used legislation to make significant progress toward the goal of making data work for students," the DQC report noted. "Now they must build on this work to ensure that every parent and teacher has access to the data they need to help students, educators have the training they need to use data effectively in their classrooms, and public data gives stakeholders the information and context they need to understand their schools and inform their decisions." The work isn't over, the organization asserted: "In 2018, state legislators can consider additional ways to make data more useful, actionable and secure."

The annual summary of new legislation is openly available on the DQC website. A state-by-state legislative summary is also available.

DQC's new roadmap, released shortly after its legislative report card, offers guidance for state education agencies to help them understand how to work with researchers. Setting up an "effective" partnership has eight components, the report suggested:

  • Creating a shared vision of how the research will be set up, conducted and used;
  • Putting data governance in place to formalize researchers' access to the data;
  • Establishing how data sharing will be done and data privacy ensured;
  • Allowing researchers to obtain "quality" data in a timely manner;
  • Communicating with stakeholders and the "public" about the research work;
  • Protecting participant privacy without "unnecessarily limiting research";
  • Having reporting processes in place that will make the research findings usable; and
  • Addressing capacity building to ensure that the research can be applied to continually improve student success.

As the authors explained, "Without high-quality and trustworthy education research, communities will be making decisions in the dark, and more students will be left behind." The guide will, according to the DQC, help states, policymakers and others better understand "how to partner with researchers to address important education questions through research."

The report is available on the DQC website.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
