The Problems with FERPA and COPPA in 21st Century Learning

Forty-three-year-old FERPA, the Family Educational Rights and Privacy Act, was enacted in 1974 and is administered by the U.S. Department of Education. It gives parents access to the records that schools and districts maintain about their children and requires parents' written permission before personally identifiable information can be disclosed to others. Back then, the data was kept in folders inside filing cabinets.

COPPA, the Children's Online Privacy Protection Act, was enacted in 1998; the Federal Trade Commission issued its implementing rule the following year and enforces it. The intent was to allow parents to decide when and how personal information about their children was collected, used and disclosed online by commercial operators. If a user was under the age of 13, the website or online service needed to get parental consent before collecting that personal information. In 2013 the COPPA Rule was updated to encompass newer technologies that can collect data from young people, such as social media, mobile apps, gaming platforms and connected toys, and to specify that operators may obtain consent from schools instead of parents to collect personal information from students, as long as the collection is for the benefit of the school and not for commercial purposes.

Both of these pieces of legislation address children and their personal information. But it has taken until this year for the FTC and the Department of Education to get together in a public forum to discuss how these regulations intersect and whether they need to be updated. This week the two agencies convened a "Student Privacy and Ed Tech" workshop that brought together agency experts; district IT, legal and instructional technology people; parent advocates; and company representatives.

The big point of intersection between the two statutes occurs in schools, where education technology is used by teachers in the classroom. Often, usage takes place without parents knowing anything about what programs are in use, what data is being collected or what happens to that data once it's generated, said Thomas Pahl, acting director of the FTC's Bureau of Consumer Protection, in his opening remarks. "In particular, parents and advocates raise concerns about secondary uses of information collected from their kids in the school environment," he explained. "For example, we've heard from parents worried about ed tech providers using the information they collect in the school context to market non-educational products to students."

The workshop covered ground on several issues:

  • How well those who must adhere to the rules understand the requirements of FERPA and COPPA when ed tech companies are collecting personal information about students;
  • How the information collected by ed tech vendors can legally be used;
  • When it's appropriate for schools to provide "parental" consent under COPPA and whether they need to notify parents as part of that process;
  • What rights parents have to access and delete their children's information and how they can exercise those rights without interfering with the schools' roles in educating students.

For the last couple of months the agencies have been collecting comments from the public regarding these and related topics they hoped to cover in the workshop. A common theme among many of the submissions was a request for greater clarity on the requirements.

"Well intentioned ed tech vendors and educators still have difficulty understanding how best to comply with COPPA in the educational context and FERPA in the digital context," wrote Common Sense Media in its submission. For example, "Directory information is opt-out under FERPA, but much of that information is protected as opt-in under COPPA." A major point of concern, the non-profit stated, is figuring out what's an educational purpose for data and what's a commercial purpose. (One case that surfaced several times during the workshop was the use of student data to improve a vendor's program; would that be an educational purpose or a commercial purpose?)

Doug Levin, president of EdTech Strategies, suggested that the two agencies review their rules from the perspective of school student data security. Levin, who maintains the "K-12 Cyber Incident Map," noted that both sets of regulations "presume that schools have the resources and knowledge to assess their own data security practices, to say nothing of their vendors," yet the evidence says otherwise. First, the map itself has documented 279 incidents over the last two years, he pointed out; second, poor findings in two separate state audits of district cybersecurity practices, one in Missouri and the other in Wyoming, offer "more evidence of a systematic lack of capacity."

Levin, who attended the workshop as an audience member, was "disappointed that the panelists and speakers were not particularly diverse. I fear that the experiences of poor, minority and rural students and families may be significantly different than what was expressed over the course of the day."

One example of that uniformity in viewpoints was evident in the two individuals chosen to represent the parent perspective. Both represent organizations focused on slowing down the use of data by schools: Rachael Stickland is the co-chair of the Parent Coalition for Student Privacy, and David Monahan is the campaign manager for the Campaign for a Commercial-Free Childhood.

Stickland noted that parents who went to school in the 1970s and 1980s "think that school records are still held in a file cabinet in the principal's office," or if it's online, "it's in the servers the school district maintains, and they have control over it, and that there's not any third-party sharing." In either case, she stated, they "don't understand the implications; but when they do — when they are sort of illuminated — then they have very serious privacy concerns and feel like the FERPA and COPPA laws should be protecting against a lot of things they actually don't."

Among the recommendations in their formal comments, the two organizations suggested that under the terms of COPPA, "schools should be required to notify parents, and operators should be required to obtain their consent for use of online services." That notice "should include the names, websites, and privacy policy/terms of service of every applicable operator that the district has determined to be COPPA-compliant, what specific data is being collected from students, and how it is being used, protected and secured." Likewise, they added, "operators should be required to obtain parental consent whether their services are used inside or outside of the school."

Michael Hawes, director of student privacy policy at the Department of Education, pointed out that any teacher could talk about the difficulties of obtaining parental consent "for even worthwhile activities." "Consent is a great option," he said. "It is the most transparent. It is the most privacy protective. But it can be problematic when you're talking about required school services."

Likewise, data has its value, emphasized Amelia Vance, education policy counsel for the Future of Privacy Forum. An analysis of data, she said, was what led to the findings showing that certain minorities experienced "disproportionate suspension rates," or weren't evenly "referred to advanced placement classes."

There's been a "massive legal shift," Vance said, "and it has been very hard for schools, states and technology providers to keep up and understand what their obligations are."

Chris Paschke, the executive director of data security at Jeffco Public Schools in Colorado, said that during his district's involvement as a partner school with the beleaguered inBloom initiative, it figured out that it was "behind the 8-ball" in terms of its information security policies. Those policies were written for data stored in the district's own data center rather than in the cloud, where it now typically resides. As a result, he added, the data is "more decentralized and just managing those vendors is difficult. Something as simple as a record request has become very, very complicated. Both policies and our staff haven't kept up with it."

In the intervening years Paschke's district has "elevated roles in the organization" to help enforce data-related rules and "retooled a bunch of processes" related to data management, purchasing, security and communicating with district families. It has also begun work on defining student "data lifecycles" that encompass a stage in which the data is purged.

The outcomes from the workshop won't be quick in arriving. Kathleen Styles, the chief privacy officer for the Department of Education, asked the closing panel how many of them would like to see the same rule under FERPA as under COPPA. Response was tepid. More important to the panelists were better training for teachers and school leaders on the rules already in place; more standardized tools to help schools determine whether the programs they adopt comply with the regulations; and shared best practices for communicating with parents about how student data is used and what happens to it once the educational purpose has ended.

A webcast of the entire proceedings will shortly be made available.
