What to Expect on Student Privacy for 2020

Even though the Federal Trade Commission (FTC) held a workshop last October to examine "the future of the COPPA rule," we probably shouldn't expect any changes to it in 2020, according to the Future of Privacy Forum. In a December press briefing, Amelia Vance, the director of youth and education privacy projects, suggested that the process for updating the Children's Online Privacy Protection Act, which was enacted in 1998 and went into effect in 2000, could require more time to sort out than the current FTC commission has — depending on how the presidential election swings in November.

Where COPPA Might Be Heading

The COPPA rule gives parents control over what information websites and online services can collect from their kids under the age of 13. The purpose: to protect children's privacy and safety by preventing the collection of personally identifiable information from children online without their parents' permission.

The last time COPPA was amended was in 2013, when the FTC updated the definition of personal information to incorporate geolocation data along with photos, videos and audio files that contain a child's image or voice. Those changes also regulated the use of persistent identifiers that can recognize users over time and across different websites and online services. They also explicitly covered sites and services that relied on outside services, such as plug-ins and advertising networks, that could be collecting data for behavioral advertising.

According to Vance, the October workshop sought input on whether to update COPPA yet again to address changes in the online landscape, such as the growth of Internet of Things devices, social media and educational technology, and the continued proliferation of websites hosting third-party "child-directed content." The FTC also solicited public comments beginning in July 2019, generating some 66,000 comments by the time the comment period closed in December.

Now, explained Vance, the process has to unfold. That means the commission will have to review those comments, put together a report that summarizes them, make the case for changes to the rule, take additional comments on the report and then put out its own set of proposed changes — "at which point, politically, we're likely near the presidential election, and you may have some turnover as a result of that election with commissioners and others."

Even as that process advances at a glacial pace, Vance suggested that we might see "potential federal amendments to COPPA" from Congress as an alternative.

One model comes from California, which recently enacted its own Consumer Privacy Act; the law went into effect this month. "It's the first state to pass a comprehensive privacy law," said Stacey Gray, FPF's senior counsel, adding that because California has the fifth largest economy in the world, "it becomes our de facto national law."

The California law "not only expands privacy protections to minors under the age of 13 but also creates additional protections for children aged 13 through 16, who must affirmatively consent rather than opt out of the sale of their personal information," explained Vance. She cited Europe's General Data Protection Regulation (GDPR), which extended its privacy protections for children to the age of 16, and the United Kingdom, which required its Information Commissioner's Office to create an "age-appropriate design code" that would apply to youth up to the age of 18.

"There's definitely a lot of forward momentum to change...child privacy, and we're likely to continue to see calls for increased privacy protection in the US and abroad," Vance said.

Facial Recognition Backlash

Over the last five years, 40 states have already passed more than 130 laws regarding student privacy. So, it's unlikely that "we'll see too many new student privacy bills in 2020," said Vance. However, she does expect the 10 states without such regulations "to at least introduce legislation" and some that do — such as New York and Ohio — to expand their laws.

Vance also predicted rising attention from state legislators to the use of facial recognition in schools. In one case, the New York Assembly passed a bill in 2019 that would have put a moratorium on the use of biometric identifying technology in schools — largely prompted by the backlash at Lockport City School District when it began implementing AEGIS, an object and facial recognition technology. That bill "didn't have time to get through senate approval," Vance noted. She predicted that the same legislation would surface again this year, "and possibly come up in multiple states as more and more districts are adopting facial recognition and parents are reacting, often negatively."

More broadly, Vance said, there was an increase in state safety legislation in 2019 that mandated the use of technology for school safety, including surveillance technologies like facial recognition. "Unfortunately," she pointed out, "most of those bills don't include privacy guardrails." As a result, she would expect lawmakers to continue finessing their rules as schools adopt technologies pushed in those state safety bills.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
