What Is COPPA?

Learn what teachers need to know about this important law.

The Children's Online Privacy Protection Act, more commonly known as COPPA, is a federal law governing how websites, apps, and other online operators collect data and personal information from kids under the age of 13.

COPPA basically says — among other things — that tech companies making apps, websites, and online tools for kids under 13 must:

  • Have a "clear and comprehensive" privacy policy.
  • Get parental consent before collecting information about kids.
  • Not use kids' data for marketing-related purposes.

For a more detailed yet still accessible overview of the law, be sure to check out EdWeek's "COPPA and Schools: The (Other) Federal Student Privacy Law, Explained." The article also gets into the somewhat confusing and contentious issue of whether schools can stand in for kids' parents in giving consent. In short, schools can grant COPPA consent if (here's the tricky part) the tool is used specifically for an educational purpose, and it can often be hard to tell what's specifically educational and what isn't.

Beyond COPPA's parental consent issue, it's important to know that even though the law specifically regulates technology companies, teachers and schools aren't off the hook when it comes to understanding the law and its intent. Here's why:

COPPA was originally enacted in 1998, nearly 20 years ago! Technology has changed a lot in that time, and the technologies kids use both on their own and in school are no exception.

Many edtech companies and websites tout a seal of COPPA compliance as part of their marketing. But in some cases, compliance might depend more on how teachers and students actually use the tool at the classroom level. Some edtech developers even get creative in how they highlight their compliance with the law; in certain cases, this essentially means shifting the responsibility for COPPA compliance, and for obtaining parental consent, directly onto schools and individual teachers.

Innovative teachers, many of whom tend to be early adopters of new tech, are likely to try out tools that weren't made specifically for kids or with educational use in mind. Along with innovative teaching comes the responsibility to understand how our students' data is collected and used.

What can teachers do?

1. Know your school's policies on adopting new technologies and follow them. Does your school or district have an approved list of apps and sites for student use? Chances are, students' data privacy issues were a big part of the decision to approve — or not approve — a tool.

2. Choose your classroom tech wisely.

  • Stick to tools designed with education in mind, especially if kids are going to sign up and create accounts.
  • When you bring new tech into your classroom, be mindful about how the tools ask kids to sign up, enter personal information, or share anything online.
  • Always get parental consent first — send a note home to ask for permission.
  • Avoid apps, games, or websites that seem focused on advertising.
  • Be cautious with tools that claim to be for education, but are also aimed at consumers or the business world.

3. Not sure about a technology tool? Our Privacy Evaluations (privacy.commonsense.org) can help! We have evaluations of many of the most popular edtech tools that identify privacy risks in ways that are easy to understand.

About the Author

Jeff Knutson is senior editor of education reviews for Common Sense Media.

Common Sense Education helps educators find the best edtech tools, learn best practices for teaching with tech, and equip students with the skills they need to use technology safely and responsibly. Go to Common Sense Education for free resources including full reviews of digital tools, ready-made lesson plans, videos, webinars, and more.

