FERPA Was Written for File Cabinets, Not Cloud Servers
- By Thamir Aljobori
- 04/14/26
Every morning, my 2nd-grade student Yaqeen logged into a literacy app on a classroom tablet. The program read stories aloud, highlighted letter sounds, and adjusted difficulty based on her responses. I watched her confidence grow as she recognized patterns and built early reading skills. When she moved up a level, she clapped her hands and said, "I did it!"
In my classroom, digital tools do more than support literacy. I use an adaptive math platform that analyzes student responses and adjusts difficulty in real time. I can track reductions in error patterns and measurable gains in fluency within weeks. I also use AI-supported translation tools, so multilingual students can access instructions in their home language while building academic English. For teachers like me, tools like these feel purposeful and powerful.
Yet later that same week, I noticed the app asking for permission to track location data. I couldn't explain why. The app was district-approved, widely used, and labeled "FERPA compliant." But nowhere did it say who could access the data, how long it would be stored, or whether it would be used beyond instruction. That's when the unease set in.
The learning the app afforded Yaqeen made sense. The data collection did not.
That tension captures the problem with the Family Educational Rights and Privacy Act, or FERPA. Passed in 1974, it was designed to protect paper records locked in filing cabinets. It was never meant to govern cloud-based platforms, artificial intelligence, or the invisible flow of student data across third-party vendors.
Today, nearly every classroom relies on digital tools. The average U.S. school district uses more than 2,000 education technology applications a year. In fact, children today generate more data before middle school than previous generations produced in a lifetime. That information can follow them across grades and districts, often beyond FERPA's reach.
Each digital record can include login data, behavior patterns, academic performance, and even engagement metrics. That information does not stay in the classroom. It flows through private companies operating under vague federal definitions written decades before machine learning existed. If educators cannot see how student data is collected, stored, and shared, then strong laws governing how companies collect and use that data become essential.
Teachers are not privacy experts, yet we are often the ones parents turn to for answers. I've signed dozens of vendor agreements that include broad language such as "data may be used for product development" and "anonymized information may be shared with affiliates." Those phrases sound harmless, but they rarely specify retention limits, AI training use, or clear opt-out procedures. I rely on district approval lists, but even those rarely clarify whether data trains AI systems or is retained beyond the school year.
Our students deserve better. A 21st-century FERPA should rest on three pillars: consent, encryption, and control.
- Digital consent and AI transparency. Parents and students should specifically grant permission before companies use educational data for product development, AI training, or third-party sharing. That consent must be front and center in plain language, not buried in legal text.
- Cross-platform security. Student information now moves across devices, school systems, vendors, and third-party partners. A national standard should require end-to-end encryption and mandatory disclosure of breaches. The current system, where vendors "notify at their discretion," leaves families unprotected for months after leaks occur.
- Parent control dashboards. Families should be able to see who holds their child's data, what categories exist, and how to delete or export them. If parents can track an Amazon package in real time, they should be able to track their child's digital footprint.
Updating FERPA is foundational. Congress can establish a national privacy floor, require independent audits of education technology vendors, and penalize companies that misuse or sell student data. Without federal clarity, states and districts will continue writing patchwork policies that leave families uncertain and teachers scrambling for answers.
With protections in place, technology would remain a tool rather than a risk. I would still celebrate Yaqeen's growth in reading. But I would do so knowing her data is protected by law, not by assumption. Innovation and privacy would no longer compete in my classroom. Consent would be clear. Security would be guaranteed. Standards would be enforced.
Tomorrow morning, Yaqeen will log into her literacy app again. I will celebrate her growth in unlocking words and understanding what they mean. But I will also wonder where her data travels once the lesson ends. That uncertainty should not fall on teachers or families. It should be resolved by law.