K–12 Data Privacy During a Pandemic

K-12 privacy expert Amelia Vance shares the four questions that still matter: What data is being collected? Who has access to it? How will it be shared? And how will it be protected?

The Future of Privacy Forum has long helped educators understand the privacy issues involved in the technologies they rely on in their districts and schools. Recently, Senior Counsel and Director of Youth & Education Privacy Amelia Vance talked with THE Journal about the new kinds of concerns that have cropped up in an environment where virtual teaching and learning have become the de facto form of education.

The interview has been edited for clarity and brevity.

THE Journal: Let's just talk about the disclosure of health information. What are the main aspects to keep in mind there as schools are putting together their plans and policies?

Amelia Vance: When it comes to reopening plans, we've seen a lot of schools pick up the [Centers for Disease Control and Prevention's] guidelines or similar guidelines and try to implement them and build them into their reopening policies, which makes a lot of sense. These are the health experts, right? But they're not necessarily drilling down a level and thinking about what it means when you collect health information on a regular basis or when you require students or parents or school staff to enter how they're feeling every day or their health symptoms.

What information are you requiring people to turn over? Have you built a system that incentivizes people to tell the truth? Are they going to trust that the information you're collecting is just to prevent COVID-19 infections, that it's not going to be used to disadvantage their child or harm or stigmatize their family within the community? Are you keeping in mind that many symptoms similar to those of COVID-19 can indicate pretty sensitive things? A high temperature could indicate that someone recently had surgery, that they might be pregnant or that they might have a sexually transmitted infection.

What do you suggest they do?

There's no super special formula here. It really goes back to the central principles of privacy and fairness that have been around since at least the 1970s--and many of them way before then.

The Fair Information Practice Principles, which are codified in part in the Family Educational Rights and Privacy Act and adopted by pretty much every federal agency, many state agencies and many international laws, provide a basic framework for data governance. They say four things:

  • That you should be transparent about what information is collected;

  • That you should make sure that information collected for one purpose will not be used for another purpose;

  • That there should be adequate security and that only the people who need access to the information will get it; and

  • That only the information that you need will be collected in the first place.
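Those four principles map naturally onto a simple data inventory. As a rough sketch only--not an official tool, and with every field name and value assumed for illustration--a district could track each data element it collects like this:

```python
# A minimal sketch of a data inventory keyed to the four principles
# above. All names and values here are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class DataElement:
    name: str                 # transparency: what is collected
    purpose: str              # purpose limitation: the one stated use
    authorized_roles: list[str] = field(default_factory=list)  # least access
    retention_days: int = 30  # minimization: keep only as long as needed

inventory = [
    DataElement(
        name="daily symptom check",
        purpose="COVID-19 screening only",
        authorized_roles=["school nurse"],
    ),
]

# Publishing the inventory doubles as the transparency notice.
for e in inventory:
    print(f"{e.name} -> {e.purpose}; access: {e.authorized_roles}; "
          f"retained {e.retention_days} days")
```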

There are plenty of excellent resources for schools to adopt if they don't already have a data governance plan. The National Forum on Education Statistics just released a fabulous new report on data governance that features case studies from state and local education agencies.

There are so many materials that schools can use. And it's not so much about doing a detailed analysis. It's about asking the questions in the first place and just going that level deeper as schools are creating their plans that hinge on new data collection or adoption of technology: What data? How is it going to be used? How will it be protected? How will it be shared?

That in itself helps to build a policy, helps schools decide what should or should not happen, and helps you gain that trust from parents and students so you get the information you need to keep the community safe.

Much of what you're explaining here has been needed for a long time, but now there's this extra level of technology tracking. Schools are talking about taking student and staff temperatures as they enter the building or running their own contact tracing systems. Can you bring us up to date here about what schools should consider as they update their plans to take those kinds of things into consideration?

That is where it starts to get more complicated. But those underlying principles, those underlying questions that need to be asked and written into a policy, remain the same.

When it comes to something like thermal scanning or temperature checks, especially with a lot of the technology that's spamming administrators' inboxes right now, there should be initial threshold questions:

  • Does it work?

  • What is the benefit?

  • What evidence-based studies or information is there that this technology works?

There have been plenty of articles, for example, on thermal scans and the fact that they may not actually provide the information needed, or may provide too many false flags to help with diagnosing cases of coronavirus, because you're not measuring people's core temperature.

You're going to end up having students show high temperatures who just worked out; who have higher temperatures due to a different sensitive medical condition; who have trouble, for whatever reason, regulating their temperature. They're going to show up on the heat map as red-colored blobs. And once you introduce this technology, someone is going to have to pull every one of those students or staff aside and ask them invasive questions about their health. Once you actually take a temperature in the traditional way, you may find that it was a false flag.

We've seen instances of students who have been bullied and families that have had graffiti put on their house when people think that they have the coronavirus. At the end of the day this may end up being more stigmatizing than it will be useful in actually finding out about cases.

If you really start with that threshold question--does it even work?--you can eliminate about 90 percent of all of the solicitations in inboxes. And then, from there, you go back to those basic principles of privacy.
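The false-flag problem behind that threshold question comes down to base rates. The numbers below are hypothetical assumptions chosen purely to illustrate the arithmetic, not the measured performance of any scanner:

```python
# Back-of-the-envelope illustration: when few students are actually
# infectious, even a modest false-flag rate swamps the true positives.
# Every number here is an assumption for illustration only.
students = 1000
infectious_rate = 0.005  # assume 0.5% of students are infectious
sensitivity = 0.80       # assume the scanner flags 80% of true fevers
false_flag_rate = 0.05   # assume 5% of healthy students flag anyway

true_flags = students * infectious_rate * sensitivity             # 4
false_flags = students * (1 - infectious_rate) * false_flag_rate  # ~50

share_real = true_flags / (true_flags + false_flags)
print(f"Flags per 1,000 students: {true_flags + false_flags:.0f}")  # 54
print(f"Share of flags that are real cases: {share_real:.0%}")      # 7%
```

Under those assumptions, more than nine out of ten flags point at students who just worked out or have some other unrelated reason for an elevated reading.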

In the spring, schools adopted technology willy-nilly just to deliver instruction. But people may not be so forgiving about that in the fall. What do teachers and school leaders have to keep in mind about the tools they adopt for instruction?

Schools haven't necessarily had the time or the money to prepare for this fall. There's no certainty, and many are trying their best to make in-person instruction happen. The pedagogical strategies and approaches and which tools to use--a lot of that changes when you're doing online learning versus in-person learning. Most teachers haven't been provided with paid professional development on this. So, if they've done any professional development on this at all over the summer, they've basically been working for free. Schools haven't had a robust approach to evaluating for the best, most secure and most privacy-protective online learning software. They've been hoping that things will reopen and that they're able to have a fairly normal or closer-to-normal school year.

We still have [a few weeks] before most schools open. I would hope schools are looking for opportunities to simplify what they're offering for online instruction and to try to mitigate as much as possible that "wild west" that was happening as many educators were adopting technology--perhaps for the first time--and almost certainly without any training in privacy or in education technology use or online learning in general. It really is important to start using the time we have now to figure out how to make those processes easier so there isn't a privacy backlash down the road.

What else is keeping you up at night on the education front?

One of the big things that I think a lot of folks aren't thinking carefully about yet is the privacy implications of the monitoring that, in many ways, has to happen in providing online education. This takes several different forms. How do you take attendance in a virtual environment? So many kids dropped off the map in the spring, and educators had no idea where they were, whether they were OK, whether they were safe--or, in cases where they had logged into a lesson, whether they then walked away to play a video game.

There are many financial incentives for schools to make sure that students are in attendance. There are equity requirements that schools serve students equitably, making sure as much as possible that there aren't disproportionate outcomes among students from different backgrounds. And there are reporting requirements related to how they're serving those students.

And yet we're still in the middle of the pandemic.

And so there was a ton of monitoring already happening in the spring. Pretty much all school-owned devices have some sort of monitoring software on them due to the Children's Internet Protection Act: If a school receives E-rate funding, it needs to monitor both the school's internet network and the devices provided to students to make sure that inappropriate content isn't being accessed.

That has ballooned since the law was passed in 2000. Then, monitoring meant the teacher looking over the student's shoulder in the computer lab. Today it means everything from light-touch software that sends an email to the principal when a student uses a swear word or looks up porn on a school device to intense levels of surveillance that monitor everything a student types and scan it using some sort of machine learning or artificial intelligence to try to detect bullying, suicidal ideation or school threats.
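To make the light-touch end of that spectrum concrete, here is a deliberately simplified sketch of keyword-based flagging. It is not any vendor's actual product; the word list, function names and notification step are all placeholders:

```python
# Simplified sketch of keyword-style monitoring: scan text a student
# types for a flagged-term list and notify an administrator on a hit.
# Real products are far more complex; everything here is hypothetical.
FLAGGED_TERMS = {"example-swear", "example-threat"}  # placeholder list

def scan_text(text: str) -> list[str]:
    """Return any flagged terms found in a piece of student text."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return sorted(words & FLAGGED_TERMS)

def maybe_notify(student_id: str, text: str) -> None:
    hits = scan_text(text)
    if hits:
        # A real system might email the principal; here we just print.
        print(f"ALERT: student {student_id} used flagged terms: {hits}")

maybe_notify("s123", "this is an example-threat sentence")
```

Even this toy version shows why false positives are inevitable: it has no sense of context, only of matching strings.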

Many districts already have this in place, scanning school devices but also, in many cases, school accounts--Microsoft Office 365 Education, Google's G Suite for Education, student email accounts, the documents students have stored in the cloud and the websites they visit while they're logged in.

Now that students are at home, they're obviously using these devices even more--especially students who may not have had their own devices before. Now, when they want to look up something online, participate in a forum for students who may be LGBTQ or look up safe sex--or even something innocuous like research on breast cancer, which some filtering software has blocked in the past--all these things could raise a flag.

For students who have their own personal devices, the school won't know about any of this. But students who are now relying on school devices are even more likely to be using that device often and to get a whole bunch of red flags from the monitoring software for the ways they may be using it--even though their peers are doing exactly the same things, just not on a school device.

There are also a lot of schools looking at using the learning analytics built into a lot of ed tech products, which weren't used much in K-12 schools before; they're used more often at the higher education level. Professors get reports from Blackboard or Canvas or any of the learning management systems saying, "John Smith has not logged on to get his homework in three weeks." It tells you little things: "Jane Smith was logged into the platform for three hours reading this assignment," or "Sammy took 20 minutes to take this test."
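Stripped of vendor dashboards, that kind of report is a simple query over login events. Here is a rough illustration only--the names, dates and threshold below are made up, and real platforms such as Blackboard and Canvas expose much richer data:

```python
# Minimal sketch of an LMS inactivity report: list students whose last
# login is older than a threshold. All data here is invented.
from datetime import date, timedelta

last_login = {
    "John Smith": date(2020, 7, 20),
    "Jane Smith": date(2020, 8, 10),
    "Sammy Lee": date(2020, 8, 12),
}

def inactive_students(as_of: date, threshold_days: int = 21) -> list[str]:
    cutoff = as_of - timedelta(days=threshold_days)
    return [name for name, seen in last_login.items() if seen < cutoff]

print(inactive_students(as_of=date(2020, 8, 14)))  # ['John Smith']
```

The signal is exactly as thin as the code suggests: a missing login says nothing about why the student is missing.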

Some of those things are really useful, especially in pointing out students who are falling behind or whom schools are having trouble reaching--the people who need interventions. But it is awfully easy for this to slide into creepy. A lot of these analytics aren't necessarily indicative of whether a student is participating in class or not.

And just as teachers and administrators haven't been well trained in privacy, they generally haven't been trained in how to use ed tech effectively. They may not understand that these are signals and not diehard evidence-based recommendations. All of a sudden, they're going to be diving into the deep end of using these analytics, so schools can report the things they're required to report.

At some point a parent or a student is going to notice when a teacher says, "Hey you're going to fail the assignment because you never logged into the learning management system," and all of a sudden, they'll realize, "You're tracking what?" That's a breach of trust. So maybe the parent decides it's easier to homeschool, or just not engage with the school this year because it's not going to be fair, or they'll transfer to a different school or whatever it may be. [Used poorly,] those analytics can undercut the trust between the school and parents and students that is essential, especially right now.

The only conversations I'm really seeing about this cover what data to collect and all of the different ways to measure the required things to report virtually, like attendance. It hasn't dug deep into the next layer: What shouldn't we collect? How do we make this transparent? How do we make sure that parents are on board? "Instead of in-person participation, we're going to have that part of your grade be you logging into Blackboard, pulling down the homework assignment or posting a question on the forum." Are you clearly laying out what students have to do to get credit, to get their grade, as opposed to relying on analytics to tell you things about the students that they don't even know they're supposed to do or were tracked on?

It's not that schools don't care about privacy, or that they are choosing not to talk about it. It's just that they don't know that they need to ask the questions in the first place. They're not--and shouldn't be--privacy experts. They need to be provided with professional development, with support, with plug-and-play model documents. And, unfortunately, funding is being cut from schools as opposed to provided, as uncertainty continues. So, we're really relying on hope that teachers and administrators will educate themselves in their spare time--unpaid--about this thing that isn't even on their priority list.

Your organization has launched a rebranded education-specific site: Student Privacy Compass. And you've been publishing some new resources this summer to address student privacy in the pandemic. Tell us about it.

We already have several training videos and activities for educators that we've tried to keep fairly short. We also have them in a playlist on YouTube; we're happy to email people if they need access, or even mail them a thumb drive. There are also slides, activities and resources related to each topic that educators can use to educate themselves or to give presentations on privacy to the teachers they're working with, administrators or the school board.

Student Privacy Compass has also published two new briefs, one on increased data collection and sharing and the other on thermal scans and temperature checks, intended to provide brief explanations for educators of the issues involved in these subjects.

Our hope is that by providing resources that people can watch in the five minutes they have for lunch or in between other things, they can start to build up a base of knowledge that will help them not only deal with these concrete, really sticky issues around the pandemic but also with all the future technologies and issues that are sure to come their way as education continues to evolve.

Find more at Student Privacy Compass, including a list of COVID-19-specific resources.
