App Security: Yet One More Thing Teachers Are Being Held Accountable For

By and large, districts make decisions about the choice of textbooks; districts and schools make decisions about the choice of curricula; and districts and schools make decisions about the choice of hardware.

But teachers are typically left to figure out how to integrate that technology into those curricula and textbooks. Elsewhere in this blog, we have argued that identifying appropriate software and/or integrating that software into the classroom is a significant burden, a task for which teachers have virtually no preparation.

Well, a teacher’s burden just got bigger!

The New York Times reports that, “Scores of education technology start-ups … are marketing new digital learning tools directly to teachers.” Not surprisingly, teachers are trying out these new apps. But, in addition to deciding how to use an app with their students, a teacher now needs to figure out if that app provides adequate privacy and/or security for their students’ data.

Teachers who are trying “adaptive technologies” — software that tries to determine what problems/information each child should receive — are potentially exposing their students to privacy and/or security violations. That is, these personalized learning apps collect data about the student and use that information to decide what a student should do next. But, the Times reports, “…some districts have experienced data breaches … in a few cases, student records have been publicly posted on the Internet and online security researchers have discovered weaknesses in a couple of dozen popular digital learning services.”

The article also notes, “When it comes to privacy and security, it’s a little unfair to put the burden on the teacher…” A “little unfair”? A more accurate description comes elsewhere in the article: “Guarding against the potential pitfalls — data breaches, identity theft, unauthorized student profiling — is a Herculean endeavor.”

Teaching 30 children is a Herculean task if ever there was one; on top of that, we have now burdened the classroom teacher with making decisions about software security and privacy.

Really? REALLY?

“Administrators … want teachers to have free access to the best learning apps.”

Yes, teachers should always be allowed to bring in materials, but the school/district needs to give support and guidance. However, as noted above, while districts and schools have procedures to make decisions about curricular materials, they often leave technology decisions up to the classroom teacher.

Data privacy and security are serious, serious issues. Look at what happened to Target — and to the administrators who were ostensibly responsible for protecting their customers’ privacy/security. With current policies putting decisions about selecting apps on the backs of teachers, it is only a matter of time before a parent sues a teacher for allowing their child’s data to be published on the Internet.

Really? REALLY!

About the Authors

Cathie Norris is a Regents Professor and Chair in the Department of Learning Technologies, School of Information at the University of North Texas. Visit her site at www.imlc.io.

Elliot Soloway is an Arthur F. Thurnau Professor in the Department of CSE, College of Engineering, at the University of Michigan. Visit his site at www.imlc.io.

Find more from Elliot Soloway and Cathie Norris at their Reinventing Curriculum blog at thejournal.com/rc.
