Using Data to Drive Educational Change: The Bad, The Good, and The You-Decide

"The heart of science is measurement. We're seeing a revolution in measurement, and it will revolutionize organizational economics...."[1] Erik Brynjolfsson, director of the Center for Digital Business at the Sloan School of Management at MIT.

Education is indeed a science — it is not a matter of opinion or belief. As such, measurement is at the heart of education. Easy to say, hard to do. Measure what? Measure how?  And, most importantly, how are the data to be used?

With good intentions, during the Bush Era, we went through a period where federal funding for an educational innovation would not be given unless there was scientifically rigorous, empirical data backing up the innovation. SBR, scientifically based research, meant controlled studies with randomized or quasi-randomized trials. For example:

"Test scores were not significantly higher in classrooms using the reading and mathematics software products than those in control classrooms. In each of the four groups of products-reading in first grade and in fourth grade, mathematics in sixth grade, and high school algebra-the evaluation found no significant differences in student achievement between the classrooms that used the technology products and classrooms that did not."[2]

SBR delivered study after study that said the use of computing did not improve student achievement.[3] Indeed, lest we miss the government reports, the New York Times beat us over the head with those data 8-9 times last year.[4] But for all the money that SBR studies cost to carry out, they provide precious little information: only that X is "better" than Y (or not). Thankfully, SBR-style studies are no longer ... in vogue.

Now for some measurements that do tell us lots of good stuff! Since 2003 Project Tomorrow has been carrying out yearly surveys of schools, parents, students, etc. Rather than looking for thumbs up/down data, Project Tomorrow is looking for trends. For example:

"Teachers are increasingly interested in leveraging technology for activities with students and many are modifying their instructional plans to incorporate more digital experiences.  Nearly a majority of classroom teachers (45 percent) noted in 2012 that they were creating more interactive lessons because of having access to technology, an increase of 25 percent in just the past two years."[5]

And now a new form of measurement has come to the fore: learning analytics. Since students are using computing devices for almost the whole school day, the "clickstream" of data generated as a student interacts with a computing device can paint a detailed profile of what each student is doing, and collectively, what students are doing, as learning, ostensibly, takes place. In contrast to the limited information that SBR-type studies can provide, these clickstream learning analytics apparently can be used to solve almost all of education's ills:

"Using analytics to reduce drop-out rates, improve academic results and engage parents and students is transforming K-12 education. Putting information to work to improve day-to-day decision making by teachers, administrators and even students, is the next wave of K-12 innovation."[6]

For example, the Carpe Diem schools are using these clickstream data to drive adaptive computer-based instruction:

"Carpe Diem has two schools, one in Yuma Arizona and one in Indianapolis that have captured the attention of the nation.  It uses a combination of adaptive software, tutoring, group projects, and classroom instruction to provide highly individualized education for each student at an astonishingly low cost $5597 per student annually (compare to Texas per pupil funding of approximately $7000-7500).  It has been rated among America's best high schools by Business Week and US News & World Report."[7]

Sounds great! You decide — after taking a look at the picture posted on the Hechinger Report Web site and watching a Carpe Diem video[8]. As one education blogger commented: "They had 240 students working on computers when I walked in, and you could have heard a pin drop."[9]

Measure what? Measure how? And, most importantly, how are the data to be used? Education will change as a result of these new forms of measurement; no question about that. For the better? "Aye, there's the rub."[10] (Shakespeare, 1604)

