
Guidelines to Help Prove Learning Impact in the ESSA Era

With No Child Left Behind, it was "scientifically based research." Now with ESSA it's "evidence of impact." Both phrases refer to stipulations that schools, districts, states and education technology vendors prove that a given technology or practice has true, measurable merit.

To help those immersed in the challenge of showing effectiveness or efficacy during an ed tech adoption, the education arm of the Software & Information Industry Association (SIIA) has supported the development of a set of guidelines. Intended for K-12 ed tech vendors, educators and the research community, the recommendations cover four phases of adoption: getting started, designing the research, implementing the design and reporting the results. The new document, produced by Empirical Education for the SIIA's Education Technology Industry Network, is an update of the guidelines originally published in 2011.

The guidance offered in "Guidelines for Conducting and Reporting EdTech Impact Research in U.S. K-12 Schools" specifically fits scenarios where student and school data will be systematically collected and where standards defined by ESSA apply, with an emphasis on measuring the impact of the ed tech product on student outcomes. The idea is for ed tech companies to help their district clients come up with the evidence they need to quantify the value of the software tools they're being sold.

While much of the advice is highly technical ("Identifying the study's unit of implementation and unit of analysis is fundamental to how products are tested and what conclusions to draw...."), it's also couched in terms understandable by non-researchers ("Decide who is being tested: students, teachers, schools or a combination").

"In light of the ESSA evidence standards and the larger movement toward evidence-based reform, publishers and software developers are increasingly being called upon to show evidence that their products make a difference with children," said guidelines peer reviewer Robert Slavin, director of the Center for Research and Reform in Education at Johns Hopkins University, in a prepared statement. "The ... guidelines provide practical, sensible guidance to those who are ready to meet these demands."

The guidelines can be downloaded from the Empirical Education website.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
