4 Steps to Evaluating ESSER Program Effectiveness
Sound data management practices are critical for complying with federal reporting rules.
- By Delonna Darsow
- 02/23/22
The $190 billion in Elementary and Secondary School Emergency Relief (ESSER) funding approved by Congress since March 2020 gives K-12 school systems much-needed federal aid, but it also comes with key challenges.

For example, districts are being asked to submit reports on the outcomes of the investments they make with this money, a best practice that many have not previously undertaken. These reporting guidelines will require careful data collection, management and evaluation.
A significant portion of the pandemic relief aid must be used to implement evidence-based programs and interventions that attend to students’ academic, social and emotional needs. Under guidance issued by the U.S. Department of Education, state and local education agencies must be prepared to report on matters such as:
- What programs were purchased using ESSER funds?
- How many students (broken out by various demographic groups) used these programs?
- Were these students enrolled in remote, in-person or hybrid instruction?
- What were the average and standard deviation of the growth rate for students participating in each intervention? (A quick way to compute these figures is sketched below.)
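That last question comes down to basic descriptive statistics, but it assumes clean, student-level records. The short Python sketch below is one minimal way to produce those figures, assuming a hypothetical export named growth_rates.csv with one row per student and placeholder columns intervention and growth_rate; it is an illustration, not an official reporting format.

```python
# Minimal sketch: average and standard deviation of growth rate per intervention.
# Assumes a hypothetical CSV export named growth_rates.csv with columns
# "intervention" and "growth_rate" -- placeholder names, not an official template.
import csv
import statistics
from collections import defaultdict

rates_by_intervention = defaultdict(list)

with open("growth_rates.csv", newline="") as f:
    for row in csv.DictReader(f):
        rates_by_intervention[row["intervention"]].append(float(row["growth_rate"]))

for intervention, rates in sorted(rates_by_intervention.items()):
    mean = statistics.mean(rates)
    # Standard deviation needs at least two data points.
    stdev = statistics.stdev(rates) if len(rates) > 1 else 0.0
    print(f"{intervention}: n={len(rates)}, mean={mean:.2f}, stdev={stdev:.2f}")
```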
To conduct the robust evaluation required to report on student outcomes in such detail, school systems will need to prioritize data management across all sites. Here are four important steps that K-12 leaders should take to do this effectively.
1. Prepare a list of all intervention programs purchased with pandemic relief funds.
If your ESSER spending is centrally managed, this process will be much easier. If you use a distributed model in which various schools and departments are responsible for making their own purchasing decisions, you’ll need to coordinate among all departments to make sure your list is comprehensive.
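In a distributed model, even a small script can help consolidate what each department turns in. The sketch below is only an illustration under assumed conventions: each department drops a CSV export into a shared folder (esser_purchases/) with placeholder columns program_name and department, and the script merges them into one de-duplicated, district-wide inventory.

```python
# Minimal sketch: merge per-department purchase exports into one program inventory.
# Folder name, file layout and column headers are hypothetical placeholders.
import csv
import glob

inventory = {}

# Each department drops a CSV (e.g., purchases_curriculum.csv) into one folder.
for path in glob.glob("esser_purchases/*.csv"):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # De-duplicate on program name so the district-wide list stays comprehensive.
            inventory.setdefault(row["program_name"], row.get("department", "unknown"))

with open("esser_program_inventory.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["program_name", "department"])
    for program, department in sorted(inventory.items()):
        writer.writerow([program, department])
```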
2. Determine what measures you’ll use to evaluate the progress of students taking part in these programs.
Make sure the methods you use are valid and reliable. The indicators you choose will depend on the outcomes you’re hoping to achieve. For instance, if you invest in an intervention designed to increase student engagement, you might use attendance rates to measure outcomes. If you invest in a math intervention program, you’ll need to measure the academic growth of all student participants using a suitable math measure.
3. Set up a process for collecting pre- and post-intervention data and start monitoring progress for all students participating in the program.
Depending on how your state has chosen to implement the federal government’s reporting guidelines, you might need a system for tracking the progress of students enrolled in these interventions, including the number of sessions and/or minutes that students spend participating. But even if your state doesn’t require this degree of specificity in reporting on ESSER uses and outcomes, best practices for using data in education would call for collecting this information. That way, you’ll have the insight you need to understand and monitor whether the interventions are working as intended, and whether you need to make any changes to core instruction.
If the intervention program tracks this information internally, make sure the data can be exported in a format that’s usable for evaluation. If it does not, identify a simple, low-barrier process for collecting the data.
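One low-barrier option is a plain CSV participation log that staff can maintain by hand or populate from whatever the program exports. The Python sketch below shows one possible structure under assumed, hypothetical field names (pre_score, post_score, sessions, minutes); the two sample rows are placeholders, not real student data, and the simple post-minus-pre growth figure is just one of several ways a district might define growth.

```python
# Minimal sketch of a low-barrier participation log, assuming the intervention
# itself does not export this data. Field names are hypothetical placeholders.
import csv
from dataclasses import dataclass, asdict

@dataclass
class InterventionRecord:
    student_id: str
    intervention: str
    pre_score: float
    post_score: float
    sessions: int
    minutes: int

    @property
    def growth(self) -> float:
        # Simple pre/post difference; a district may prefer a normed growth metric.
        return self.post_score - self.pre_score

# Illustrative placeholder rows -- not real student data.
records = [
    InterventionRecord("S001", "Math Boost", 412.0, 438.0, 24, 720),
    InterventionRecord("S002", "Math Boost", 398.0, 415.0, 18, 540),
]

with open("intervention_log.csv", "w", newline="") as out:
    writer = csv.DictWriter(out, fieldnames=list(asdict(records[0])) + ["growth"])
    writer.writeheader()
    for record in records:
        writer.writerow({**asdict(record), "growth": record.growth})
```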
4. Consider the tools you’ll use to collect and manage student progress information.
Data collection and management are critical, yet often time-intensive practices. To streamline these processes, school systems can implement a data management platform that’s built to optimize program evaluation, making ESSER reporting much more manageable. Look for a platform that can help you:
- Measure intervention outcomes at the student, group, grade, school and district levels.
- Recognize differences across student groups, so you can easily evaluate the effectiveness of the interventions in which you’ve invested.
- See all student data in one view to accurately identify students in need of support.
- Identify average growth rates for students in each intervention based on their participation.
- Use a customizable bank of research-supported interventions to match instructional supports to students’ needs.
- Monitor student progress and document the adjustments you make to maximize student growth.
Evaluating and reporting on the use of pandemic relief funding can be challenging. But with the right tools and strategies, K-12 leaders can effectively meet the federal reporting requirements while positioning their schools for success.
About the Author
Delonna Darsow, PhD, is the Product Champion for Sourcewell’s school technology division.
Proliftic, by Sourcewell, is an evidence-based data integration platform that provides intuitive reporting to help K-12 districts monitor student progress, identify gaps in learning, match students with effective interventions and evaluate program effectiveness.