Data Systems

Swimming With Data

How much data does a district really need to make informed decisions about student achievement? Not that much, it turns out, but districts likely can’t do it alone.

It’s not exactly a well-kept secret: Despite the drumbeat for data that has echoed through the jungles of education for over a decade, its nuanced use to make decisions for students is rare. And even though state and national longitudinal data initiatives from the US Department of Education have increased the pressure, relatively few schools or districts are actually spending the time or resources to work with the available data to pinpoint areas for student improvement.

"Most other professions have it down," says Andrew Tolbert, long-time superintendent of Warren School District in southern Arkansas. "If I go to the doctor for a headache, he doesn’t treat me for bronchitis. The data tells you what the problem is. [Yet] if Johnny can’t read, why is he being taught the same stuff?"

The problem is not a lack of data, says Kathleen Barfield, chief information officer at Edvance Research. "There’s a public perception that schools are swimming in an ocean of data," she says. "They may be, but the data isn’t in formats or systems that make it easy for them to actually take it out and use it."

Furthermore, despite that ocean of data, there may be only a few key pieces of information that a district needs to really put data-driven decision making to work. How does a district find and apply the appropriate data?

The simple answer is: not on its own.

Many districts in the REL Southwest region have discovered the advantage of tapping the expertise of an organization like an REL that can collect and massage data, marry it to appropriate research practices, and then help districts apply it to their own needs.

In other words, with the help of REL Southwest, districts in Texas and Arkansas--and now Louisiana--are learning to swim with data.

One Indicator at a Time
Edvance Research, based in San Antonio, is one of 10 Regional Educational Laboratories, or RELs, funded by the US Department of Education's Institute of Education Sciences to serve educational needs within specific regions. Edvance has a five-year contract (ending this year) to run REL Southwest, serving New Mexico, Texas, Oklahoma, Arkansas, and Louisiana.

Barfield says the focus at REL Southwest, and "just about every large education organization in the country," is to figure out how to make available data--whether in state or local systems--understandable to districts and schools. "It’s not just about longitudinal data," she says, referring to multiyear collections of data accumulated within state data reporting systems. "It’s about all the data that districts and states have available, and how to make the best use of that in trying to address the problems of practice schools are facing."

REL Southwest's challenge is to give districts an entry point into the data analysis process that helps them move forward with it. The agency found one leverage point in a 2005 report from the Consortium on Chicago School Research. CCSR identified a key indicator for on-track high school graduation: students who graduate on time have enough credits by the end of the ninth grade to be promoted to the 10th grade, and have not failed more than one semester of a core subject area. This one indicator, says the Chicago group, "is a better predictor of high school graduation than eighth-grade test scores or students’ background characteristics."
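The CCSR rule boils down to two conditions that a district can check against its own records. A minimal sketch, using hypothetical field names and credit thresholds (each state and district defines promotion requirements differently), might look like this:

```python
# Sketch of CCSR's on-track indicator; field names and the credit
# threshold are illustrative assumptions, not CCSR's exact schema.
# A ninth grader is "on track" if they (1) have earned enough credits
# to be promoted to 10th grade AND (2) have failed no more than one
# semester of a core subject (English, math, science, social studies).

def is_on_track(credits_earned: float,
                credits_needed_for_promotion: float,
                core_semester_failures: int) -> bool:
    """Return True if the student meets both on-track conditions."""
    return (credits_earned >= credits_needed_for_promotion
            and core_semester_failures <= 1)

# Example: flag at-risk ninth graders in a small cohort.
cohort = [
    {"id": "A1", "credits": 6.0, "needed": 5.0, "core_fails": 0},
    {"id": "B2", "credits": 4.5, "needed": 5.0, "core_fails": 0},
    {"id": "C3", "credits": 5.5, "needed": 5.0, "core_fails": 2},
]
at_risk = [s["id"] for s in cohort
           if not is_on_track(s["credits"], s["needed"], s["core_fails"])]
print(at_risk)  # → ['B2', 'C3']
```

The appeal of the indicator is exactly this simplicity: two fields most student information systems already track, applied at the end of one pivotal year.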

After reading REL Southwest’s January 2011 report on the Chicago findings, several Texas districts asked the agency to see if the research had any relevance to their own students. A team of Edvance researchers examined five districts’ historical data and confirmed that on-time graduation rates were indeed higher for students who met the on-track requirements at the end of the ninth grade than for students who were off track. This applied to students across all racial and ethnic groups, suggesting that if school personnel intervened in that crucial year, they could make a difference in whether a student graduated on time.

REL Southwest prepared the data and then trained cross-functional teams in the participating Texas districts on how to apply those indicators to their own data. It was this work that attracted the attention of educators in Arkansas.

A Capacity-building Model
The Warren School District is an average-sized district for Arkansas, with 1,530 students in five schools divided by grade level. As members of the REL Southwest board, Superintendent Tolbert and Luke Gordy, executive director of the Arkansans for Education Reform Foundation, learned about the on-track graduation work being done in the Texas districts at a meeting held in that state.

"That was a flag for me," Tolbert says. "We had some of the same issues--kids not getting out on time, or not finishing at all and dropping out."

The dropout problem was especially acute in the Delta Region, which abuts the Mississippi River in the eastern part of the state. Tolbert and Gordy approached their REL Southwest peers and suggested the organization do the same kind of training for its Delta districts as it had in Texas.

Edvance turned to the Southeast Arkansas Education Service Cooperative, a regional service provider, to make contact with Arkansas districts that might have an interest in being included in the new project. Nine signed on for the training, sending superintendents, data analysts, and others charged with district reporting work.

With their work in Texas and now in Arkansas taking place across a consortium of districts, the analysts at Edvance realized that they were not just disseminating research, but creating a capacity-building model to help district personnel strengthen their understanding of performance management and their use of indicator data to make decisions.

REL Southwest is still writing up documents to describe the On Track Indicator (OTI) model, says Barfield, but she can point to three tools developed along the way that can help people make effective use of OTI and other early warning indicators.

The first is a data analysis tool that allows users to examine their data in the form of indicators on a screen, drill down into the data, and create reports. The second is a clearinghouse that links the indicators to interventions. For example, if a school is having problems with students failing ninth-grade algebra, the clearinghouse offers a list of interventions appropriate to that subject area. The third tool brings the data together with the intervention to establish an implementation plan.

Getting Usable Data
The toolkit is clearly valuable, but it doesn’t address a persistent problem in most districts: How do you get the right data in a form that you can use? Like other states, Arkansas has a website--APSCN, or the Arkansas Public School Computer Network--where users can download a multitude of education-related statistics. But those statistics didn’t necessarily come in a form that could be used by districts for this purpose.

That’s where the value of an organization like REL Southwest really shines. REL worked directly with the state agency in charge of data to come up with the information needed to create an on-track indicator that would work for those Delta participants.

Trying this on their own probably would have paralyzed many of the districts. REL, for instance, had to provide the agency with the field layout for the project. Then it had to clarify in a follow-up meeting the agency’s understanding of what it needed. Once REL received the data, it had many subsequent conversations with the people in Arkansas to assure they understood the data they had received. REL worked with the districts in making modifications to the analysis, which required still more data from the state. Finally, the agency could do its calculations on the data, share the results with the districts and state agencies, and come up with a template approach to enable the districts to do their own work.

Barfield says that’s par for the course in a data project like this. "Because you have to get down to the real nitty-gritty, the exact data elements you need, sometimes it takes a fair amount of communication back and forth to get the right data," she says.

Now that this has been sorted out, however, the state can take the formulas set up by the REL and dish them up to all the districts in the state, not just those in the Delta region. "To me that’s one of the best outcomes of the effort," Barfield states. "We built the capacity at the state level to understand what [districts] needed for this on-track indicator, as well as the capacity to provide that report now statewide."

That capacity building included getting districts together to share practices for using data, something Tolbert says hadn’t happened before in his region. "We haven’t asked for it," he says. "We all know there’s a problem with graduation rates, but sometimes you have some best kept secrets."

The meetings REL held in Arkansas allowed the researchers to walk district personnel through a hands-on exercise to use on-track indicators to identify those ninth-grade students who would be at greatest risk of not graduating on time.

While a hands-on exercise with a couple of variables may sound simple enough, the step-by-step process was a bit of a revelation to Tolbert. "I don’t get to deal with that hands-on stuff," he confesses. "It gave me the opportunity to see what exactly a counselor or teacher would need to look at and do to help kids. That was really a big deal. I think it’s going to help our district."

The Last Mile
Clearly this kind of knowledge is invaluable for district leaders, but if the information and analysis skills aren’t extended to their staffs, the effort dies. Tolbert fully intends to share what he learned about on-track indicators with his high school principal and counselor, but at the time of this writing, that hand-off hadn’t taken place. The delay points to a gap in the model: the RELs may not be the best operatives for handling the "last mile" effort--making sure all the data practitioners at individual schools get the information they need to keep the momentum going.

"When you get back to the district, there are other things to be done," notes John Hoy, who attended the first Arkansas consortium meeting of Delta districts as a representative of the Monticello School District. He has since accepted a position as the state assistant commissioner for academic accountability. Before his departure for the new job, Hoy says, he discussed what he’d learned from the REL Southwest session with a person at the high school, but "not to the extent that they could utilize it."

"I was gung-ho to get to it, but I didn’t because there were other things that were priority," Hoy says. "We didn’t have the personnel to do all the things that needed to be done. The smaller the district, the more likely it is for you to run into that type of situation."

Barfield acknowledges that the "last mile" of data analysis is the hard part. "When individuals go to any kind of event that inspires their thinking, they come back to their own organizations to face the realities of daily operations," she observes. "For the inspiration to be transformed into action, it requires: 1) leadership on the part of the individual in a change agent role; 2) combined leadership and action on the part of a like-minded team; or 3) scaffolding of leadership learning over time through the aid of external friends, colleagues, and/or consultants who can further empower a site leader to take the right action."

Meanwhile, REL Southwest is exporting the model developed in Texas and Arkansas to Louisiana, where several districts have expressed interest in creating a consortium comparable to the ones in the other two states. "We got something like 34 districts and 10 charter schools telling us they had interest in being part of this," says REL Southwest Executive Vice President of Dissemination Gail Ribalta, "which speaks to the local desire to have that kind of interaction and coming together around a specific need. They want to find solutions that actually work."

Ribalta says she believes district profiles may differ, but "the solutions and the processes are the same." That’s why bringing district people together to learn some basic techniques for working with relevant data points packs such power. "They call it a safe environment. They have the opportunity to work back and forth. You see people sharing and having this open, safe dialogue."

Hoy, now the state assistant commissioner for academic accountability, adds, "You’ll get a consistent way of doing things, so we’re all doing it the same way with the same understanding. When we start to measure out things and put numbers on the table, we’re comparing apples to apples."

A math teacher-turned-administrator, Hoy says he has spent a lot of time in various roles evaluating the efficacy of the investments schools make to improve student outcomes, such as figuring out whether a given software package delivered what the vendor had promised. "You spent $100,000; what positive gain did you get from it? Do we have evidence that it worked? Did it make a difference?" In doing those calculations, he recalls, "I got an answer that I thought worked for me. But I didn’t know if it was statistically valid. It was a ‘John Hoy’ algorithm. If I go to the regional provider working with somebody like REL Southwest, I’ll get a standard that will help me put something more solid in place."
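The kind of back-of-the-envelope calculation Hoy describes can be sketched in a few lines. This is a hypothetical illustration, not his actual "John Hoy algorithm"; the numbers are invented, and as he notes, a defensible evaluation would also need a comparison group and a test of statistical validity:

```python
# Hypothetical sketch of a cost-per-gain check on an instructional
# software purchase. All figures are invented for illustration; this
# omits the comparison group and significance testing a rigorous
# evaluation would require.

def cost_per_point_gained(total_cost: float, n_students: int,
                          avg_pre: float, avg_post: float) -> float:
    """Dollars spent per student per point of average score gain."""
    gain = avg_post - avg_pre
    if gain <= 0:
        raise ValueError("No measurable gain from the spending.")
    return total_cost / (n_students * gain)

# A $100,000 package, 500 students, average score moving from 62 to 66:
print(cost_per_point_gained(100_000, 500, 62, 66))  # → 50.0
```

A number like this answers "what did we get for the money?" only crudely; the point of working with an REL, as Hoy says, is to replace such homegrown arithmetic with a standard method.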

Getting Involved in a Data Initiative
The take-away from educators who have worked with REL Southwest is that it’s very hard for districts to make the best use of data on their own. They need the guidance of an organization that has the capacity to work with data and to share the data analysis efforts with other districts. "[Districts] need a lot of support and tools from vendors and the state and RELs and whoever else can help them, to help them pull that data, organize it, and make sense of it," says Kathleen Barfield, chief information officer at Edvance Research.

If you’re sitting in your district office and are not yet part of some data analysis consortium, here are some concrete steps to take:

1) Contact your regional educational service provider to see what it may or may not be doing in the data realm. ESPs can act as communication vehicles to bring districts together and find appropriate training resources.

2) Every district resides in one of the 10 Regional Educational Laboratory areas. Find out if yours has a data project that you can join. Keep in mind that although the RELs deliver services for free to the districts, the districts have to cover the expenses related to training their personnel.

3) The Data Quality Campaign has begun directing some of its work at the district level. Sign up for e-mail alerts to stay informed about what that national collaboration is doing.

4) Talk to your SIS vendor. If it doesn’t already have a cross-district training and analysis initiative, perhaps you can persuade it to start one.
