Data-Driven Decision-Making: Mission Accomplished


Two districts, one goal: to use data to drive decision-making. Returning to them nearly a year after our last visit, we find both have achieved breakthroughs, discovering new ways to draw insights from student performance, then stepping in to improve it.

When we last left Chicago Public Schools (CPS) and Texas' Plano Independent School District in April 2008, both school districts were immersed in the long and labored advance toward becoming a fully matured data-driven operation, one that uses sophisticated technologies to gather data, sort and interpret it, and ultimately use it to pursue actions that yield higher academic achievement.

We noted then that each district was certain to "face new challenges discerning what data is relevant, addressing tolerance for change among users, and figuring out how to respond now that data is driving its decision-making." Coming back to them now, we find that in the past year both have broken through to a more developed, thoughtful, and consequential use of data, uncovering ways of looking at it that they had not considered before and adopting technologies that give them those capabilities.

Both districts have found the starts and stops along the way to be not setbacks, but learning opportunities, enabling them to fill holes in their data with additional variables that create a more accurate portrait of students' abilities and needs.

As the two districts press ahead with the business of translating data into action, evaluating the effects of those actions will determine a new set of best practices-- as well as a new set of failed ones. But what's important is the fundamental change the use of data has brought to the school culture. "Our conversations have changed," CPS CIO Bob Runcie says. "There's no longer speculation and guesswork about what works and what doesn't. Now stakeholders look at the facts."

Plano: Three Easy Pieces

The way Jim Hirsch, Plano's associate superintendent for academic and technology services, explains it, the stages of a data-driven decision-making project are not linear but oval, always leading back around to the start. Even as his own district has reached the end game-- taking action-- of a DDDM initiative that began in earnest in 2005 with the launch of its data management tool from SAS, it has kept a foot in the preceding stages, determining new data points to collect and how best to use them to drive student achievement. "Our new learning comes from asking new questions," Hirsch says.

As the initiative evolves, Plano continues to find more passageways in the data, incorporating new metrics into the SAS tool to both broaden and deepen the picture the system delivers of the individual student, classroom, and school. "In each of the last three years, we've had a significant new component brought into our conversation about student performance," Hirsch says.

In the first two of those three years, the district began providing teachers with glimpses of their students they had never had before. A key discovery was that long-used demographic variables such as economic status had less to tell educators about a student than a newer metric the district was gathering: cognitive ability. Plano began administering the Cognitive Abilities Test (CogAT), from Riverside Publishing, to its students to test their reasoning skills. "We started talking about student performance beyond just a state test," Hirsch says.

"Our teachers [know] what a student's learning needs are, besides just, ‘Here are last year's results on a state test and here are their grades from last year.'"

The district now had a new angle on its students, as it associated the CogAT scores with scores on the state's high-stakes test, the Texas Assessment of Knowledge and Skills (TAKS), which Hirsch says are not always in sync. It allowed the district to identify a student with a low cognitive level who nonetheless scored at proficiency on the state test. "You start putting those two pieces together, and you [realize] this is a student we want to make certain that we are paying attention to," he says, "because right now they're being successful, but they're being successful for a variety of reasons, one of which may be just pure hard work, but the other one may be by accident we were giving them the right instructional program without realizing it."
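
As a rough illustration of the cross-referencing Hirsch describes, here is a minimal sketch in Python, assuming hypothetical score tables and cutoffs; the district's actual schema, cut scores, and tooling are not described in the article:

```python
import pandas as pd

# Hypothetical score tables; column names and cutoffs are illustrative only.
cogat = pd.DataFrame({
    "student_id": [101, 102, 103],
    "cogat_score": [88, 112, 95],            # CogAT standard score
})
taks = pd.DataFrame({
    "student_id": [101, 102, 103],
    "taks_scale_score": [2150, 2300, 2050],  # TAKS scale score
})

LOW_COGNITIVE = 90       # illustrative cutoff for "low cognitive level"
TAKS_PROFICIENT = 2100   # illustrative proficiency cut score

scores = cogat.merge(taks, on="student_id")

# Students who are proficient on the state test despite low measured
# cognitive ability -- the group Hirsch says deserves close attention,
# since their success may rest on hard work or an accidental fit
# with the instructional program.
watch_list = scores[
    (scores["cogat_score"] < LOW_COGNITIVE)
    & (scores["taks_scale_score"] >= TAKS_PROFICIENT)
]
print(watch_list)
```

In practice a flag like this would feed the district's reporting tool rather than a console printout, but the join-and-filter logic is the same idea.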

Still, Hirsch says, more pieces were needed. The district wanted to be able to estimate learning growth going forward, not just chart it in the past. So Plano added a third measure to the SAS tool, a formative testing element: the Measures of Academic Progress (MAP), from the Northwest Evaluation Association, which is given periodically throughout the school year to track student learning over time.

Armed with the data from the three assessment measures, the district was able to derive yet another statistical metric that it could provide in a visual format to teachers and principals: "projected learning growth," as it relates to achievement in core content areas for each student. The new metric was launched last spring and made available to parents through a newly created SAS parent portal. Hirsch explains that principals must use it to identify, in early fall, students who are at risk of not meeting proficiency standards on the state test. Interventions are expected to be put in place, for which the principals are held accountable during the evaluation process: Did they work, or did they show no quantifiable benefit?

"Once the state assessment is finished, now you can compare: Here were the projections, here were our interventions, here are the results," Hirsch says. "If the results are better than the projections, then you were applying the right interventions. If the results are equal to the projection, nothing you did seemed to help that much. And obviously if the results are worse than the projections, then the interventions you selected were counterproductive."

The evidence thus far suggests the actions Plano educators have taken in response to data have been a distinct success. The district attributes the dramatic rise in proficiency scores on the state test in the last two years to personalizing instructional strategies for students whom the data showed had grade-level cognitive skills but were scoring below grade level on the TAKS. That group of students in the eighth grade, for example, had passing rates of 54.3 percent in math and 61.3 percent in English on the 2006 assessment. On the 2008 TAKS, those proficiency levels were up to 82.6 and 93.6 percent, respectively.

"We're moving in the right direction," Hirsch says. "Our teachers say they feel more confident in their classrooms, knowing what a student's learning needs are besides just, ‘Here are last year's results on a state test and here are their grades from last year.'"

Hirsch says that the desire to add more metrics to the tool, or to mix and match existing variables in new and untried ways, is sparked by each fresh batch of results.

"As you begin to learn your tools and your data more completely, you begin to realize questions that you hadn't thought of earlier are legitimate questions," he says, "because you're now combining variables and you now have access to variables that you didn't realize before."

Even now, Hirsch is imagining new ways to mine the data for the sake of students and teachers. He thinks the next frontier is connecting students' summative and formative test scores. "Can we bring this into the tool and begin modeling something that would make sense and be beneficial to our teachers?" he asks.

"Those are the kinds of questions that continually come up each year as we look at, here's what we now know, but boy, wouldn't it be nice if we could..." He lets the sentence fall off, as if running through the many possibilities for completing it, he's unable to land on just one.

CPS: Slicing and Dicing

"I just clicked on one," says Runcie, Chicago's CIO, diving feet first into a data point on the district's dashboard. "I'm looking at the attendance rate for a particular week. I see it's 96 percent. I can go in here, student by student, by grade level, and see the number of days absent since the start of school. That's something I may want to track, because if kids aren't in school, it's a problem."

One click begets a second and a third, as Runcie drills deeper into the data to demonstrate how granular it gets, revealing different aspects of a school's attendance rate.
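
Under the hood, a drill-down like Runcie's is successive aggregation and filtering of the same records. A minimal sketch, with an invented schema standing in for the district's actual attendance data:

```python
import pandas as pd

# Hypothetical daily attendance records; the schema is invented
# for illustration, not CPS's actual data model.
attendance = pd.DataFrame({
    "student_id": [1, 1, 2, 2, 3, 3],
    "grade":      [9, 9, 9, 9, 10, 10],
    "week":       [1, 1, 1, 1, 1, 1],
    "present":    [True, False, True, True, True, True],
})

# Top level: attendance rate for the week (Runcie's "96 percent" view).
weekly_rate = attendance.groupby("week")["present"].mean()

# One click down: rate by grade level.
by_grade = attendance.groupby(["week", "grade"])["present"].mean()

# Deepest level: days absent per student since the start of school.
days_absent = (~attendance["present"]).groupby(attendance["student_id"]).sum()

print(weekly_rate, by_grade, days_absent, sep="\n\n")
```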

The unveiling of the dashboard last January was a milestone in Chicago's DDDM initiative, which had begun five years earlier as an effort to modernize the critical systems-- student information, human resources, payroll-- of the nation's third-largest district. A massive overhaul was in order, one that would integrate the operations of 600 schools and bring all the data produced by those campuses under one roof-- and one language. As Runcie explained last year, there was no consistency to the data. In the early planning stages, it took CPS months to formally define the distinction between a tardy and an absence.

Nor was the data clean, which is to say reliable. In the world of data, cleanliness is next to timeliness. "You can't have a kindergartener who was born in 1950," Runcie says. The release of the new dashboard remedied those ills, broke down the silos, and pulled the data into one location. "It's one-stop shopping for folks to get to the core data they need to look at to figure out what to do to drive performance."
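
The plausibility check behind Runcie's kindergartner example is straightforward to sketch. The age band and schema below are assumptions for illustration, not CPS's actual validation rules:

```python
import pandas as pd

# Hypothetical student records with a deliberately bad birth date.
students = pd.DataFrame({
    "student_id": [1, 2],
    "grade":      ["K", "K"],
    "birth_date": pd.to_datetime(["2003-09-15", "1950-01-01"]),
})

AS_OF = pd.Timestamp("2008-09-01")
age_years = (AS_OF - students["birth_date"]).dt.days / 365.25

# Flag kindergartners outside a plausible age band (rule is illustrative).
bad_rows = students[(students["grade"] == "K") & ~age_years.between(4, 7)]
print(bad_rows)  # the 1950 birth date surfaces here for correction at the source
```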

But even as it propelled CPS to the next stage of its data operations, the new tool had its shortcomings. The metrics it showed were too few in number, narrowing the window the tool provided on student performance. It was back to the dashboard for the district, for more strategy meetings and focus groups to help select new data points to incorporate. "It was a lot of work," Runcie says. "Well over 100 metrics were identified."

The result is a revamped tool, set for launch this month, whose functionality is underpinned by Microsoft technology. The company's Office PerformancePoint Server provides users with metrics "that matter most to the organization and users in terms of driving performance toward specified goals," Runcie says.

"Last year, you had data points that showed you where you are today, but not how you have progressed in relation to the last couple of years, or compared to other schools. This year you're able to make those kinds of comparisons. You see more graphics, more trending." The expanded dashboard is more able, he says, "to slice and dice information," one piece opening onto the next.

"One of the things we can do now is show the number of freshmen who are not on track to graduate. If you drill down, we can also show you how many students have D's and F's. If you click on that, it will identify the specific students in your school, and their teachers. That's a metric we didn't have before."

The enhanced granularity gives administrators actionable data. And that, Runcie says, is where the difficulty arises. "The most challenging piece of working with data is figuring out what to do with it. What kinds of interventions will you put in place to improve the outcomes in your schools?"

A key step is making sure the tool gets used, which, Runcie says, requires some sensitivity. "We went to great lengths to position it not as a performance evaluation tool, but as a resource that principals can go to for information to help them better manage their schools. We're advising AIOs [area instruction officers], don't look at the dashboard and call the principal and say, ‘Hey, what's going on with this metric here?' or base their evaluation on it. We're trying to be careful about the kinds of conversations that exist around it." The district's latest research shows that more than 90 percent of principals have used the dashboard; 70 to 80 percent view the data at least once a week.

Runcie says the system hasn't been in place long enough to judge its impact, but he believes its success is dependent on the users. "Improving student achievement is going to be a function of how teachers and principals at the school level who work with the students utilize that data."

The release of the tuned-up dashboard this month doesn't signify a culmination of CPS' DDDM project, says Runcie. Like Hirsch, he says that any district interested in positive outcomes is constantly cycling back to Stage 1: determining what questions it wants the data to answer. "You never finish that stage," he says. "If you're a learning organization, you're always looking at the data from different perspectives and gleaning new insights into what you need to do. We continue to add new metrics to the tool every quarter as we build and continue on."

Going forward, Runcie hopes to take measures to ensure the data his educators are using is pure. "How clean and timely and accurate is the information that's being input at the source? I'm working on developing a measure that can capture that, and then feed that information back to the principals."
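
Runcie doesn't describe the measure he has in mind, but purely as an assumption, a per-school score blending cleanliness (records passing validation) and timeliness (entry lag) might look like this:

```python
import pandas as pd

# Hypothetical record-level audit log; fields and weights are assumptions,
# not CPS's actual measure, which the article does not describe.
audit = pd.DataFrame({
    "school":         ["A", "A", "B", "B"],
    "passed_checks":  [True, True, False, True],  # cleanliness
    "entry_lag_days": [0, 1, 5, 2],               # timeliness
})

quality = audit.groupby("school").agg(
    clean_rate=("passed_checks", "mean"),
    avg_lag=("entry_lag_days", "mean"),
)
# One blended score per school, to be fed back to principals.
quality["score"] = quality["clean_rate"] * 100 - quality["avg_lag"] * 2
print(quality.sort_values("score", ascending=False))
```

The weights here are arbitrary; the point is a single number a principal can track over time.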

"No organization is ever going to be able to survive and be effective without having good data to guide its journey and what it's doing. We have provided that. Now how we actually use that information is going to be the big question."

:: WEB EXTRAS ::
For more information on data-driven decision-making, visit our website at www.thejournal.com. Enter the keywords Data Management.

Jeff Weinstock is executive editor of T.H.E. Journal.

This article originally appeared in the 02/01/2009 issue of THE Journal.
