On the Road to DDDM

Implementing a data-driven decision-making initiative is a painstaking, six-stage process. Two districts are halfway home.

TWO SCHOOL DISTRICTS-Chicago Public Schools (CPS) and Plano Independent School District in Texas-have undertaken ambitious initiatives in recent years to introduce data-driven decision-making into their operations. Although each has struggled with different obstacles, both have progressed beyond the initial two stages on the DDDM continuum, where issues of planning and funding are fleshed out and purposes are defined, and now reside at the middle rungs, stages 3 and 4, where data is gathered, organized, and interpreted.

In the case of CPS, getting the data into a condition where people would actually believe it-and rely on it-has required massive restructuring of the main applications upon which the district relies. Plano, on the other hand, began to spin its wheels in tool adoption: Should it go with systems that district people would be comfortable using, or should it make a leap of faith?

How the districts chose to resolve these issues would go a long way toward determining in what shape, and with what degree of success, they would move on to stages 5 and 6, where data begins to drive decision-making and the outcomes of those decisions are judged.

CPS: Delayed by Definitions

Five years ago, Chicago Public Schools, the third-largest school district in the country with 426,000 students and 48,000 employees, began the arduous task of replacing a number of aged legacy systems. Student information, human resources, payroll-all critical systems-were, according to district CIO Bob Runcie, 30 to 40 years old.

Those applications, covering the workings of 600 schools, generated a tremendous amount of data, but doing analysis across data stores was difficult. It was all siloed. Only a handful of people were capable of maintaining the software, and a small proportion of potential users ever bothered accessing the programs. The programs' primary purpose was as a compliance vehicle-to collect just enough information to do the required reporting to the state and federal government. That meant that users maintained the data they cared about on individual PCs or ran "shadow" systems to generate the reports they needed for their own schools.

The time had come for upheaval. The opening shot was a new comprehensive student information management system that CPS called IMPACT (Instructional Management Program and Academic Communication Tool). It began with a simple premise: to make people confident that the information they were working with was consistent and reliable. Runcie explains that no data gathering can start until all sides abide by the same language. "Some schools, for example, have different definitions of perfect attendance," he says. "Some may include religious holidays. Some [may not]. And they may define tardy differently."

Until schools agree on what classifies a tardy or an absence, attendance statistics can't be compiled. It took CPS two to three months to decide on the definitions of just those two terms.
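
To see why two terms could consume months, consider a minimal sketch of the problem in Python. Everything here is invented for illustration (the field names, the bell times, the rules themselves); nothing reflects CPS's actual systems. Two schools' local definitions turn the same event log into conflicting counts, and a single district-wide rule restores comparability:

```python
from dataclasses import dataclass
from datetime import time
from typing import Optional

# Hypothetical illustration only: none of these rules or field names come
# from CPS's systems. The point is that the same event log yields
# different statistics under different local definitions.

@dataclass
class ArrivalRecord:
    student_id: str
    arrived: Optional[time]        # None means the student never arrived
    excused_religious: bool = False

BELL = time(8, 0)

def is_tardy_school_a(r: ArrivalRecord) -> bool:
    # School A: any arrival after the bell counts as tardy.
    return r.arrived is not None and r.arrived > BELL

def is_tardy_school_b(r: ArrivalRecord) -> bool:
    # School B: a 10-minute grace period before an arrival counts as tardy.
    return r.arrived is not None and r.arrived > time(8, 10)

def is_absent_district(r: ArrivalRecord) -> bool:
    # One possible district-wide rule: religious observances are excused
    # and do not break "perfect attendance."
    return r.arrived is None and not r.excused_religious

records = [
    ArrivalRecord("s1", time(8, 5)),
    ArrivalRecord("s2", None, excused_religious=True),
]

print(sum(map(is_tardy_school_a, records)))   # 1 tardy under School A's rule
print(sum(map(is_tardy_school_b, records)))   # 0 tardies under School B's rule
print(sum(map(is_absent_district, records)))  # 0 absences under the shared rule
```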

Stage 1: Define the outcomes.

PROJECT FUNDING IS SORTED OUT AND BROAD ORGANIZATION GOALS ARE SET.

The problem of inconsistent, "bad" data reared up publicly in a federal court case involving the district in 2003, Runcie says. "We had our law department, our budget department, research and accountability. We had another area called academic enhancement. They all were presenting data. The judge [said] it's hard to believe we're all working in the same place, because everybody is producing reports on the same data but they've all got different numbers."

The painful process of pinning down and defining data actually had real value, says Runcie. "That's one of the most significant benefits of doing business intelligence, dashboards, scorecards. It's not so much the data that's ultimately presented or what the scorecard looks like. There's value in going through the exercise, working with stakeholders, and thinking through how the organization runs and how it has been defining data."

Once the new systems for managing student information, human resources, and finances were in the works, CPS could exit Stage 3 of its data renovation and move on to Stage 4: giving users access to the information it was collecting so that the data could be cleaned up.

Stage 2: Define the questions.

PARTICIPANTS DECIDE WHAT QUESTIONS THEY WANT THE DATA TO ANSWER AND IDENTIFY THE ORGANIZATIONAL PROCESSES TIED TO GETTING THOSE ANSWERS.

"Data is very much an iterative process," says Raveen Rao, of Diamond Management & Technology Consultants and a project manager for the CPS initiative. "And a guiding principle that we went forward with was, if we don't expose this data to people and show that it's being used, it's never going to get clean."

Last December, the IMPACT team set up a soft launch for a dashboard it had created, allowing principals the chance to access their school's data to evaluate its accuracy. They were told, says Rao, "If it's not right as far as you can tell, let us know and we can work with you to get it right."

The initial release of the dashboard in January to principals, area officers, and central office administrators incorporated static reporting only; users could view a small number of metrics. Future releases will add more metrics, along with the ability to slice and dice the data and run what-if scenarios.

To give those users a taste of what's in store for them, district senior leaders held a "CompStat" meeting with 120 high school principals. Fashioned after a process introduced in the New York City Transit Authority and later adopted by the New York City Police Department as a tool to track and reduce crime, CompStat brings together participants to evaluate computer statistics relating to their specific areas of management. "We basically challenged them to explain why the numbers were where they were at," Runcie says, "and what they were going to do to improve the situations in their schools and get those numbers to improve."

Stage 3: Collect and sort.

THE DATA IS TRACKED DOWN AND ITS QUALITY EVALUATED. THE TOOLS AND APPLICATIONS PARTICIPANTS WILL USE TO WORK WITH THE DATA ARE DECIDED UPON.

He says another CompStat is scheduled for the spring to find out what activities those school principals have put in place in response to the initial meeting. Eventually, Chicago's elementary and middle school principals will go through the same exercise. Runcie says the CompStat sessions are vital: "We need to be able to provide a tool that people rely on and say, 'Hey, these are the data points I know I'm going to have to move on because they're going to help drive performance.'"

As CPS tries to push on to Stage 4 on the DDDM continuum, the decisions it has made in Stage 3 on how best to collect the data are already bearing benefits. Reliance on the new systems is growing, driven by a new district policy that prohibits the use of any alternative system that maintains information comparable to what's tracked by IMPACT. Albeit with some resistance, the policy, established in 2004, has caused data usage to soar. Whereas formerly between 2,500 and 3,000 users accessed data through the legacy software, with the introduction of the new enterprise systems, that number has expanded to more than 30,000 users, mostly teachers, with all of the access occurring through web-based tools.

Plano ISD: Moving Forward Fast

A little more than three years ago, Plano Independent School District, serving 52,000 students, decided it needed to get more out of student assessment beyond the reporting it was used to. At the time, the district was using an application from D2 Data Driven Software (formerly EdSoft). It was at a crossroads: Should it renew the license it had for the software, or was it time to move the district's decision-making to the next level?

As Plano's Jim Hirsch describes, the software would allow users to take data from a variety of sources, create a report, and deliver it online to the teachers to use for creating action plans for their students. "In reality, it was a very static report," says Hirsch, associate superintendent for academic and technology services. "It didn't combine more interesting variables in ways that truly were actionable. It was more of an autopsy piece."

Stage 4: Extract meaning.

PARTICIPANTS GET ACCESS TO THE DATA AND LEARN HOW TO INTERPRET IT.

Decision-making was still an intuitive endeavor because the data frequently wasn't timely, and Plano teachers and principals didn't necessarily know how to analyze it. The district had no way to massage the information it was collecting without creating a new report. In fact, when teachers would go into the system, they would have to poke through 650 individual reports.

So Hirsch pulled together a group of about 70 people-members of his staff, teachers, principals-to help assess alternatives. For the next year, they discussed, as he puts it, "what experience we wanted to have happen in our classrooms." It quickly became apparent that the questions they wanted answered by the data were most critical. Hirsch offers an example: "I have a huge range of abilities in my classroom, but I don't know for certain which students need which type of assistance to get to the level of proficiency that they are going to be tested on this spring."

During that year's time, Hirsch's group met with about 10 vendors in the education assessment space, including some they'd worked with before, such as D2 and Excelsior Software. They were, summarizes Hirsch, "very nice-looking, web-based products, but they were just wrapping on an old thought process."

Hirsch read up on the concept of performance management, in which an organization sets predetermined goals, then monitors progress toward them in a systematic way. "It really began to make sense to me," he says. "That's exactly what we were trying to get into our school district." His team was skeptical that a product outside of education could supply what they were looking for, but Hirsch turned their attention to considering some of the leading vendors in the business intelligence arena.

Stage 5: Take action.

NOW THAT THE DATA IS UNDERSTOOD, OFFICIALS NEED TO DETERMINE WHAT TO DO ABOUT IT.

Meanwhile, SAS, a business intelligence software and services provider, was looking for some innovation in K-12, and, says Hirsch, "We were a school district looking for a vendor that was thinking differently from the others."

Ultimately, SAS beat out the traditional education products because Hirsch's team realized that the analysis of student performance encompassed more than the results of student testing. It also involved asking questions that entailed working with data from the financial and HR sides. Because SAS' software modules aren't limited to student assessment-and, in fact, need to be programmed to work with it specifically-the district can integrate data from these other areas in order to derive new insights, such as what sort of impact teacher credentials and training experience have on learning in the classroom, or what impact salary has on teacher retention.

Then there was the money angle. "We weight certain students in terms of the dollars given for those students based on their at-risk characteristics, the courses they take," explains Hirsch. "We said, 'What if we could bring in some of that financial piece?' All of a sudden, you can do a program cross, a salary cross. As we look at teachers, what's it costing us to get a student to move 10 RIT points?" (The RIT, or Rasch Unit, scale charts students' yearly academic growth.)
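
As a rough illustration of the kind of salary cross Hirsch describes, the Python sketch below joins hypothetical assessment growth with hypothetical payroll figures to estimate dollars per 10 RIT points of growth. Every name and number is invented; the district's real analysis runs through its SAS models, not anything this simple:

```python
# Invented data for illustration only -- not Plano's actual figures.

# Fall-to-spring RIT growth per student, keyed by teacher (assessment side).
rit_growth = {
    "teacher_01": [8, 12, 10, 9],
    "teacher_02": [14, 11, 13],
}

# Annual salary from the HR/finance side (hypothetical numbers).
salaries = {
    "teacher_01": 52_000,
    "teacher_02": 61_000,
}

for teacher, growths in rit_growth.items():
    avg_growth = sum(growths) / len(growths)
    # Salary dollars spent per 10 RIT points of average student growth.
    cost_per_10_rit = salaries[teacher] / avg_growth * 10
    print(f"{teacher}: avg growth {avg_growth:.1f} RIT, "
          f"${cost_per_10_rit:,.0f} per 10 RIT points")
```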

Although SAS' modularity offered a more flexible approach to analytics not provided by the other vendor candidates, Hirsch's team realized it would take longer to get the new software running. That was because the company lacked the models-descriptions of the programming logic that represented the analytics-for K-12 that it had in other segments, such as banking or financial services. The district would have to build those models itself.

Stage 6: Evaluate outcomes and modify as needed.

THE DISTRICT ASSESSES HOW EFFECTIVE ITS EFFORTS HAVE BEEN.

Hirsch has a psychometrician-a testing expert-on staff, so he knew the district could handle defining the models. What it couldn't do was program them. For that, it relied on SAS professional services developers. Two SAS employees came in for about three weeks during the initial stage of the project, but everything after that was handled remotely.

In those early days, the pressure was on. The EdSoft application license had expired, so Hirsch's assessment group cranked out spreadsheets manually to deliver the data that teachers and principals needed.

The district spent 18 months getting the SAS Enterprise Intelligence Platform configured and developing the base models it would use for student assessment analysis. What the district didn't know was how well the software would scale to the number of users Hirsch expected to be accessing it: all 4,000 teachers and principals. During an initial workshop, attended by 90 people-primarily principals in the district-there was, says Hirsch, "a magnificent crash" within five seconds of going live. He blames the configuration of the Apache-based server brokering requests to the database, which collapsed under the weight of so many simultaneous requests.

Hirsch says SAS wasn't accustomed to installations where more than several dozen people had access to the analysis tools. "The difference was, we're having all 4,000 of our teachers access the real SAS tools-not reports, but live, derived-from-models results.... SAS had never scaled a customer to this size."

At that point, Plano decided to take more ownership of the project. Hirsch assigned a network engineer on his staff to develop a more scalable application, while a transitional architecture was put in place that allowed users to work with the system, "but not to the degree we wanted," he says.

Taking the First Step

JIM HIRSCH, ASSOCIATE SUPERINTENDENT with the Plano Independent School District (TX), who is leading the district's journey to data-driven decision-making, says the key to a successful DDDM effort is starting with the questions, not the software: "Identify the most critical questions you need to answer about your students to help improve their achievement, then find a system and a process that will allow you to answer them. Don't start collecting data, don't start buying systems, until you have a good idea of what questions you are trying to answer."

In the fall of 2006, the new system finally went operational. Now it handles about 4,000 requests a day.

The program, which has a portal on the front end, is role-based. When a teacher logs in, for example, the portal retrieves the data to build a class perspective for that user, which includes all the information available about the students in the class. When principals log in, they see additional tabs within the portal, which enable them to view adequate yearly progress and Academic Excellence Indicator System (AEIS) scores for a given campus, as well as other related charts; they can also drill down to individual classrooms. (The AEIS is what Texas calls its set of reports that gathers information on the performance of its students.)
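
A minimal sketch of that role-based behavior, with hypothetical function and tab names (the actual portal runs on the SAS Enterprise Intelligence Platform):

```python
def build_portal_tabs(user_role: str, campus: str) -> list:
    # Hypothetical sketch: every role sees the class-level view with all
    # available student data.
    tabs = ["My Classes"]
    if user_role == "principal":
        # Principals also get campus-wide accountability views and can
        # drill down from there to individual classrooms.
        tabs += [f"AYP: {campus}", f"AEIS Scores: {campus}",
                 "Classroom Drill-Down"]
    return tabs

print(build_portal_tabs("teacher", "Plano East"))
print(build_portal_tabs("principal", "Plano East"))
```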

Although Plano's data management efforts were roughly at the same stage as Chicago Public Schools', its focus wasn't on data quality so much as tool selection. Users had a certain level of comfort with the data being produced by current systems, so why make changes? The answer: to squeeze more out of that data.

Looking Ahead

While both Plano and Chicago are pushing to advance from Stage 3 on the DDDM continuum to Stage 4-from gathering data to extracting meaning from it-Plano already has its business analysis software in place and a willing group of participants to work with the tools. CPS is seeing slower progress in sorting out its data collection issues even while users have begun to put more faith in the reporting provided by the new system. As each district progresses, it will face new challenges discerning what data is relevant, addressing tolerance for change among users, and figuring out how to respond now that data is driving its decision-making.

:: WEB EXTRAS ::
For more information on data-driven decision-making, visit www.thejournal.com. In the Browse by Topic menu, click on Data Management.

Dian Schaffhauser is a freelance writer based in Nevada City, CA.

This article originally appeared in the 04/01/2008 issue of THE Journal.