ISKME Special Series Part 2: Data Use and School Reform


T.H.E. Journal, in partnership with the Institute for the Study of Knowledge Management in Education (ISKME), prepared this special four-part report on data-driven decision-making.

New technologies have made data gathering easier than ever. Educators now must confront one big question: How can the data be used to improve student achievement?

IN A TIME OF WIDENING achievement gaps, accountability mandates, decreased budgets, and high turnover of school leaders, K-12 educators have grown determined to substantiate their decision-making with hard data. This new attention to data coincides with the advent of improved information technologies, faster Internet connections, and easier-to-use applications and interfaces, which are supplying administrators with mounds of data on all aspects of the education enterprise. But the questions being raised in public forums and in private corridors reduce to this: What problems does the data help us solve?

Those on the cutting edge of these developments are asking themselves and others the tough questions. Education leaders are trying to better understand how to reform their systems, districts, and schools to improve student achievement. “We can’t continue to design our schools in a way that leaves some kids falling through the cracks,” says Gregory Peters, the co-principal at Leadership High School in San Francisco. “We have to personalize and know our students well.”

With the current emphasis on statewide, districtwide, and schoolwide testing and assessment, there is no shortage of data out there. A growing field of consultants, vendors, and advisors is helping districts and schools determine what data to collect, how to provide confidential access to it, and how to make the kinds of meaningful inferences that will ultimately bring results.

Making Sense of It All

With so much data circulating, the first questions are these: What should we be keeping track of, and what do we do with the data once we have it? And how do we tell good data from bad?

“It’s not about whether or not data is a good thing,” says Sandra Stein, CEO of the New York City Leadership Academy. “It’s about paying attention to what we observe. When people ask, ‘What does the data show?’ they are often referring to test scores only.” The NYC Leadership Academy trains both aspiring and new principals in New York City schools on how to slice and dice data to get beyond test scores. “We use data to teach school leaders how to be instructional leaders and better managers,” says Stein. “They learn how to look at several indicators of student performance, such as teacher practice and curricular fit, at very granular levels.”

So that her participants understand what kinds of data are relevant and what to take from them, Stein provides them with a simulated environment at the very beginning of the program. “They have school report cards, examples of teacher practice, written observations, teacher files, student work, floor plans, videotapes of teachers,” she says, “and people come in to role-play various school community members. So the ways in which data and information come to them can be very different.

“They have to look at all data sources and determine where there are patterns. Essentially they are being taught to think, Where is the evidence? We’ve taught them about triangulation, mental models, and the ladder of inference, and how you should stay on the lower rungs looking for disconfirming evidence.”

Turning Data Into Action

For principals such as Leadership High’s Peters who find themselves awash in data, the task is then to do something constructive with it. “Data should be used to keep schools accountable,” Peters says. “This means we use data as a tool, to plan our next actions. When a school says, ‘This is our mission,’ we use data to show how we are moving toward that mission. We don’t design a school around a test, but around what we’ve learned from the data, as well as the school’s promise to the community. Rather than driving change, data should be used to support and monitor change.”

David Silver, principal of Think College Now (CA), a small elementary school developed in partnership with the Oakland Unified School District, directs data use and professional development at his school. His plan is in effect before the school year even begins. First, a group of teachers gets together at an August retreat, looking at a variety of data from the previous year. They look at where they did well, where they met their goals, and where they didn’t—and why. They look at parent- and student-satisfaction surveys, and then craft a set of action steps for the coming year.

At Back to School Night in September, the school and its community reflect on their progress from the year before and set a course for the new year. A few months afterward, at a midyear retreat, a data coach convenes with teachers, and together they look for patterns in student achievement in language arts and math, and discuss the resources they need for students who need extra help in meeting grade-level goals. Meanwhile, during the year, a group of involved parents present accountability data to the other parents, making the data accessible and comprehensible to all. Finally, in June, at a schoolwide “accountability event,” the community comes together to share its updated progress and its plans for the future. And thus the cycle continues.

Silver says the key to using data to drive high levels of student achievement is personnel. “Make sure you have the right teachers there, people who have high expectations,” he says. “Let them know that it is all outcomes-based, and that they will be judged on their results. There needs to be 100 percent engagement in terms of constant feedback between student and teacher, with clear objectives.”

Critics of this intensive use of data and sense-making processes argue that the strategy simply doesn’t scale. Not so, counters Peters. “The way that I leverage leadership in a school of 400 is to work in small groups with a data-based inquiry methodology,” he says. “We identify leaders and develop learning communities. In this way, an entire school can look at the data and get the pulse of the school.

“It’s the responsibility of the leaders to maintain coherency,” Peters explains. “For example, if we look at assessment data in June and we find an achievement gap for students who come from homes where English is not the primary language, for the next year our professional development will focus on developing the skills, knowledge, and ability to support these students.”

But are schools structured in ways that allow time for this type of inquiry? “Principals are running all day,” says the Leadership Academy’s Stein. “They barely have enough time to sit down and really look at data. They have the orientation to do it, but not the time. So if you can develop a team that can do this, then you have developed the capacity of the organization so that it is not just relying on one person.”

Getting Teacher Input

Tania Gutierrez, an education consultant and data coach in Oakland, CA, thinks it’s important to build a professional framework around data use. One district she works with starts by deciding when assessments will be given and putting those dates on a calendar. “We then build professional development activities around an action research cycle,” she says. “We figure out what time we need to set aside based on when teachers will be doing their planning, and we structure that time in order to bring in their input.”

However, this strategy requires schools to work in conjunction with their district office. Teachers often report that they administer the tests and submit the data to the district, but sometimes wait for results past the time when they would have been useful. “Timing is everything,” says Gutierrez. “If you plan all of your professional development around a certain set of data, and then the district doesn’t meet that deadline, you’re unable to carry out your scheduled work.”

As with any other group of professionals, teachers are more likely to understand and accept evidence that shows where they could be more effective in their work if they are themselves a part of the process of inquiry; their involvement is key. Yet participation in a cycle of inquiry also needs to lead to a meaningful change in practice in order for positive action to take place. In other words, data needs to be used not only to support school reform, but also to monitor and improve practice.

For example, student assessments are more useful if you are able to look at item analysis to see what the item is actually measuring. “It’s not just about what the kids’ scores are,” says Stein, “but what those scores mean for instruction. What does this mean about my teachers’ ability to teach?”


Changing Assumptions

Initially, teachers often resist the request to look at data, but there are proven methods for building activities that foster trust in data, and for organizing data so that people can engage with it more easily.

“As a facilitator and a data coach, I always assume that teachers don’t value the tests,” says Gutierrez. “Therefore, I offer sessions about discovering what the data says. For example, after looking at student outcomes on a math assessment, one teacher might say, ‘I know that I really like to teach fractions, and I can see that I’m good at it.’ Those people start to trust data. It becomes intuitive. You also might look at something that your students aren’t as successful at, and ask yourself, ‘What is the practice that will help me do this better?’ You can try to do it, and then go back [to the data] and see how you did.”

Ultimately, data can be used to help change the assumptions of teachers—particularly those who may not believe in the capacity of children in lower-achieving schools. Gutierrez says that early in one school year, she had teachers at one school group their students into low, medium, and high achievers. She then had them look to see how well their groupings matched the students’ actual test results. “They saw that there were students who they thought were high achievers but actually did poorly, yet they made excuses for them, such as assuming that those students were not good test takers,” Gutierrez says. “For the group of students who performed better than the way the teachers had grouped them, they immediately said, ‘Well, those students must have guessed.’ This is often the starting point for me.”

Gutierrez works with the teachers over time to see if those students continue to test against type. “This is how you get teachers to question their assumptions and change their expectations. To me, anything they’ve learned counts as an action.”

Facing Some Hard Truths

Data use is not necessarily a feel-good activity, at least not at first. Hard truths are learned, often eliciting emotional responses. “We might not feel good when we look at this data, so we use guidelines to avoid falling into data pitfalls,” says Peters. For example, “throughout the data-based inquiry process, we describe what we see and surface assumptions.”

Gutierrez believes that ultimately there is much to gain when data challenges entrenched assumptions and expectations. “In the long run, it makes you more nimble, able to get there more quickly and to question assumptions to find out what is really happening,” she says. “A lot of resistance comes from fear, and fear of failure. On some level you have to know. You have to blame yourself or your kids. The teachers who blame themselves end up leaving; the ones who blame the kids stay. That’s why people are demoralized. As a facilitator, you acknowledge that people don’t feel good. So the first thing is to make them feel successful about something. I’ve seen data that’s really low. So the slice you take is, What are their strengths as teachers? What do they like to do? What fuels their curiosity? What did they like to do when they first began teaching?”

There is a tipping point, though, in convincing teachers to examine the impact of their own teaching practices. It arrives, Gutierrez says, when teachers see that something they felt intrinsically “is down on paper in a concrete way. I think for some teachers, a lot of what they do feels like magic. They often say to me, ‘We don’t know why it worked.’ To actually know why a lesson worked and how to repeat it is extremely validating.”

The Bottom Line

Widespread attention to school accountability, driven by state and federal mandates, has increased the stakes for all schools. These mandates have made more data available to educators at all levels, but they have not been as effective in encouraging teachers to use student achievement results to undertake the kinds of analysis and self-reflection that can lead to improved teaching and learning. That work must be done in the schools. The schools must engage teachers in analyzing student performance in relation to their objectives for their next lesson, their next semester, their next year.

According to Peters, the process of analyzing data and identifying teaching objectives is “essential to whatever work or reform we are doing. It’s the means by which we ask ourselves not just how we are doing, but how we improve; not just how we did, but what we need to do. Each year the data could look very different. Our expectations for our students might remain stable, but our teachers, students, and curriculum might change. How do we know what to work on, where to focus? Data is the way it’s done.”

Lisa A. Petrides is president of ISKME (www.iskme.org).

This article originally appeared in the 03/01/2006 issue of THE Journal.
