Business Intelligence :: Magnum B.I.
        
        
        
Using powerful business intelligence tools, two districts are drilling down to the finer points of student data, where the most revealing insights lie.
“When marketing and sales really work together, you get a very powerful system,” Dave Chiszar explains, drawing a parallel between business and education. “In education, when curriculum and assessment work together, you leverage some really good things.”
At the Naperville Community Unit School District 203, outside Chicago, where Chiszar is the director  of assessment and quality analysis, really good things are the norm. One of the five largest districts  in Illinois, Naperville has 19,000 students, and more than 90 percent of them meet or exceed  state educational standards, according to Chiszar.
How have they done it? Through what Nicole Engelbert, a senior analyst in the technology business  unit at Datamonitor, a market research and analysis firm, calls perhaps the strongest implementation  of business intelligence (BI) tools she knows of in K-12. Naperville employs a sophisticated  data warehouse, mining, and statistical analysis software from SPSS to track how its students  are performing on ongoing assessments. Through incremental benchmarking and yearly testing,  Naperville’s principals and administrative team can see how students are progressing toward standards,  or whether a particular student’s performance is falling off.
Engelbert says the district uses SPSS not simply to check on student performance in the previous  month, but to ask sophisticated, forward-looking questions such as: Given what we know about how  we did last month, are we or aren’t we on track in terms of adequate yearly progress? How much  improvement do we need to make this week in order to stay on track?
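To make that forward-looking question concrete, here is a minimal sketch of a linear on-track check, written in Python rather than the district’s SPSS environment; the benchmark counts, percentages, and target below are invented for illustration.

```python
# Hypothetical sketch (not Naperville's actual SPSS workflow): given results so
# far, how much improvement is needed per remaining benchmark to hit a target?

def gain_needed_per_benchmark(current_pct, target_pct, benchmarks_done, benchmarks_total):
    """Percentage-point gain needed on each remaining benchmark to reach the year-end target."""
    remaining = benchmarks_total - benchmarks_done
    if remaining <= 0:
        return 0.0
    return max((target_pct - current_pct) / remaining, 0.0)

# Example: 74% proficient after 4 of 6 benchmarks, starting the year at 70%,
# with a year-end target of 82% proficient.
start, current, target = 70.0, 74.0, 82.0
done, total = 4, 6

needed = gain_needed_per_benchmark(current, target, done, total)
observed = (current - start) / done  # average gain per benchmark so far
print(f"Need {needed:.1f} pts per remaining benchmark; averaging {observed:.1f} so far")
print("On track" if observed >= needed else "Falling behind")
```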
That’s the appeal of business intelligence tools, which offer schools the ability to look beyond a  routine statistic, such as what percentage of students have passed a given test. Through data analysis,  schools can view specific scores for a select group of students, for example, and compare that  data to other groups, classes, or teachers. That, Engelbert says, is the kind of information the No  Child Left Behind Act is after, and the kind that “BI fundamentally offers—the ability to drill down into  your reports.”
Using Assessment to Build Curriculum
Even before NCLB came into being in 2002, mandating that school districts show adequate yearly progress (AYP), Naperville understood the need to gather student data for use in curricular development. Roughly seven years ago, the district began following a 10-point quality-improvement process for K-12 called Standard-Bearer, a set of standards developed by the Schlechty Center for Leadership in School Reform that provides school districts a framework by which to continuously measure and boost the quality of instruction and level of student engagement.
Naperville began applying aspects of the Standard-Bearer framework to the curriculum review process, while at the same time using a template called Understanding by Design to help build its curriculum and units. UbD, the brainchild of East Coast-based educators Grant Wiggins and Jay McTighe, is a structure for improving curriculum and instruction that asks questions such as: What aspect of this learning will still be relevant to the student in five years? How do you know students are learning what you expect them to? What will carry through to other layers of the curriculum and to the bigger picture? By posing those sorts of questions, combined with the quality-improvement process inherent to Standard-Bearer, Chiszar says the district began to change how it designed its assessments.
              When assessment feels separate from curriculum, you have a problem.
              — Dave Chiszar, Naperville Community Unit School District 203
“You get into this whole idea of assessment,” Chiszar says, “and how to write a good assessment—how to collect the data; how to understand if your authentic assessments are measuring things well; how to understand if [the curriculum] is working for the individual students.” As the Standard-Bearer and Understanding by Design precepts were implemented, fine-tuned, and then applied to developing assessments, a growing awareness of the need for good data developed. Looking back, Chiszar credits the 10 points in the Standard-Bearer template with changing thinking and procedures throughout Naperville. “That was the foundation piece,” Chiszar says.
Though Naperville was ahead of the curve on data gathering, other districts, according to Datamonitor’s Engelbert, have been driven by NCLB’s requirement for detailed reporting to begin looking to data-analysis products and tools as a solution. “There’s a clear alignment between what BI does and what NCLB wants districts to build their reports on,” she says.
Typically, the adoption of BI follows about 18 to 24 months  behind the introduction of an enterprise resource planning  solution, Engelbert says. That’s partly because a new ERP or  student information system shows staff and teachers what’s  possible; users then start thinking in new ways about data. As  users move from the transactional reports that ERP and student  information systems produce, the next step is analytics:  getting insights and information from the data the ERP solution  has made available. Engelbert says, “That’s when the conversations  about BI start to emerge.”
Naperville selected and installed SPSS a little more than a year ago, moving up from the company’s desktop products already in use. Prior to that, Chiszar says, the district simply “outgrew” Microsoft Excel and started doing some more sophisticated things in an SPSS desktop program for Windows that handled statistical analysis. Users would then shift the data back into Excel for storage. “But we were creating spreadsheets that were 130 megabytes, using up literally every column and almost every row of data.” It became clear that the district needed a data warehouse and better tools. Enter the SPSS system, which serves as both a data warehouse and a data analytics tool.
“The district has used data for a long time,” Chiszar says, explaining the evolution of Naperville’s use of data to shape its curriculum. “Curriculum and assessment are tied closely together here.” As a curriculum is being developed, assessments are developed at the same time to help administrators learn how students are performing against the curriculum. “The district learned a long time ago that you have to measure it to understand it,” he says.
To help teachers with assessments and curricula, Chiszar  explains that he and his staff of two work to turn useful statistics  into “easy-to-read graphs that help answer the questions  that teachers want answered.” The graphs inform teachers as  to what their students know right now compared to what they  don’t know, and what teachers need to help them learn. The  visuals also let teachers know which students are on pace,  which are falling off, and which are racing ahead—and what  exactly that pace should be.
In addition to the window it provides into student progress, the software can be used to drill deeper into data. For example, in analyzing the results of a test, the system can present colored buttons that indicate overall performance on each question, along with question difficulty and response rate. But by drilling down, a user can see not only which students answered a question correctly, but also which answered incorrectly—a sophisticated indicator of question validity. This enables a teacher to get a report on any individual student.
“You can spend hours on any one test, clicking down to each  individual student and looking at who answered what,” Chiszar  says. The system can also generate statistics such as which questions were consistently answered wrong; that information can be  used in redesigning the test.
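As a rough illustration of that kind of item-level drill-down (not the SPSS system itself), the short Python sketch below computes per-question difficulty from a hypothetical table of student responses, flags questions that were consistently answered wrong, and lists the students who missed a given item; the data and column names are assumptions.

```python
# Illustrative item analysis on a made-up responses table:
# one row per (student, question) pair with a 'correct' flag.
import pandas as pd

responses = pd.DataFrame({
    "student":  ["Ana", "Ana", "Ben", "Ben", "Cam", "Cam"],
    "question": ["Q1",  "Q2",  "Q1",  "Q2",  "Q1",  "Q2"],
    "correct":  [True,  False, True,  False, False, False],
})

# Per-question difficulty: share of students answering correctly.
difficulty = responses.groupby("question")["correct"].mean()

# Questions that were consistently answered wrong (candidates for redesign).
suspect_items = difficulty[difficulty < 0.3].index.tolist()

# Drill down to the individual students who missed a given question.
missed_q2 = responses[(responses["question"] == "Q2") & (~responses["correct"])]["student"]

print(difficulty)
print("Items to review:", suspect_items)
print("Students who missed Q2:", list(missed_q2))
```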
Creating the reports teachers need and want is a collaborative process. Chiszar and his staff regularly meet with a 20-person assessment steering committee, using an iterative process in which Chiszar says the first thing he and his people need to know from the committee is the nature of the business problem it’s trying to solve. Second, they need to know what format the committee wants to see the report in. They then create an initial report and tailor it as requested as things proceed.
The goal is not to provide personalized reports for individual teachers; rather it’s to create a set of reports that will be useful to a wide number of teachers. “We don’t want teachers creating reports,” Chiszar says. “We want teachers to have a report that they can have a conversation about with other teachers and with administrators.”
In making the necessary ties in the curriculum-building process, teachers and staff regularly ask a fundamental question: How will we know when a student has mastered an assessment item? “You have to answer that question while you build the curriculum,” Chiszar says. “You have to be thinking of the assessments while you build the curriculum. When assessment feels separate from curriculum, you have a problem.” That continuous cycle of measuring and making changes, then measuring again, is all part of the quality-improvement process Naperville has adhered to for years.
Ultimately, the goal is to have “this wonderful, rich, robust set” of different assessments—some mandated by the district, others optional—to offer to teachers and staff when they need it. However, “that’s not the reality now,” Chiszar says. “We have good pieces of it, better in some places than in others. But that’s the idea of what we want to provide to staff.
             
"A teacher can go in [to a data analytics tool] and say, 'I want to look at all the students in my class that are second-language learners, scored below the 50th percentile on the CST, have a MAP score of less than 220, and receive reduced lunches.'"                
— Robert Gravina, Poway Unified School District
“Training helps teachers understand which assessments to  use when. The curriculum doesn’t change; how to get students  there is up to the teachers.”
Speeding Up the Feedback Loop 
Poway Unified School District, with 33,000 students, serves a suburban, middle- to upper-middle-class community about 20 minutes north of San Diego. The district has been using SAS data warehousing and analytics software for a little more than four years, both as a data warehouse and through a custom application it calls TIM—total information management. TIM is a user-friendly, SAS-built front end to the district’s data warehouse.
According to Poway CTO Robert Gravina, TIM gives teachers a simple entry to data such as student achievement scores. TIM is particularly helpful because it delivers data that is relevant to a particular teacher’s students. “What used to happen, and still does happen in most school districts,” Gravina says, “is that districts print out data and give it to the teachers, [then] try to teach the teachers how to use that data to change how they teach.”
Instead, the system in place at Poway provides information on a desktop to teachers, allowing them to manipulate the data themselves to find statistics specific to their classes. Gravina gives an example of the sort of digging that can be done: “A teacher can go in and say, ‘I want to look at all the students in my class that are second-language learners, scored below the 50th percentile on the CST [California Standards Test], have a MAP [Measures of Academic Progress] score of less than 220, and receive reduced lunches.’…You can do that in any variation.”
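A simplified sketch of the multi-criteria filter Gravina describes might look like the Python snippet below; TIM is a SAS-built front end, so this version, along with its column names and sample data, is purely illustrative of the kind of query he quotes.

```python
# Hypothetical student table and the multi-criteria filter from Gravina's example.
import pandas as pd

students = pd.DataFrame({
    "name":           ["Ana", "Ben", "Cam", "Dee"],
    "ell":            [True,  True,  False, True],   # second-language learner
    "cst_percentile": [42,    61,    35,    48],
    "map_score":      [212,   231,   208,   219],
    "reduced_lunch":  [True,  False, True,  True],
})

flagged = students[
    students["ell"]
    & (students["cst_percentile"] < 50)
    & (students["map_score"] < 220)
    & students["reduced_lunch"]
]
print(flagged["name"].tolist())   # -> ['Ana', 'Dee']
```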
Poway’s IT staff has set up TIM so that California education standards and assessment scores are built in to the system. As Gravina explains it, a Poway teacher can select specific assessment scores, which the system translates into standards the students need to learn. “The teacher,” Gravina says, “can then say: ‘Okay, two-thirds of my class don’t know how to write a comparative essay. This is a 10th-grade standard, and my kids need to learn this standard before the end of the year.’” Teachers can even drill down into which particular students don’t know a given standard, then tailor lessons accordingly for those kids.
That feedback loop isn’t that different from how Poway teachers operated before the SAS system was in place, Gravina acknowledges, but the time it takes to evaluate problems and make changes to what’s being taught is far shorter. Assessments and adjustments to address learning needs are done throughout the year, rather than after students have moved on.
Convincing teachers to use TIM has been relatively easy, Gravina says, because it presents a graphical interface that’s simple to use and relies on mouse clicks. That ease of use is key to appealing to reluctant teachers. Getting busy educators to use a new technology can be a challenge—Gravina says Poway offered good, useful data as a reward: “The thing with teachers is, you really need to give them a hook. We gave them some information they really wanted, like student phone numbers [and] parent home-phone numbers.”
That information had been difficult to obtain because it was in the district’s student information system, which teachers couldn’t fully access. “That was the hook,” Gravina says. “Once we got them to go in, they really loved it.” He cites a survey of TIM usage conducted last year as evidence of its acceptance: More than 50 percent of Poway teachers are using the program on a weekly basis.
TIM is tied into other software systems at Poway; eventually  it will be connected to human resources and finance, along  with some predictive analysis tools. That will allow district  administrators to collect information, for example, on the costs of two different reading systems, then to compare those  costs to each system’s impact on student achievement. “We  can ask: Which system is giving us a bigger bang for the  buck?” Gravina says. To do that now would require pulling  information from several systems, somehow combining it in  one place, then sorting and analyzing the gathered data to produce  meaningful reports. With all the software linked, Gravina  says, administrators will be able to easily compare data across  systems, such as assessing whether teachers who have been with  the district longer consistently have students who perform better.  “We’ll be able to do some comparative analysis and some  data modeling.”
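Once finance and achievement data sit in one linked system, that “bigger bang for the buck” comparison reduces to something like the hypothetical Python sketch below; the program names, costs, and score gains are invented for illustration.

```python
# Invented comparison of two reading systems once cost and achievement
# data can be joined in one place.
import pandas as pd

programs = pd.DataFrame({
    "reading_system": ["System A", "System B"],
    "annual_cost":    [120_000,    95_000],   # dollars per year
    "avg_score_gain": [6.5,        4.2],      # points gained on a benchmark test
})

# Cost per point of average score gain: lower means more bang for the buck.
programs["cost_per_point"] = programs["annual_cost"] / programs["avg_score_gain"]
print(programs.sort_values("cost_per_point"))
```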
The project has come at a cost—the district has a full-time  technician assigned just to work on the economics of the SAS  product. “There’s a lot of work that needs to go into customizing  the program,” Gravina says. “Even though it’s an  out-of-the-box product, you buy the base [and] have to develop  it so that it meets your unique needs.”
Implementation Challenges
Despite their usefulness—and despite the urgency for reporting  brought about by NCLB—school districts generally lag  well behind private industry and higher education in adopting  BI tools and technologies.
One challenge for data warehousing and BI vendors is that the needs of school districts tend to be very particular. Business-oriented tools designed for a profit-focused Fortune 500 company simply aren’t a good fit for education, which has a different structure and intent. Understanding that, some vendors are targeting the education market specifically. Datamonitor’s Engelbert singles out Cognos as one example of a company that has focused on education and reaped big returns as a result. “It’s put some great resources [into] understanding the needs of education institutions,” she says. Engelbert also recommends keeping an eye on SPSS and SAS in the K-12 market.
Beyond infrastructure, schools also face challenges regarding  security. Although analytics tools don’t make data less  secure, Engelbert points out, they do make it more visible and  accessible, rather than hidden in a database somewhere.  Understanding the process of controlling passwords, identity,  and access management can be a challenge. “There isn’t a  strategy in place for understanding what [positions] need to  have access to what data and what part of the IT infrastructure,”  she says.
   A third issue in implementing BI is data integrity and quality.  Without good data in the first place, it’s difficult or impossible  to perform any sort of useful or reliable analytics. “I have  no reason to believe that data quality is very high in education,”  Engelbert says. “Institutions should assume it’s poor,  then implement strategy tools to address that. If you don’t,  you’re proceeding at your own risk.”
Despite the challenges, BI is doable—and can yield great returns—but cautions are in order. BI is not a technology purchase, Engelbert warns: “It’s a business process or a change management decision.” That means having not only the CTO and IT directors at the table, but also representatives from the academic, process, and service departments. And be sure to choose a vendor that is familiar with your existing infrastructure, including your ERP and student information system. The best choice is a BI provider with experience in education, Engelbert advises, so that reporting models will be specific to K-12 and the educational process. That’s very different from many other uses of BI tools—with good reason, Engelbert says. “We’re not making widgets. We’re educating students.”
:: web extra :: For more information, visit T.H.E. Journal. In the Browse by Topic menu, click on Business Intelligence.
Linda L. Briggs is a freelance writer based in San Diego.