Technology & Achievement | Q&A

Measuring 1:1 Results

One CTO reveals how his district uses data-driven decision making to get the most out of its 1:1 laptop implementation.

Four years ago, Mooresville Graded School District (MGSD) in Mooresville, NC, rolled out a 1:1 laptop program that put MacBooks in the hands of all students in grades 3 through 12. Even before the computers were doled out, though, Scott Smith, CTO of the 5,500-student district, said the IT team, school leaders, administrators, and teachers decided that they wanted to do more than just "hand out laptops to kids."

"We looked at the initiative as a digital conversion," said Smith, "knowing that it was completely transforming the teaching and learning environment that we were all accustomed to."

Improved student engagement, state and national assessment scores, and student attendance were a few of the district's top priorities. To achieve those goals, MGSD developed a three-pronged approach comprising the equipment itself, free value-added assessment software, and regular Scantron assessments to track progress.

Smith described the district's approach to THE Journal and talked about the tools that MGSD is using and the results it's seen from its integrated 1:1 approach.

Bridget McCrea: What initial thinking and planning went into this 1:1 implementation?

Scott Smith: When you move an entire district into a digital environment, a lot of things change. What doesn't change is the fact that everything revolves around academic achievement.

We want students to succeed at or above their current grade levels. Going digital doesn't change that, but it does change the role of the teacher, student, IT facilitator, and administrator. The whole environment takes on new meaning. We looked at this 1:1 initiative as more than just the distribution of technology. We wanted to know what kind of data we'd have access to and what kind of decisions we'd be able to make based on that data.

McCrea: What tools did you use?

Smith: The state of North Carolina provides the EVAAS value-added assessment software from SAS. The predictive analysis software gives us information on students statewide, and it allows us to make predictions on student progress.

We rank students in the state on four levels [with four being "great" and one being "not doing so well"], and the software can tell us which students are predicted to make fours versus ones, twos, or threes. With this information in hand we can develop targeted intervention programs for each group.
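To picture the mechanics of that grouping, here is a minimal sketch. It is purely illustrative: the student records, field names, and intervention labels are assumptions, not MGSD's system or EVAAS's actual output format. It simply shows how predicted levels exported from a tool like EVAAS could be bucketed so each group gets its own intervention plan.

```python
# Illustrative sketch only: groups students by a hypothetical EVAAS-style
# predicted level (1-4) so each group can be matched to an intervention plan.
# Field names, student data, and the intervention mapping are assumptions.
from collections import defaultdict

students = [
    {"name": "Student A", "predicted_level": 4},
    {"name": "Student B", "predicted_level": 2},
    {"name": "Student C", "predicted_level": 1},
]

interventions = {
    1: "intensive small-group tutoring",
    2: "targeted skill remediation",
    3: "monitored practice",
    4: "enrichment and extension work",
}

# Bucket students by their predicted level.
groups = defaultdict(list)
for student in students:
    groups[student["predicted_level"]].append(student["name"])

# Print each group alongside its assumed intervention plan.
for level in sorted(groups):
    print(f"Level {level} ({interventions[level]}): {', '.join(groups[level])}")
```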

McCrea: How do the Scantron tests come into play?

Smith: EVAAS is just a predictive tool; it doesn't necessarily reveal how the students are doing right now. We needed current information, and the old method of teaching a lesson and then testing at the end of the semester doesn't cut it anymore. Our teachers use Scantron online assessments on a quarterly basis. The test results show where each student stands and where additional instruction is warranted. Using the quarterly tests, the teachers can quickly determine prescriptive intervention for every student who needs it.
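To make that concrete, the short sketch below shows one way quarterly results might be scanned to flag students who need prescriptive intervention. The names, scores, and benchmark cutoff are hypothetical, and this is not Scantron's actual data format.

```python
# Illustrative sketch only: flags students whose quarterly assessment score
# falls below a grade-level benchmark so a teacher can plan prescriptive
# intervention. Scores, names, and the benchmark value are hypothetical.
BENCHMARK = 75  # assumed grade-level proficiency cutoff

quarterly_scores = {
    "Student A": 92,
    "Student B": 68,
    "Student C": 74,
}

# Keep only the students scoring below the benchmark.
needs_intervention = {
    name: score for name, score in quarterly_scores.items() if score < BENCHMARK
}

# List flagged students, lowest score first.
for name, score in sorted(needs_intervention.items(), key=lambda kv: kv[1]):
    print(f"{name}: scored {score}, below benchmark {BENCHMARK}, plan intervention")
```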

McCrea: What were the hard parts of this initiative?

Smith: Staff development was a big issue.

Before the 1:1 rollout we spent at least six months on staff development. Going from 30 kids in a room opening textbooks to 30 kids opening computers is a significant shift.

We wound up with a number of early adopters who bought into the change and a bunch of others in the middle who were saying, "Give me time and we will get there."

Then there were staff members who refused to participate and threatened to retire. We stuck to our guns and told everyone that we were moving in this direction and that everyone had to be on board.

Four years later, we're still not there yet, but we've definitely made progress. Getting to 100 percent is going to take a while.

McCrea: Has the district honed its data-driven strategy since rolling out the 1:1?

Smith: Yes. Everything has become more and more integrated.

Early on we knew we wanted to generate and use the data, but we really didn't know what that data was going to look like. We also knew that we needed to find out how our students were doing and how we could get them to the next level.

We had EVAAS when we started, and we added Scantron shortly after that. To make sure everyone was on the same page, we started holding quarterly data meetings to look at progress across all grade levels and departments. There is complete transparency and total accountability. If Ms. Jones is doing a phenomenal job and posting great scores, we'll ask her to work with the rest of her team. If Mr. Smith's classes aren't performing well, we'll pair him up with Ms. Jones.
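The comparison behind that pairing can be pictured with a small script. The sketch below uses made-up teachers and scores (the numbers and names are assumptions, not MGSD data) to show the idea of averaging quarterly results by class and pairing the lowest-scoring class's teacher with the highest-scoring one for peer support.

```python
# Illustrative sketch only: computes per-teacher quarterly averages from
# hypothetical assessment records and suggests pairing the lowest-scoring
# class's teacher with the highest-scoring one, mirroring the idea above.
from statistics import mean

scores = {
    "Ms. Jones": [88, 92, 85, 90],
    "Mr. Smith": [70, 65, 72, 68],
    "Ms. Lee": [80, 78, 83, 81],
}

# Average each teacher's quarterly scores and rank from highest to lowest.
averages = {teacher: mean(s) for teacher, s in scores.items()}
ranked = sorted(averages, key=averages.get, reverse=True)

for teacher in ranked:
    print(f"{teacher}: {averages[teacher]:.1f}")

# Suggest pairing the weakest class's teacher with the strongest class's teacher.
print(f"Pairing suggestion: {ranked[-1]} works with {ranked[0]}")
```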

McCrea: What results has the district realized over the last four years?

Smith: Our student suspension rate has gone down; our dropout rate has gone down; our attendance rate is up; our graduation rate is up; and the end-of-course exam scores in subjects like algebra I and biology are up. There's basically been a positive trend across all data when you compare 2008 to 2011.

About the Author

Bridget McCrea is a business and technology writer in Clearwater, FL. She can be reached at [email protected].
