Virginia: When Technology Met Accountability


Virginia’s decision to link the two programs has benefited the state’s entire educational system.

WHEN VIRGINIA LAUNCHED its Standards of Learning (SOL) accountability program in the mid-1990s, the state’s deployment and use of technology was in its infancy. The state’s technology plan was still being developed, and it focused on building infrastructure. Schools were just beginning to receive funding from the Virginia General Assembly in accordance with the plan.


CHILD’S PLAY: Converting paper-and-pencil assessment procedures to online testing may be a challenge for educators, but students adapt with ease.

Around this time, decision makers began to ask whether the investment in technology was worth the cost, and whether all of the tools being put in place were going to make a difference in student achievement. A decade later, that question is still being asked.

At the Virginia Department of Education, we knew that demonstrating the value of technology, especially at a time when teachers had barely begun to use it, would be a difficult task—and that it was likely to become a long-term effort. Thus, we made a decision that has positively affected our state education system ever since: We linked technology to our growing accountability system.

Our board of education, governor, and legislature were firmly behind our accountability and testing system, and were very interested in anything that would help raise the bar for student achievement and learning. For their part, schools wanted to get test results back promptly so they could make educational improvements. We believed that if we could show how technology would improve the functionality and timeliness of the accountability system, the people who support technology would see returns on their investment, prompting them to continue funding the infrastructure and allowing the time needed to train teachers to use technology effectively.

Creating the Information Highway

When funding became tied to accountability, the state legislature continued to provide significant money for technology to reduce the student-to-computer ratio, build infrastructure, increase connectivity, and develop an online testing system. With those resources, Virginia’s school technologists worked hard to create an electronic highway to move information to and from teachers, administrators, and students.

Networks were put in place to meet the rigid reliability requirements for online testing. Student-to-computer ratios dropped from about 15-to-1 to the current ratio of about 3-to-1. Maximizing federal e-Rate funds helped the average school district save about 70 percent on connectivity costs. Access to the e-Rate program also improved district connectivity, which went from school-shared T1s to school-specific T1s. Schools increased bandwidth to accommodate both online testing and daily instructional internet usage. As the capability and quality of the infrastructure increased, Virginia was well positioned to embark on a large-scale, high-stakes online testing program.

Online Testing Begins

The yardstick for achievement in Virginia is the Standards of Accreditation. Schools are measured against those standards to determine their level of accreditation; measurement is based on student scores on the SOL tests, which are considered minimum competency exams, with the expectation that students will achieve well beyond minimum competency. When the paper-based SOL program was begun, only 2 percent of Virginia’s schools met accreditation standards.

The state started online testing in the fall of 2001, with about 15 districts and 1,700 high school students participating. These were end-of-course tests that helped determine whether students would graduate. Major issues that needed to be dealt with initially included availability of computers, reliability of networks, adequate training, and the customary general resistance to change. Policies and procedures were well established for paper-and-pencil testing, so it was a big job to make the change to online and to get people’s cooperation and support. Districts formed teams to work through these issues, and users eventually began to recognize the convenience and reliability of computer-based testing. A big plus was that students adapted quickly; when surveyed, they indicated that they “almost enjoyed” this method of test taking. For many students, going online is a snap. They expect to receive everything electronically, as they do in their lives away from school.

The rural Charlotte County Public Schools district (six schools and a student population of about 2,400) has embraced online testing, having administered roughly 2,100 tests online to students in grades 6-12. Steve Baker, the district’s assistant superintendent for administration/operations, believes that the biggest benefit of online testing is the nearly instantaneous turnaround in obtaining results. Baker can then use the data to work with staff on making the instructional changes necessary to help students master material, review, or receive remediation. He attributes Charlotte County’s success in instituting online testing to strong communication and many hours of planning and preparation. Baker says that once testing starts, “it’s a breeze,” but concedes that making sure everything is ready on test day (computers, the testing engine, the mass testing system download) can be stressful.

In northern Virginia is the Prince William County Public Schools district. Much larger than Charlotte County, Prince William includes more than 80 schools and approximately 68,000 students. But it’s catching on to online testing too—its high schools administered more than 21,000 tests electronically in 2004-2005. Currently, the district is working to create materials that can support the wireless rolling labs that make online testing possible at the elementary school level.

Since the 2001 launch, the Virginia Department of Education has expanded its online testing program and anticipates that by the end of this year, 75 to 80 district middle schools and 30 to 40 elementary schools will be participating. Online now are 14 end-of-course tests, covering all high school courses except English writing; five middle school tests in math, reading, and science; and eight elementary school tests in math and reading. By 2009, all of Virginia’s 1,862 schools are expected to be administering tests online.

The online test engine used by the state has the capability to display and report assessment data, but it became apparent after online testing was implemented that teachers and administrators needed additional robust tools to help them disaggregate and analyze data. Some districts began to purchase data warehousing and other newly developed data tools for local use. The state saw this as another opportunity to continue to tie the need for assessment and accountability data to the use of technology. Data could easily flow over the same infrastructure built for online testing.

Getting the Tools to the Schools

Our basic aim in creating an educational information management system was to meet the Virginia Department of Education’s need for state and national data. But we also wanted to provide schools with the tools they need to help children learn and achieve. We knew that the new system would mean a lot of extra work for schools, so we worked hard to make sure it met their needs as well.

The basic elements of our system are familiar to those who create data systems: assignment of a statewide identifier for each student, loading of testing results, inclusion of student demographics, and the ability to create various statewide reports. All Virginia districts have submitted four years of assessment and demographic data. Roughly 1.2 million students have received testing identifiers, and schools have access to disaggregation and reporting tools, which allow them to perform detailed analysis by subgroup, including pass rates, scaled-score bands, and reporting categories. Reporting categories are especially important to teachers as they work to identify instructional and learning problems and develop plans for improvement.
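To make the disaggregation idea concrete, here is a minimal sketch of that kind of subgroup analysis. The record fields, the subgroup labels, and the passing cutoff are hypothetical illustrations, not the actual EIMS schema or SOL scoring rules.

```python
# A minimal sketch of subgroup disaggregation, assuming hypothetical record
# fields and a made-up passing cutoff (not the actual EIMS schema or SOL rules).
from collections import defaultdict

PASSING_SCORE = 400  # assumed cutoff, for illustration only

def pass_rates_by_subgroup(records):
    """Return the pass rate (percent) for each (subgroup, reporting category) pair."""
    totals = defaultdict(int)
    passes = defaultdict(int)
    for r in records:
        key = (r["subgroup"], r["reporting_category"])
        totals[key] += 1
        if r["scaled_score"] >= PASSING_SCORE:
            passes[key] += 1
    return {key: 100.0 * passes[key] / totals[key] for key in totals}

# Example with made-up data: two students in one subgroup, one in another.
sample = [
    {"student_id": "0000001", "subgroup": "Grade 8", "reporting_category": "Computation", "scaled_score": 412},
    {"student_id": "0000002", "subgroup": "Grade 8", "reporting_category": "Computation", "scaled_score": 388},
    {"student_id": "0000003", "subgroup": "Grade 8 LEP", "reporting_category": "Computation", "scaled_score": 430},
]
print(pass_rates_by_subgroup(sample))
# {('Grade 8', 'Computation'): 50.0, ('Grade 8 LEP', 'Computation'): 100.0}
```

The same grouping step, applied across teachers or reporting categories, is what turns raw scores into the kind of comparisons described below.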

There’s always an aha! moment when school officials discover how access to data can help them. Earlier this year, one of the Virginia DoE’s educational technology specialists was training an administrator in one of the state’s many rural counties on the use of the Educational Information Management System (EIMS). Knowing that the best training includes real examples, the specialist helped the administrator use the system to view scores of a particular teacher’s algebra class. Students in the course were doing quite well, with most passing the SOL test and thereby earning a verified credit for algebra.

As the administrator and the ed tech specialist looked at a breakdown of the sections of the test, they noticed that students were achieving very well in all of the reporting categories but one. Now that the data identified an area where more work was needed, the administrator would be able to work with the teacher to amend instructional delivery, enhance content, and otherwise change instruction so student learning could improve. Later, because EIMS can follow student progress over time, the administrator and teacher would see whether their instructional modifications made a difference.


Demonstrating other features of the system, the specialist compared test results from all of the algebra teachers in the math department. The results showed that all of the students were having difficulty with material in the same reporting category; the problem was not isolated to one teacher, but was a departmental issue. Given this information, the administrator decided to have a departmental workshop focusing on improving instruction in that category.

Providing administrators, teachers, and department staff with this kind of practical data that can be used to improve instruction is one of the key factors that led to implementation of EIMS in Virginia.

As the state has developed EIMS over the last two years, the Education Department has worked closely with schools to try to fulfill their needs. Only some of those needs have been met, but we are well on our way to satisfying all of them. Charlotte County’s assistant superintendent, Steve Baker, indicates that his district is actively using the data disaggregation feature for state-administered SOL tests to improve instruction, but he would like to see locally administered test data added. Joan Middleton, data analyst for Prince William County Public Schools, thinks that one of the best features of the state’s EIMS is the ability it affords users to quickly look up state testing identifiers. Still, she would like to see some improvements, such as ways to put more child-specific data in the system and to allow users to print testing labels locally. Prince William has an excellent data warehouse, and it’s used regularly to assist administrators and teachers with instructional decisions.

We envision that, once EIMS is completed, the system will include data on pre-K through college. We expect to have created tools that are useful, flexible, and easy to use. When those tools and resources enable teachers and administrators to create individual student learning plans, we will have achieved our goal.

One Step Closer

As of today, 92 percent of Virginia’s schools have met accreditation standards. Throughout that period, schools have been given resources to support accountability through technology. The state’s online testing system has met expectations, and the statewide information system is beginning to make use of online testing results to determine student progress.

The success of online testing, and the recognition that accountability goals were being met, persuaded legislators to fund 1,200 instructional technology resource teachers (ITRTs). The ITRTs’ role is to work with teachers to help them integrate technology into their instruction. Ten years ago, when both technology and accountability were in their early stages, we could not have determined the impact of technology on learning. Now, as the ITRTs do their work, administrators and teachers are gaining the skills, tools, resources, and confidence to use technology to make a difference in student learning. We are confident that we will be able to measure results as the ITRTs’ work with our teachers continues.

In the long run, joining technology to accountability has given the state of Virginia time to build a system of technical and human resources that will help all students learn better and be more prepared to work in a competitive and technologically advancing world. Our advice is to tie technology to your educational needs. It has certainly worked for us.

Lan Nugent is the assistant superintendent for technology at the Virginia Department of Education and the chairman of SETDA’s board of directors.
