Virginia: Getting Ready for Online Testing
Can't you just hear the call coming? "Let's take our statewide high-stakes pencil-and-paper testing system and use technology to automate it. This will provide ready access to tests, greater testing flexibility, faster turnaround of test results, and test data decision-making capability. Plus, the kids will love it!"
If your call indeed comes - and it appears more and more likely that it will come in light of the increased testing requirements of the No Child Left Behind Act - you may spend some sleepless nights if you are the technology person managing the project. Embarking on a high-stakes online testing program is a major enterprise that requires planning, teamwork, communication, flexibility and good problem-solving skills. It is important to be aware of the issues you may confront and what hurdles might have to be overcome to be successful in your job.
Establishing a Project Management Team
The Virginia Department of Education established a project management team (PMT) to develop and guide what we called the Web-based Standards of Learning Technology Initiative. We included technology, assessment, instruction, accountability, finance and our vendor partner on the team. Since testing impacts all departments within our educational agency, we wanted everyone to know what was going on and have the ability to provide input to the PMT.
Several work committees were formed to identify issues of concern and suggest solutions. A formal project management plan with project phases, goals, milestones and deliverables was followed throughout the project. Representatives from the PMT met bimonthly with Superintendent of Public Instruction Jo Lynne DeMary. Her support and help with making decisions on critical issues have kept the project focused and given it high visibility both in the department and with school divisions. Most schools followed our model and formed broad-based teams to solve local online testing issues, which proved very effective. While 16 districts began testing in the fall of 2001, 120 of 132 districts tested this spring. More than 100,000 tests have been administered so far, and it is anticipated that the number will reach 400,000 by next year.
Online testing is not really a technology project, but it certainly needs to start out as one. Our technological goals were to establish student access to computers at a ratio of one computer for every five students; create Internet-ready LAN capability in every school; and assure adequate high-speed, high-bandwidth capability. If your state is anything like Virginia, schools have a variety of infrastructures, computer configurations and varying degrees of connectivity. This makes it necessary to have a fairly high degree of technological conformity to conduct online testing. We recommend developing architectural guidelines that set minimum technological specifications to get everybody moving together in the right direction.
Our PMT also established a three-stage certification process. Schools certified that the architectural guidelines had been met in stage one, verified the certification for stage two, and verified that everything was working properly just prior to testing for the final stage. Certification helps mitigate the risk of technological failure during testing.
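The three-stage gate described above can be thought of as a simple state machine: a school advances one stage at a time and cannot skip ahead to testing. The Python sketch below is purely illustrative; the stage names and functions are our own, not part of any actual certification software.

```python
from enum import IntEnum

class Stage(IntEnum):
    """Hypothetical model of the three-stage certification process."""
    NOT_STARTED = 0
    CERTIFIED = 1    # stage 1: school attests the architectural guidelines are met
    VERIFIED = 2     # stage 2: the certification is independently verified
    READY = 3        # stage 3: everything re-checked just prior to testing

def advance(current: Stage) -> Stage:
    """Move a school to the next stage; stages may not be skipped."""
    if current is Stage.READY:
        raise ValueError("school is already cleared for testing")
    return Stage(current + 1)

if __name__ == "__main__":
    school = Stage.NOT_STARTED
    while school is not Stage.READY:
        school = advance(school)
        print(f"school advanced to {school.name}")
```

Modeling the process this way makes the risk-mitigation point explicit: a school cannot reach the final pre-test check without first having certified and verified its infrastructure.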
Time spent planning and thinking through the goals of the project, what needs to happen along the way, and what you expect to achieve in the short and long run will pay off when the hundreds of inevitable questions arise. For example, early in the process, our schools were concerned about testing students with their existing limited infrastructure. They asked, "How do we do online testing in a school that doesn't have enough computers or bandwidth to accommodate all kids?" Our answer: "Don't start with a complete school; start with a subject or two using existing technology, and give only as many tests as your available computers can handle. As you add computers and develop a more robust infrastructure, increase the number of tests given." This response showed that we understood it would take time and resources to accomplish our technology goals. Districts were relieved that they could plan a "test of the test" and then build from that experience.
Our assessment goal was to establish a statewide, secure Web-based system to provide administrators and teachers with the ability to register students for tests, manage test delivery sessions, and report test data. Students must be able to take tests online and get results quickly, but should not be able to "break security." Assessment issues were addressed by PMT work groups as the test engine was developed. Often, the school teams provided some of the most valuable input. As test-taking aids, students suggested adding a "highlighter" and the ability to "mark for review" questions they wanted to revisit. When tests were administered, proctors suggested having student names appear at the top of screens so they could be certain that the right student was taking the test.
We found that our technology people had much to learn about assessment, and our assessment people had much to learn about technology. We originally thought that we could separate the two, but when a technology glitch happened, it also caused a testing irregularity. In a similar way, test presentation on a computer screen, needed test accommodations, and creation of in-test manipulatives (e.g., protractors, compasses and periodic tables) became technical challenges. But continual communication and working together as a team to solve problems helped us deal with these issues.
Communication with schools as these tests were administered was vitally important. We realized that issues which were not addressed promptly could lead to a local testing disaster. For instance, districts expressed concerns that they had to use local funds to hire contractors to verify their certification. To address these concerns, we worked with our vendor partner to make software available that could emulate test takers. This allowed districts to verify their capabilities whenever they wanted and reassured them that the technology would work during testing.
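The core idea behind such an emulator is to launch many simulated "students" at once and time how long each download takes. The Python sketch below shows one minimal way this could work; `emulate_test_takers` and `fake_download` are hypothetical names of our own, and a real verification run would fetch the actual test URL rather than sleeping.

```python
import threading
import time

def emulate_test_takers(fetch, num_students):
    """Launch num_students concurrent simulated 'test downloads'.

    fetch: a zero-argument callable that performs one download
    (here a stand-in; a real run would hit the test server).
    Returns a list of per-student download times in seconds.
    """
    times = [None] * num_students

    def worker(i):
        start = time.perf_counter()
        fetch()
        times[i] = time.perf_counter() - start

    threads = [threading.Thread(target=worker, args=(i,))
               for i in range(num_students)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return times

# Stand-in for a real download: sleep to mimic transfer time.
def fake_download():
    time.sleep(0.05)

if __name__ == "__main__":
    results = emulate_test_takers(fake_download, 60)
    print(f"{len(results)} simulated students, "
          f"slowest download {max(results):.2f}s")
```

A district could run something like this against its own network at any time, which is exactly the reassurance the vendor's tool provided: evidence before test day that the infrastructure can absorb a full classroom hitting "download" at once.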
Another challenge was what happened when 60 to 100 test takers in a school hit the "test download" button at the same time. This consumed all available bandwidth and took too much time. We originally tried to solve the issue through workstation pretest downloads, but the ultimate answer became "proctor caching." Because the tests were already secured for online delivery, districts could safely download them to local proctor caching servers and deliver them from there. Many schools felt this was a breakthrough that solved much of their bandwidth issue. They reminded us that they needed bandwidth for instruction and could not afford to shut down during testing.
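The caching idea above can be sketched in a few lines: the first request for a test form crosses the WAN once, and every later request in the building is served from the local copy. The class and method names below are illustrative only, not the vendor's actual software.

```python
class ProctorCache:
    """Minimal sketch of the proctor-caching concept (names hypothetical)."""

    def __init__(self, fetch_from_state):
        self._fetch = fetch_from_state   # slow WAN call to the state system
        self._store = {}                 # test_id -> secured test payload
        self.wan_fetches = 0             # how often we crossed the WAN

    def get_test(self, test_id):
        """Serve a test form, pulling it across the WAN only once."""
        if test_id not in self._store:
            self._store[test_id] = self._fetch(test_id)
            self.wan_fetches += 1
        return self._store[test_id]

if __name__ == "__main__":
    cache = ProctorCache(lambda tid: f"<secured test {tid}>")
    for _ in range(60):                  # 60 students hit "download"
        cache.get_test("algebra-1-form-A")
    print(cache.wan_fetches)             # prints 1: the WAN is crossed once
```

The design choice is what mattered to schools: sixty simultaneous downloads become one WAN transfer plus fast LAN deliveries, leaving the building's Internet bandwidth free for instruction.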
When you decide to take your tests online, remember to plan well, involve everyone, get the correct technology, and then address the assessment issues. In Virginia, the ability to take tests at will has allowed districts to extend instructional time and do extensive retesting. Administrators are beginning to use the test data the system provides to determine pass/fail, steer remediation activities and make placement decisions.
Our next challenge will be to merge the online test data into our developing statewide Educational Information Management System (EIMS). Some divisions, like Hanover County Public Schools, currently have local data warehousing capability. Through EIMS, all Virginia districts will have enhanced data-based decision-making capability to do things such as analyze testing trends, compare school demographics, analyze student mobility, and align remedial instruction to test results.
So when the call comes, it may very well be for online testing; a data warehouse; or, if you're really lucky, both at the same time.
This article originally appeared in the 07/01/2004 issue of THE Journal.