Tech Readiness

Getting Your School Tech Ready for Common Core Assessments

Schools have about 15 months to prepare for the online assessments that reflect the learning goals of the Common Core State Standards. Some districts already know what the transition will be like. Here’s what they’ve learned about preparing.

This article, with an exclusive video interview, appears in THE Journal's July 2013 digital edition, focused entirely on preparing for the Common Core.


In February and March, researchers from the Partnership for Assessment of Readiness for College and Careers (PARCC) sat next to students in six states, conducting a study to determine the viability of test questions that could appear, starting in the 2014-2015 school year, on the online assessments being developed as part of the Common Core State Standards initiative. One of those states was New Jersey, which--aside from a few exceptional schools--still uses paper-based summative assessments.

During the same period, the state's Office of Educational Technology was busy cleaning and analyzing data from a technology readiness survey conducted by PARCC. According to director Laurence Cocco, the ultimate goal of the survey work is to categorize all the schools "by how ready they are and what they'll need to get ready" in terms of computers, broadband, and other technology considerations. From there, he notes, "we're going to formulate an action plan and hit most of the districts with some form of help starting in September and moving onward until we get to the test date."

However, Cocco points out, "Getting ready for PARCC is not the be-all, end-all of the technological needs of the districts." It just so happens that when districts implement technology correctly, "they're building the same kind of infrastructure they would need for assessments."

That's an alluring benefit for technology leaders in states that are participating in the online assessments: By adhering to that 2014-2015 online testing deadline, all of those schools will presumably be better outfitted for digitally supported instructional practices.

Some districts and states are further along in the process than New Jersey; they've already made the transition to online assessments. What they have to share is worth listening to, because they understand the landscape everybody else is about to traverse.

Adapting the Test to the Environment
Jon Cohen is the executive vice president of the nonprofit American Institutes for Research (AIR) and director of its Assessment Program, which has a contract with the Smarter Balanced Assessment Consortium to deliver the pilot test and a field test and to build the test delivery system. He says there are two ways to handle the transition from paper and pencil to online testing: "One is slow and painful. The other is like pulling a bandage off. Just do it."

According to Cohen, the states or districts that take the "slower, more painful route" tend to be very traditional. The schools follow the same model they've always followed with paper: putting students into a room with the test, giving them a set amount of time to take the test, rearranging the bell schedules, and releasing them early from school when the test is over. In that scenario, "the environment and the students adapt to the test."

A better approach, according to Cohen, is to adapt the test to the students and the environment. What does that mean? AIR's tests, for example, can be started and stopped so that they integrate into the standard classroom schedule. "If one kid works faster and pauses his test on question 30, and another kid works slower and he pauses his test on question 10, it's just not a problem," he explains. The test can be picked up and restarted at some unspecified time in the future (within parameters) until each student has completed it.
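
For readers who want a concrete picture of what "pause and resume" implies, here is a minimal sketch, in Python, of how a pausable test session might be tracked. It is purely illustrative: the field names, the assumed resume window, and the rules are inventions for this example, not AIR's or Smarter Balanced's actual design.

    # Hypothetical sketch of a pausable test session. Field names, the assumed
    # resume window, and the rules are illustrative only, not AIR's or
    # Smarter Balanced's actual design.
    from dataclasses import dataclass, field
    from datetime import datetime, timedelta
    from typing import Dict, Optional

    RESUME_WINDOW = timedelta(days=20)  # assumed "within parameters" limit

    @dataclass
    class TestSession:
        student_id: str
        current_question: int = 1
        answers: Dict[int, str] = field(default_factory=dict)
        paused_at: Optional[datetime] = None

        def record_answer(self, question: int, response: str) -> None:
            # Progress is saved per question, so pausing needs no extra bookkeeping.
            self.answers[question] = response
            self.current_question = question + 1

        def pause(self) -> None:
            self.paused_at = datetime.now()

        def resume(self) -> int:
            # Reopen at the next unanswered question, provided the assumed
            # resume window has not expired.
            if self.paused_at and datetime.now() - self.paused_at > RESUME_WINDOW:
                raise RuntimeError("resume window has expired")
            self.paused_at = None
            return self.current_question

In a model like this, one student can pause at question 30 and another at question 10, and each session simply reopens where it left off.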

The advantages to that flexibility are many, Cohen notes: Schools don't have to be shut down "so everyone can take a test"; teachers can take "student fatigue" into account; and "you actually can give more tests and use less testing time if you're not trying to force everything into the same couple of weeks of testing."

Cohen advises educators to stop expecting online assessment to be like a paper test. "It's just not. You have to start thinking about the world a little bit differently. Locking a kid in a room for 2½ hours with a bubble sheet is not the gold standard of validity," he says.

Hawaii, whose sole district covers the entire state, was persuaded enough by those benefits to shift quickly to online assessments. The state ran its own pilot of online assessments in 2009-2010. "We were spending a lot of money on paper/pencil testing," recalls Cara Tanimura, director of the state Department of Education's Systems Accountability Office. "It's very expensive to ship millions of pieces of paper here and get it back to the continental US for scoring. We explored online."

The switch was abrupt. Hawaii conferred with other states, such as Oregon, that had already gone through the experience, and the advice it received closely aligned with Cohen's thinking. Oregon, for example, took several years to make the transition to online assessments, and according to Tony Alpert, then the assessment director for Oregon's Department of Education and now the chief operating officer for Smarter Balanced, that route was "much more painful and people were much more resistant." Those states counseled Hawaii to jump in feet first.

"It was a leap of faith," Tanimura acknowledges. The state contracted with AIR--the same organization now working on the assessment platform for Smarter Balanced--to deliver its online tests.

Logistics First
Hawaii's hasty switch to online testing happened in the 2010-2011 school year, long before its schools were thoroughly outfitted. Two primary areas of concern for most districts facing online assessments are having an adequate number of computing devices and having sufficient bandwidth. Hawaii had neither, but the panic was minimal. Instead, technology leaders gave themselves some good advice: Don't worry about what you lack; just work with what you've got and keep building from there.

The state's schools had "some very old operating systems," explains David Wu, assistant superintendent for Hawaii's Office of Information Technology Services. Even though both PARCC and Smarter Balanced have declared that their individual assessments will work on older devices (as far back as Windows XP with Service Pack 3 and Mac OS X 10.4.4), the consortia are also recommending that districts migrate away from older operating systems that don't or won't have vendor support.
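
Those minimum versions come from the consortia's published guidelines; how a district verifies them is up to its own IT shop. The short script below is a hypothetical inventory helper that flags machines falling below those minimums. The check logic is an assumption for illustration, not a PARCC or Smarter Balanced tool.

    # Hypothetical inventory check against the minimums cited above
    # (Windows XP Service Pack 3, Mac OS X 10.4.4). Illustrative only;
    # neither consortium ships a tool like this.
    import platform

    def meets_minimum() -> bool:
        system = platform.system()
        if system == "Windows":
            release, _version, service_pack, _ptype = platform.win32_ver()
            if release == "XP":
                return service_pack >= "SP3"
            # Treat any later Windows release (Vista, 7, 8, ...) as passing.
            return release not in ("2000", "NT", "ME", "98", "95")
        if system == "Darwin":
            mac_release = platform.mac_ver()[0] or "0"  # e.g. "10.6.8"
            parts = [int(p) for p in mac_release.split(".")]
            return parts >= [10, 4, 4]
        # Other platforms (Linux, Chrome OS) would need their own rules.
        return False

    if __name__ == "__main__":
        status = "OK" if meets_minimum() else "below minimum"
        print(platform.platform(), "->", status)

Run across a school's machines, a report like this gives technology staff a quick list of which devices need upgrades or replacement before testing begins.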

The Smarter Balanced tests don't necessarily run poorly on those legacy systems, says Wu. The problem is that the hardware doesn't do a very good job of staying connected to newer networks. "In one particular school we had a problem with the wireless card within the computers. They were so old they wouldn't connect properly to the wireless access point. They would either drop connections or they wouldn't provide the proper bandwidth. If you had those issues during the test, it could cause the test to drop or to reconnect."

The district resolved the issue, Wu says, by reconfiguring the computers and using a USB wireless adapter. "Oftentimes budget restrictions prevent schools from immediately replacing their devices," he noted. "Moving forward, we are encouraging schools to review computer purchases with our technology office to ensure full compatibility with newer networks."

A dearth of computing devices in a school calls for making hard decisions: Do the devices get used for testing or do they get used for instructional purposes? There's no easy way around that--at some level, quantity matters, says Allen Miedema, technology director of the Northshore School District in Bothell, WA. "The folks at Smarter Balanced have done a nice job to make sure you don't have to be on the latest and greatest computer or operating system to have things work," he observes. "The bigger challenge districts are going to have is more along the lines of, 'We just don't have enough technology'--never mind the version number."

Device availability isn't a problem at Kenosha Unified School District. "If we said that next week we had to do Smarter Balanced [testing] online, 95 percent of our schools would have no problem implementing it from the technological standpoint," says Kristopher Keckler, executive director of Information Services, Data Management & Evaluation for the Wisconsin district. Kenosha is looking forward to saying goodbye to 12,000 paper-based test booklets that form the foundation of the annual Wisconsin Knowledge and Concepts Examinations required by the state.

Because scheduling can be so much more flexible with online assessment, district leaders may envision students heading into the closest computer lab to take their assessments on whatever devices are available, but that may not be the case universally. It's possible that students in well-outfitted schools such as Kenosha may just stay in place and tackle the tests in their classrooms. That poses a challenge of its own: ensuring a quiet environment. As Keckler says, "You don't want it where the classroom on either side of the lab is [holding] normal classroom activity, which is sometimes loud, and students in the middle [are] trying to take a standardized assessment."

Prepping Your People
Between now and the time when the official assessments are made available, there will be ample opportunities to perform dry runs of online testing, such as the large-scale 2013-2014 field tests scheduled by both consortia. Districts can use those practice runs to structure their IT and assessment organizations and prepare staff to handle the new testing formats.

For example, students in nine of Kenosha's schools were identified by Smarter Balanced as fitting the criteria it sought as part of a major scientific pilot the consortium performed across a broad spectrum of schools throughout the country in April and May to check out its test item types. To prepare for the experience, Kenosha divided the district into regions and assigned a regional technology support person to oversee the technical aspects for each region. Keckler's office then assigned a technology person as well as an instructional coach to each school to help with the pilot implementation. Finally, the district conducted a workshop for the instructional and IT people who would be involved.

Renee Blise, Kenosha's research coordinator, says, "Technically, it went well. We didn't have any problems with our hardware. There were issues with the test itself." For one, the same question froze at the same time at several schools, she explains. "Then we also went in our lab and the same thing happened to us. We reported it to Smarter Balanced, saying it's not an issue with our system. It must be an issue with that question."

For another, Smarter Balanced hadn't been "absolutely clear with the fact that schools could pause or stop the test and resume it if they chose the next day," adds Keckler. "A lot of them were trying to reach a fictitious end point. We cleared that up right away."

Nobody was put off by the experience. As Blise notes, "That's what a pilot is--to improve things for the next round."

Keckler strongly advises other schools to participate in the 2013-2014 field tests. (In the meantime, schools can try out practice materials from Smarter Balanced and PARCC as they become available.) Keckler found the experience of the scientific pilot "very beneficial," adding, "We saw the first snapshot of what the test would look like so we can work through the issues now, prepare ourselves, and prepare our students, our parents, and the entire staff, so we're ready when we go live."

Prepping the Students
The Common Core includes English language arts and math standards for grades K-12. Some participants--such as Northshore School District--have begun wondering whether younger children really will be able to participate in the online assessments.

Northshore has already made the move to online assessments, but district leaders still saw the need to participate in the Smarter Balanced scientific pilot as part of "being prepared," says Miedema: "That's the culture of our district. When these things happen, we generally don't wait until the last minute."

While Northshore has a healthy population of devices in hand and sufficient bandwidth for assessment and instructional purposes, the challenge Miedema expects to face is the same one that has beset his district every time it has tried out a new type of online testing: figuring out what technology skills students will need. For example, for younger learners, "Unless you've made a real effort to teach them those skills before they take the test, they may be struggling--not because they don't know math, but because they have trouble selecting text and highlighting it," he says.

AIR's Cohen believes the concerns about younger students may be mostly unwarranted. "Our system is designed to not require a whole lot of dexterity," he says. "For example, when you click on a multiple choice item, you don't have to click on the bubble. You can click anywhere on the response. There are big click areas."

To make sure the assessments are usable by the youngest of students, AIR regularly performs cognitive labs. "We try to find kids who are young--third or fourth grade--and who don't have a computer at home and have very limited access to computers at school. They're getting harder to find," Cohen says. "We bring them in. We try stuff out and refine the interfaces so that the kids can use it. You'd be amazed how capable these kids are."

Cohen's confidence notwithstanding, Northshore's experience with the initial Smarter Balanced pilot taught the district that it does need to pay attention to the tech skills of younger students. "We've got a better understanding of what kinds of skills the kids need to have in those areas," notes Miedema. "We believe that it makes a big difference if kids have basic keyboarding skills in place before they take these tests." Those skills include being familiar with the layout of the keys, especially the delete and arrow keys and the space bar, and knowing how to select text and work pull-down menus.

"I keep hearing how 'kids already know how to do these things,'" Miedema says. "No, they don't. They might know how to do very specific tasks associated with playing a game at home. That doesn't necessarily translate to the skills that are needed to effectively take an online assessment." The next step, he says, is to pull together educators and "start having conversations: 'Okay. These are some skills that the kids are going to need by third grade, fourth grade.'"

Miedema observes that Smarter Balanced could probably do a better job of informing districts about "what kinds of things your kids will need to do mechanically with a computer in order to be successful on our tests." The problem, he says, is that there isn't a good mechanism for the consortium to collect that kind of feedback. "We had a phone conversation with folks at Smarter Balanced. That was one of the things we brought up. We send them messages and try to help there. I don't think they have a real formal way to do that yet."

Many districts plan to tie student assessment results to "pay for performance" in teacher contracts, so there's a lot of anxiety built into the testing process. However, says Kenosha's Blise, nothing is ever going to be problem-free. Her advice: "Just prepare as best you can. Take the time to stay as informed as possible. Have the support to address [problems] in a responsible manner."

 

6 Tips From the Field

Keep technicians informed. That includes training them to know how the computers used for assessments need to be set up and preparing them to "jump into action when something's not going right," says Tom Harris, network manager at Kenosha Unified School District (WI). What you want to avoid, he adds, is having the teachers bring in kids for a test "just to find out they're missing plug-ins or something's not set up [right]. Have the environment ready so students can do their best."
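
Harris doesn't prescribe a particular checklist, but part of that "environment ready" check can be scripted. The sketch below is one hypothetical example: the portal hostname, port, and disk-space threshold are placeholders to be replaced with the actual testing platform's values.

    # Hypothetical pre-test checklist a technician might run on a lab machine.
    # The hostname, port, and disk-space threshold are placeholders, not values
    # from any consortium; adapt them to the actual testing platform.
    import shutil
    import socket

    TEST_PORTAL = "assessment.example.org"  # placeholder hostname
    PORTAL_PORT = 443
    MIN_FREE_GB = 1.0                       # assumed minimum free disk space

    def portal_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    def enough_disk(min_free_gb: float) -> bool:
        free_gb = shutil.disk_usage("/").free / 1e9
        return free_gb >= min_free_gb

    if __name__ == "__main__":
        print("Portal reachable:", portal_reachable(TEST_PORTAL, PORTAL_PORT))
        print("Enough free disk:", enough_disk(MIN_FREE_GB))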

Keep your environment stable. Northshore School District (WA) discovered that, in a few cases, operating system updates applied to students' testing devices created incompatibilities with the testing software. Technology director Allen Miedema advises, "Don't do a big software update right before you run your test."

Understand wireless capacity in your testing rooms. There needs to be sufficient access-point capacity to sustain the amount of wireless traffic the tests will require. "It's one thing to have a website dropped when kids are doing a research project and they can get right back on," says Miedema. "It's something completely different to have a kid kicked out of a test. As soon as that happens, you've affected the validity of the test, because you've raised the kid's stress level."
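
Miedema's point is about headroom rather than a formula, but a rough capacity estimate helps frame the conversation. In the back-of-the-envelope sketch below, every figure (per-student bandwidth, usable throughput per access point, client limit, safety margin) is an assumption a district would replace with its own measurements.

    # Back-of-the-envelope estimate of how many simultaneous test takers one
    # wireless access point can support. Every figure here is an assumption
    # to be replaced with real measurements from the district's own network.
    PER_STUDENT_KBPS = 100   # assumed sustained need per test session
    USABLE_AP_MBPS = 20      # assumed real-world throughput per access point
    ASSOCIATION_LIMIT = 30   # assumed practical client limit per access point
    SAFETY_FACTOR = 0.5      # leave half the capacity as headroom

    by_bandwidth = (USABLE_AP_MBPS * 1000 * SAFETY_FACTOR) / PER_STUDENT_KBPS
    students_per_ap = int(min(by_bandwidth, ASSOCIATION_LIMIT))

    print(f"Plan for roughly {students_per_ap} concurrent test takers per AP")
    # With these assumptions: min(20*1000*0.5/100, 30) = min(100, 30) = 30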

Set up a SWAT team to tackle problems quickly. Forget about scheduling something else for your top technical people on the first few days of online testing, Miedema says: better to have them "sitting by the phones waiting to answer questions and help people out."

Put help close to the hot spots. By the time online testing arrived in Hawaii, the state had done a technology survey and knew the condition of its schools' networks. Although assistant superintendent David Wu's IT department used that information to take corrective action before the fact, it also "sent folks where we knew there was potential for issues," he says. That included placing engineers in schools with "network-sniffing equipment" to diagnose, troubleshoot, and resolve issues.

Collaborate and communicate. As a technology person or an assessment person, don't expect to tackle online assessment readiness alone, advises Hawaii's Cara Tanimura. "You really have to have good communication and reach out to your partners within other departments at both the state and district levels," she says. Colleague Brian Reiter suggests using consortia support too: For those initial forays into online testing, "My office was working the phones, but there was also the AIR help desk that was helpful in troubleshooting and answering problems."



Detailed Specs
Here are links to more detailed information on online testing system requirements from the two consortia:

Smarter Balanced Technology Strategy Framework and Systems Requirements Specifications

Technology Guidelines for PARCC Assessments: Version 2.1
