Technology Challenges and the Move to Online Assessments

The 2014-15 school year is a long way off, isn't it? That depends on your perspective. If you are an eighth-grader, Friday night is a long way off, but if you are a technology leader in a school district or a state, the 2014-15 school year may be here all too soon.

That is the year that 40-plus states will implement the new online assessments. More than 30 states already administer summative assessments online, but these new tests will be different, demanding changes in instruction and, possibly, different devices and more bandwidth.

As has been reported previously, the new assessments being created by two major consortia of states, the Partnership for Assessment of Readiness for College and Careers (PARCC) and the Smarter Balanced Assessment Consortium (SBAC), will be based on the Common Core State Standards (CCSS) and will address higher-order thinking skills, problem solving, and other more rigorous standards.

The assessments will be online and will use some traditional multiple-choice questions along with other kinds of items and tasks, such as simulations, computer-based items, short answers, and a lot of writing. They will cover all the standards, not just those that are easy to measure. And because they will be delivered online, the results should be available almost immediately, allowing teachers to actually use them to adjust instruction for specific students.

One evening recently, I explained all this to a friend who runs the technology program for a good-sized school district (and who asked to remain anonymous). She pummeled me with questions and comments:

  • Will the computers I have now work for these tests? 
  • If my district buys a bunch of iPads, will they work? Should I wait?
  • How many computers do I need?
  • Do I have enough broadband?
  • Teachers and students will need to know how to use the technology for the test and for instruction. Computer-based test items and simulations? I don't know many teachers who are doing that today.

I bought her a drink.

Help of the nonalcoholic variety is on the way--for at least some of her concerns. Here is what we know so far.

Known Quantities
Near the end of April, the consortia issued a joint press release outlining their approaches to determining the technology specifications school districts will need for the next-generation tests in 2014-15. The first announcement covered newly purchased devices that will be allowable for testing; that is, the devices now on the market that will work for the assessments--not existing, legacy systems.

Desktops, laptops, netbooks, thin clients, and tablets that meet certain hardware and operating system requirements can be used. Processors must be 1 GHz or faster; devices must have at least 1 GB of RAM; screens must be at least 9.5 inches; and screen resolution must be at least 1024x768. Acceptable operating systems are Windows 7, Mac OS X 10.7, Linux (Ubuntu 11.10, Fedora 16), Chrome OS, Apple iOS, and Android 4.0.
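To make those thresholds concrete, here is a minimal sketch in Python of how a district might screen a newly purchased device against the published minimums. The Device record and the sample values are hypothetical; only the thresholds and the operating-system list come from the guidelines described above.

```python
# Minimal sketch: screening a device against the consortia's published minimums.
# The Device record and sample values are hypothetical; the thresholds and the
# operating-system list are taken from the April guidelines described above.

from dataclasses import dataclass

MIN_CPU_GHZ = 1.0
MIN_RAM_GB = 1.0
MIN_SCREEN_INCHES = 9.5
MIN_RESOLUTION = (1024, 768)  # width x height
ALLOWED_OS = {
    "Windows 7",
    "Mac OS X 10.7",
    "Ubuntu 11.10",
    "Fedora 16",
    "Chrome OS",
    "Apple iOS",
    "Android 4.0",
}


@dataclass
class Device:
    name: str
    cpu_ghz: float
    ram_gb: float
    screen_inches: float
    resolution: tuple
    os: str


def meets_guidelines(d: Device) -> bool:
    """Return True only if the device meets every published minimum."""
    return (
        d.cpu_ghz >= MIN_CPU_GHZ
        and d.ram_gb >= MIN_RAM_GB
        and d.screen_inches >= MIN_SCREEN_INCHES
        and d.resolution[0] >= MIN_RESOLUTION[0]
        and d.resolution[1] >= MIN_RESOLUTION[1]
        and d.os in ALLOWED_OS
    )


# Hypothetical example: a 7-inch tablet fails on screen size alone.
print(meets_guidelines(Device("7-inch tablet", 1.2, 1.0, 7.0, (1024, 600), "Android 4.0")))  # False
```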

Virtually every type of device sold today meets these standards, so it would seem to be no big deal. But it actually is, especially regarding tablets like the iPad, because of the variety of security concerns that arise. As PARCC notes in its description of the guidelines, devices "…must have the administrative tools and capabilities to 'lock down' the device to temporarily disable features, functionalities, and applications that could present a security risk during test administration."

These are not trivial problems, given how tablets are constructed and how interconnected many of their operations are. It is also a big deal because of the timing. The consortia's struggle to make a decision about tablets has created uncertainty around the devices for some school districts and has had a chilling effect on some purchases. But both consortia have spoken with tablet manufacturers, including Apple, which say they understand the issue and are committed to having solutions in place by the time the assessments roll out.

Great Unknowns
Here's what we still don't know: The consortia have not yet issued guidelines for legacy systems (those devices and their operating systems that you have in your schools now), for bandwidth, or on whether other input devices and accessories will be necessary to handle the tests. The reason is simple: They still don't know.

The next announcement, scheduled for late summer or early fall, will cover legacy operating systems and minimum bandwidth guidelines. Additional announcements will address security concerns, accessories such as input devices, and possible requirements for tests beyond 2014.

Also, both consortia are still developing test items, determining how many items of each kind they will use, designing their testing engines, and seeing how these elements will work together. The result of all that "construction" will go a long way toward determining minimum guidelines for broadband in particular, but also toward answering questions about legacy operating systems.

The Technology Readiness Tool, developed under contract by Pearson with assistance from the State Educational Technology Directors Association, should help determine the viability of legacy operating systems. The tool currently enables districts to take an inventory of the devices in their schools, including the types of devices and their operating systems.

The tool is already in the hands of many schools. In fact, the first window of data collection will close June 30. The consortia will use data from that first collection to inform what legacy operating systems will be allowable for the assessments. For example, if 80 percent of the devices in schools use Windows XP, the consortia would probably find a way to make sure it works for the assessments, even though Microsoft has stated it will stop supporting XP in early 2014. 

On the other hand, if only 10 percent of the devices have XP, the consortia probably will not allow it, as the lack of support, including security patches, could jeopardize not only the machines' operating systems, but also the integrity of the tests.

Are You Prepared?
At this point, it is unclear how many districts in the 40-plus affected states will use the Technology Readiness Tool, but my guess is that the vast majority will, and the consortia certainly hope that every district will. By late summer or early fall, the consortia will have had time to analyze the data from the tool and should be able to provide some guidelines for legacy operating systems.

The Technology Readiness Tool is also due for significant changes in how it is used. Starting with the fall 2012 data collection, and continuing with collections each fall and spring, the tool will shift from an inventory tool to a planning tool.

The consortia will enter the guidelines (for example, as mentioned above, a 1 GHz or faster processor and at least 1 GB of RAM) into the tool before districts and states upload their data. The tool will then compare the guidelines with the data and provide reports on the extent to which a school, district, or state is "ready" to administer the tests in each of four categories: devices, the ratio of devices to eligible students, bandwidth and networking, and personnel.

For example, if a school has 100 computers available for assessment, but 50 of them have 8-inch screens and thus don't meet the device guidelines, the school will be 50 percent ready for the assessments, at least as far as the devices category is concerned. The same school may have increased its bandwidth significantly because it adopted online textbooks and find that it exceeds the bandwidth guidelines. That would make it 100 percent ready for assessment in terms of bandwidth and networking.
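For illustration only, here is a small Python sketch of how a per-category readiness figure like the ones above might be computed. The simple percentage math is my assumption, not a description of how the Technology Readiness Tool actually calculates its reports; the categories echo the four listed earlier.

```python
# Illustrative sketch only: one simple way to express per-category readiness.
# The percentage math is an assumption, not the Technology Readiness Tool's
# actual method; the categories mirror the four described in the article.

def readiness_percent(items_meeting_guidelines: int, total_items: int) -> float:
    """Share of a school's inventory that meets the published minimums."""
    if total_items == 0:
        return 0.0
    return 100.0 * items_meeting_guidelines / total_items


# The worked example from the text: 100 assessment computers, 50 of which
# have screens too small to meet the device guidelines.
print(readiness_percent(50, 100))   # 50.0  -> "50 percent ready" on devices

# The same school's bandwidth already exceeds the (still-to-come) guideline,
# so every connection counts as meeting it.
print(readiness_percent(100, 100))  # 100.0 -> fully ready on bandwidth/networking
```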

The tool will be able to aggregate information up through the district to the state level, thus providing technology leaders and policymakers at every level with critical information about steps that still need to be taken to become technologically prepared for the new assessments.

The guidelines already issued by the consortia in April, along with the changes in the way the Technology Readiness Tool will be implemented beginning in fall 2012, will go a long way toward answering my friend's questions without the aid of adult beverages. However, there are still good questions left to answer--many of which have more to do with the human side and less to do with technology.

How does one assess students' answers to real-world problems? Multiple choice--for the most part the only option in wide use today--won't cut it. What can be done to make sure teachers are prepared to address the higher-order thinking skills the CCSS will focus on? These and other similarly knotty challenges still face us, and 2014 is not that far away.


Online Assessment's New Community

Looking to connect with other state and district technology leaders working through the same issues? assess4ed.net is an online community of practice intended to help states and districts make the shift to online and computer-based student assessment. It offers webinars, resources, discussions, chats, and other opportunities for communication about assessment, curriculum, and technology.

States and districts with significant experience as well as companies involved with statewide implementation of large-scale technology-based projects provide much of the content. At the same time, a number of states are starting their own groups in assess4ed to focus on issues specific to their states.
