
Getting On Board With Online Testing


Like most technology being adopted in education, computerized testing has the potential to improve individual student learning. That was Idaho’s goal when it selected a new Internet-enabled state test in spring 2002. The move represented years of study and a desire to meet the requirements of the No Child Left Behind (NCLB) Act of 2001. It also positions Idaho as the first state in the nation to embrace a technology-based testing system that measures academic growth. My district, Rigby, was one of a handful of districts that were early adopters of the computerized system.

Comparing Tests

Like virtually every state, Idaho relied on traditional paper-and-pencil standardized testing tools — the Iowa Tests of Basic Skills (ITBS) and the Test of Achievement and Proficiency (TAP) — as primary indicators of school performance. However, these types of tests do not adequately show academic growth; that is, whether a child is being effectively taught what he or she needs to grow and which instructional methods best achieve that growth. So, we had to find a new tool to address these things, especially considering that growth will likely be the most important measure in meeting the goals of the new education law. Also, since school districts in Idaho began taking advantage of technology long ago, we wanted to investigate whether we could use that technology for our testing.

To study the needs and identify potential solutions, the Idaho Department of Education formed an Assessment and Accountability Commission of five educators and five representatives of the state’s business community. The commission’s charter was to review and offer recommendations for statewide testing procedures that provided data to assess what’s being taught, as well as assure individual student learning and growth.

The commission repeatedly found teachers unhappy with traditional state-mandated tests, because these tests didn’t truly measure, report or track student growth; i.e., they didn’t provide much insight into how teachers could modify the curriculum to improve learning. In addition, parents wanted to see more information about the scholastic progress of their children. The business community registered a warning about the impact of a less than well-educated workforce on the state’s future economy. And administrators recognized the need to measure their districts’ testing, achievement and accountability processes against the intention of the NCLB Act.

Committed to total objectivity in its review, the commission gathered data from other states, reviewed best and worst practices, and held public meetings throughout the state. Dissatisfaction with traditional high-stakes testing ran high. According to Linda Clark, Ph.D., director of instruction for Joint School District No. 2 in Meridian, Idaho, “Comparing an individual’s progress with others’, as occurs with standardized tests, does not truly measure real growth.”

Untimely data was also a problem. Jerry Hutchins, Ph.D., director of technology, testing and database management in the Blaine County School District, says: “There was a lot of frustration with state-mandated tests occurring early in the school year, usually mid-October, with results not available until January. Not only did the test data fail to indicate what the students had learned during the year, but by the time a teacher received the results roughly half the year had passed, leaving minimal time to adjust instruction.”

Computerized Testing

The commission’s initial consensus — that testing should function to measure individual student growth and provide timely results — led the members to recommend a technology solution. We had always hoped that somewhere down the line we would be able to use computerized testing in our district, because when you have the hardware capability it creates tremendous benefits for testing programs. Through its research, the commission found that the system Rigby, Blaine, Meridian and other districts had already been using met all the identified needs. The system, Measures of Academic Progress (MAP), was developed by the Northwest Evaluation Association (NWEA), a nonprofit assessment organization based in Portland, Ore., with a 25-year history of successfully working with school districts nationwide.

Technology solutions that provide student assessment range from online coursework to stand-alone and network solutions. These solutions are used to test individuals and entire classes. Students can also take tests online via Internet-delivered systems, or through network-connected computers utilizing Internet-enabled tests, which provide all data and scoring through the link to the testing organization. Internet-delivered tests require continuous access from each student’s computer to the testing organization throughout the testing period. This makes quality Internet connections essential in order to maintain proper test speeds and graphical displays. It also requires that the students’ computers be updated regularly with the latest browser software.

Internet-enabled tests like MAP take away this requirement. With this alternative, the only computer that makes the connection to the testing organization to download and upload tests and results is the network administrator’s computer. Internet-enabled testing not only eliminates the shortcomings of standardized models, but also provides an effective way to gauge student progress, measure growth and monitor improvement consistently. For us, this option was a better choice than Internet-delivered testing, because it didn’t require updating every student’s computer.
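The practical difference between the two models comes down to how many machines must hold a live Internet connection during a session. A minimal sketch of that contrast (the function is purely illustrative; it is not NWEA's software):

```python
def remote_connections_needed(model: str, num_students: int) -> int:
    """Simultaneous Internet connections each delivery model requires.

    Illustrative only: an Internet-delivered test keeps every student's
    computer linked to the testing organization for the whole session,
    while an Internet-enabled test (like MAP) needs only the single link
    from the administrator's computer to download and upload tests.
    """
    if model == "internet-delivered":
        return num_students
    if model == "internet-enabled":
        return 1
    raise ValueError(f"unknown model: {model}")
```

For a 30-seat lab, the Internet-delivered model needs 30 reliable connections and 30 up-to-date browsers; the Internet-enabled model needs one connection, on the administrator's machine.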

A New Kind of Test

Measures of Academic Progress are electronically administered and scored achievement tests designed to measure growth for individual students, classrooms, schools and districts. While it is new to many Idaho districts that began using it last year, the MAP system has been used in Rigby, Blaine and other districts for two scholastic years. Blaine uses the system in all seven of its schools, including three elementary schools, a middle school, a high school, a K-12 school and an alternative school. Rigby uses it in all of its schools as well. Testing takes place in grades 2-10 in Blaine and in grades 3-8 in Rigby. Both districts were eager to adopt the technology prior to its selection as a state test.

The MAP system is unlike other tests in education because, while it includes the features and benefits of norm-referenced, criterion-referenced and computerized tests, it also offers more benefits and higher quality growth data. These are vital in classrooms faced with helping students meet rigorous standards and the challenges of the NCLB Act. The system is different from other tests in four primary areas:

  • It is appropriately challenging for 97% to 99% of students, including those in special education;
  • It measures individual student achievement;
  • It provides data that can be compared and analyzed across the full spectrum of learning; and
  • It more closely engages stakeholders in the education process.
Ready for the Test

In addition to wanting higher quality data to help improve learning, the Rigby and Blaine districts were well prepared for computerized testing. We have been using computers for about five years and currently have one computer for every five students. Also, we have good Internet connections, labs at each of our secondary schools and some computers in each classroom at the elementary level. In Blaine, the district was completely wired, with the schools connected through a WAN, making it more than ready for computerized testing. Superintendent Jim Lewis, Ed.D., says, “We could have used the paper-and-pencil test first, but since we were really moving into technology and getting labs and computers on every teacher’s desk, we decided to wait until the computerized version came out.”

The MAP system’s flexible design may be implemented in any school that uses the Windows or Macintosh platform. Hutchins says, “One of the unique things about our district is that we have a mix of hardware platforms — one of our elementary schools is 100% Macs, while the others use Windows-based platforms or have a mix of both Macs and PCs. That puts a different twist on requirements for testing software, but with NWEA it doesn’t matter what platform a school is using.”

Blaine offers testing in its computer labs for some of its schools, and in classrooms for others. The district uses wireless laptops to maintain computer access for classroom instruction, while the labs are used for testing. Rigby uses a similar setup. Its policy is to strive for consistency in setup within each building, though configurations vary from building to building. The system ensures that students are consistently challenged by the items, so they don’t grow frustrated by content that is too difficult or bored by content that is too easy.

Scale Leads to Quality Data

It is possible to create tests with different items that measure the same level of achievement, because MAP test items are referenced to the Rasch Unit (RIT) Scale. This scale is the most important difference between NWEA and other tests. It is an equal interval scale that measures a student’s academic growth similar to the way a yardstick measures physical growth. Use of this scale enables teachers to measure a student’s ability to handle specific subject matter, provides data that show where students rank relative to other students, and reveals how close they are to achieving milestones and goals.
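The RIT scale is built on the Rasch model from item response theory. In the standard Rasch model, the chance a student answers an item correctly depends only on the gap between the student's ability and the item's difficulty, both measured on the same equal-interval scale — which is what allows tests with different items to measure the same level of achievement. A minimal sketch of that general formula (the units-per-logit conversion factor here is an illustrative assumption; NWEA's exact scaling is not given in this article):

```python
import math

def rasch_p_correct(student_score: float, item_difficulty: float,
                    units_per_logit: float = 10.0) -> float:
    """Probability of a correct answer under the standard Rasch model.

    Both arguments sit on the same equal-interval scale; only their
    difference matters.  `units_per_logit` converts scale units to
    logits and is an illustrative assumption, not NWEA's published value.
    """
    gap = (student_score - item_difficulty) / units_per_logit
    return 1.0 / (1.0 + math.exp(-gap))
```

A student whose score equals an item's difficulty has an even chance of answering it correctly; harder items drop that probability below one-half, easier items raise it above.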

MAP data allows for in-depth analysis of students, classrooms, entire schools and districts, enabling decision making that serves to improve performance at any level. For example, Meridian is using the data to identify students who have already mastered grade-level knowledge, skills and understanding in order to move them to the next level of achievement. And Rigby is using the data for a variety of similar purposes, as well as for placement.

The system has met, if not exceeded, our expectations for data that demonstrate a student’s individual growth. It has also proven to be more motivating for students who prefer using computers over using a pencil and paper. Rigby teachers have had positive reactions to the MAP system, and have seen more interest and attention from students during testing sessions. Lewis says, “We’re able to analyze the strengths and weaknesses, not only of groups, but of individual students in specific subjects. Also, teachers are able to align their above-average, average and below-average students, so that they can identify what needs to be taught to each specific group.”

In Blaine, Rigby and other districts, teachers also use the data in parent-teacher conferences, making these events informed by data and focused on results. Reports, such as longitudinal reports, are supplied to parents to give them a full view of their child’s progress, including where the child is excelling and where he or she may need extra help.

Timeliness Enhances Usefulness

Equally important is the timeliness of test data, which gives teachers the ability to monitor student progress when it makes sense — at the beginning of the school year and near the end. Data from other tests are often not available until midyear or after students have left for the summer. The ability to get information back almost immediately after a student has finished the test enables educators to quickly look at scores, see whether students are on track to meet expectations for their grade (whether ahead or behind), and then tailor teaching to address those results.

Hutchins says, “Instead of the half-year time lag waiting for standardized test results, NWEA offers class reports within 72 hours of testing, and growth reports comparing fall and spring tests.” Students also have the option of printing their test results the moment they have completed their tests. This almost instant turnaround on scores and results provides a comprehensive assessment of student achievement levels, enabling teachers to respond to individual needs more quickly than before.
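A fall-to-spring growth report of the kind Hutchins describes reduces to a simple comparison once scores sit on an equal-interval scale. A sketch with hypothetical field names (student IDs mapped to scale scores):

```python
def growth_report(fall: dict, spring: dict):
    """Per-student fall-to-spring growth plus the class average.

    `fall` and `spring` map student IDs to scale scores; students who
    missed either test are skipped.  Field names are illustrative.
    """
    growth = {sid: spring[sid] - fall[sid] for sid in fall if sid in spring}
    average = sum(growth.values()) / len(growth) if growth else 0.0
    return growth, average
```

On an equal-interval scale the differences are directly comparable, so a class average of the per-student gains is a meaningful summary for a teacher or principal.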

Since MAP data monitors a child’s growth specific to what is being taught, and to expectations set by the district, teachers are able to use results to individualize instruction and as criteria for judging the strengths and weaknesses of class materials. Blaine’s goal is to use the spring data to align all students within their new classes each fall, then use the fall data to make any adjustments that will help students make sufficient growth and meet scholastic targets in the coming year.

NWEA designed the MAP system with the goal of providing high-quality, timely data that could be easily used by educators to improve instruction. Even though Idaho has made technology a requirement for recertification, an in-depth technological background is not a requirement to use MAP. The system is relatively simple; the district’s technology coordinator typically oversees its use in all schools. In addition, the district’s testing coordinator plays a significant role in maintaining test security, explaining results, and assisting teachers in understanding how classroom changes will lead to improved testing results.

Conclusions

A goal in the Rigby and Blaine districts for next year is to use MAP test scores to determine specific milestones that must be reached for a student to proceed to the next grade. This not only provides students with personal accountability for their learning, but also provides educators with an early warning indicator if a child is not meeting standards. At that point, teachers can offer the appropriate additional support to assure improvement in the required time frame.

Currently, some districts like Blaine continue to combine MAP testing with the ITBS, although that is no longer a state-mandated test. According to Lewis, the ITBS augments the MAP data, allowing them to continue to use ITBS to get the big picture. He says, “MAP is invaluable in its ability to give teachers an ongoing look at how their class is progressing, how effective their instruction has been and what, if any, changes should be made to refocus instruction for more effectiveness.”


Meeting NCLB Goals

The Northwest Evaluation Association tests have put Idaho in a good position to meet the goals of the NCLB Act. For example:

  • The law requires that the same assessments be used to measure the performance of all children. The NWEA assessments measure all students in the identified content standards of the state. Our state tests are designed with two parts: the first portion focuses specifically on the content standards of each grade, while the second adapts to better measure the child.
  • The law requires the tests to be aligned with our content and student academic achievement standards, as well as provide coherent information about student attainment of such standards; MAP is so aligned. In addition, grade-level benchmarks have been designed to ensure we are meeting our goals. The alignment of our assessment with state and local curriculum standards also assures that the tests measure the curriculum broadly and deeply.
  • Another requirement of the law is that the state educational agency must provide evidence from the test publisher to the U.S. Secretary of Education that the assessments used are of adequate technical quality for each purpose required. The NWEA tests have been validated and revalidated in a variety of school settings. In addition, the accuracy of the assessments for each individual student is quite high.

How MAP Works

MAP is typically administered to students twice a year — in the fall and spring — for math, reading and language arts. Science tests are also available, and are expected to be adopted by the Idaho districts in the near future. Some districts also use the tests for midyear evaluations. Once the test administrator has linked to NWEA via the Internet and downloaded student data from previous tests, the testing session is ready to begin. The network administrator hosts the session, eliminating the need for separate software setups on every student computer. In addition, features like spell checkers or calculators are only available where appropriate.

During the session, MAP automatically adapts the level of difficulty to each student based on how he or she responds, beginning with the first several items. This means that while the number of items is the same for all students, each sees different items. If a question is answered correctly, subsequent questions become gradually more challenging. If, however, a question is answered incorrectly, easier questions follow. During a subsequent testing session, the system remembers where the student left off and presents items at that level. This adaptivity ensures that each student receives a test that is appropriately challenging, yet fair.
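The adaptive behavior described above can be sketched as a simple loop. The fixed step size and update rule here are illustrative assumptions, not NWEA's actual item-selection algorithm:

```python
def adaptive_session(answer_item, start_level: float,
                     num_items: int, step: float = 1.0):
    """Minimal sketch of an adaptive testing session.

    `answer_item(level)` presents an item at the given difficulty and
    returns True for a correct answer.  Difficulty moves up after each
    correct answer and down after each incorrect one, and the final
    level seeds the student's next session.  The step size and update
    rule are illustrative, not NWEA's algorithm.
    """
    level = start_level
    history = []
    for _ in range(num_items):
        correct = answer_item(level)
        history.append((level, correct))
        level += step if correct else -step
    return level, history
```

Every student answers the same number of items, but the difficulty path — and therefore the set of items seen — differs per student, which is exactly the behavior the paragraph above describes.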


6 Steps to Online Success

By Jim Lewis, Ed.D., Superintendent, Blaine County, Idaho

As a person actively involved in researching and implementing an Internet-enabled testing system, I’ve learned how to get a computerized testing system up and running successfully in a district. Here are the top suggestions from the Blaine and Rigby districts to help you expedite the process:

1. Develop and commit to a plan. First, make sure the test you choose to implement complements the long-range vision for your school district. Also, ensure that it effectively functions to provide the data needed to improve instruction. To assure this goal, you’ll need to assign an individual, most likely a principal, to take responsibility for testing, data collection and follow through; i.e., making sure the right questions are asked, that you retrieve the kind of data needed to inform instruction, and that you know what to do with that information to meet your goal. After all, the principal is the person who authorizes the time needed for testing, data gathering and evaluation of results, as well as for holding teachers accountable for using that data to improve classroom practices.

2. Leverage the power of technology. While a district may be completely wired in terms of having connections between all the schools, it’s necessary to continually improve the quality of the network and what’s available within each classroom (e.g., experimenting with the use of wireless laptops in the classroom for testing).

3. Don’t be “technophobic.” Although the idea of computerized testing might seem to require extensive technology expertise, teachers in Rigby and Blaine County districts have found that’s not the case. As in almost every district these days, our schools have tech-support personnel who handle downloading and uploading test data, moving computers around and ensuring that the system is working correctly. Most of our teachers aren’t particularly computer-savvy, and with the MAP system, they really don’t have to be. NWEA provides strong tech support, so you can run the MAP system without the need for extensive technology expertise.

4. Do your homework. A major key to the success of the testing program is use of NWEA’s MAP Coordinator’s Handbook, a comprehensive manual that covers every element involved in implementing the program.

5. Training is key. Obviously, when a test program is first initiated, most of the training will involve technical matters: familiarizing the staff with the computers and interfaces with your test organization. But training doesn’t stop there. In addition to conducting training sessions for administrators, it is essential to provide training for every person involved in the process, including counselors, tech staff and test proctors, to ensure that every school is tested the same way, and on schedule. It’s also crucial that training include how to properly use the data that’s retrieved.

6. Test the tester, pretest the kids. It’s vitally important for every teacher to take a sample test so that they understand what their students are being exposed to. It’s also important, especially for younger children, to be allowed to take a practice run to become familiar with the test format or the process.

This article originally appeared in the 01/01/2003 issue of THE Journal.

