
Internet-Based Testing: A Vision or Reality?

ELEANOR BICANICH, Director, Vocational Education Services Center
Pennsylvania State University at McKeesport
McKeesport, Pa.

DR. THOMAS SLIVINSKI, President
WebTester, Inc.
Selinsgrove, Pa.

DR. SUSAN B. HARDWICKE, President
EDUTEST, Inc.
Richmond, Va.

DR. JEROME T. KAPES, Professor of Educational Psychology
Texas A&M University
College Station, Texas

The 18,000 school districts nationwide seem to be scrambling in a mad dash to acquire Internet capability in the classroom or to build an Internet curriculum. Far fewer attempts are underway to harness the power of the Internet for addressing critical issues in education. As the Internet is increasingly deployed throughout the nation's schools, state Departments of Education are finding ways to take advantage of the technology to save resources. One of the applications that states should consider is end-of-course testing for vocational-technical programs.

While the advantages of an Internet approach over the traditional paper-and-pencil approach seem obvious -- immediate results, aggregation and immediate analysis of data, and reduced costs -- the real potential must be identified and quantified. Further, a number of critical issues need to be addressed, such as:

  • What are the differences, if any, in performance and attitudes toward Internet tests compared to paper-and-pencil tests?
  • How can privacy and security be assured?
  • Does Internet delivery of testing place any subgroup at a disadvantage?
  • What will it take operationally for the state to pursue and successfully implement Internet-based testing?

Overview of the Pa. Pilot Project

This article reports on a pilot project conducted statewide in Pennsylvania. Nearly 400 students from 14 Pennsylvania vocational-technical institutions participated. These institutions include a mix of secondary vocational/technical and comprehensive schools and postsecondary institutions. Internet-based testing software was provided by WebTester, Inc.

Tests and diagnostic instruments consisted of multiple-choice questions selected from item banks developed by the Vocational Technical Educational Consortium of States (VTECS). VTECS is a consortium of 23 state organizations that provides member states with competency-based, vocational-technical outcome standards, curriculum resources and assessment vehicles. These tests were already in electronic form and easily incorporated into the WebTester architecture. Depending on the occupational specialty, the test for that specialty had as few as 57 or as many as 107 questions. No passing score was defined.

The project had two phases. In the first phase, approximately 160 students voluntarily logged onto the Internet to complete randomly generated sets of VTECS test items for self-assessment and instructional purposes. This phase was known as the "Diagnostic Only" phase.
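The "randomly generated sets" of items described above can be sketched as a simple draw from an item bank. This is an illustrative model only: the bank contents, set size, and function name are invented for this sketch, not taken from the WebTester software.

```python
# Illustrative sketch of drawing a randomized item set from a specialty item
# bank for one diagnostic session. All names and sizes are hypothetical.
import random

def draw_item_set(item_bank, n_items, seed=None):
    """Draw a non-repeating random set of questions for one session."""
    rng = random.Random(seed)
    return rng.sample(item_bank, n_items)  # sample without replacement

bank = [f"VTECS-{i:03d}" for i in range(1, 108)]  # e.g. a 107-item bank
item_set = draw_item_set(bank, 25, seed=1)
```

Sampling without replacement ensures a student never sees the same question twice within one generated set.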

The second phase was a test-retest equivalency design with control groups. Approximately 360 students received both paper-and-pencil and Internet-delivered item sets as end-of-course achievement tests.

Demographic information and survey data were collected in both phases. Vocational/occupational specialties selected for the study were Computer Repair, Computer Specialty, Child Care and Guidance, and Autobody Repair.

The Internet Testing Process, in Brief

Figure 1 shows the Internet testing process. Participants logged onto the Web site using passwords. The system at the Web site verified the password and generated the test instrument, which was then downloaded to each participant's computer. Questions, in multiple-choice format, were answered using a mouse and the point-and-click method. To score their tests, participants clicked on a "button" displayed onscreen. The test and answers were then uploaded to the WebTester server, which immediately scored the answers and downloaded the results. This included an analysis of questions that were answered correctly or incorrectly.

Figure 1

Students were requested to print out their test results for review by their instructors; no other access to participant scores was provided. Questionnaires were also administered via Internet before the session ended.
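The server-side scoring step described above -- answers uploaded, scored immediately, and returned with a per-question analysis -- can be sketched as follows. The function and field names are hypothetical illustrations, not the actual WebTester implementation.

```python
# Hypothetical sketch of the immediate-scoring step described in the text.
# Names (score_test, score_pct, item_results) are invented for illustration.

def score_test(answer_key, responses):
    """Score a multiple-choice test and mark each item correct/incorrect."""
    item_results = {
        item: responses.get(item) == correct
        for item, correct in answer_key.items()
    }
    n_correct = sum(item_results.values())
    return {
        "score_pct": round(100 * n_correct / len(answer_key), 1),
        "item_results": item_results,  # per-question analysis for the student
    }

key = {"Q1": "B", "Q2": "D", "Q3": "A"}
answers = {"Q1": "B", "Q2": "C", "Q3": "A"}
result = score_test(key, answers)
```

Because scoring is a pure function of the answer key and the uploaded responses, results can be returned in the same session, which is the immediacy advantage the article emphasizes.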

Student privacy was maintained through the use of special student passwords and student ID numbers. No student names were included. The student IDs were known only to the home school. Using this rather simple measure, student privacy was ensured.

Test security was assured through configuration management and controlled loading and residency. Tests were loaded and online only during test periods. All accesses were recorded and traceable to the local user.

Study Findings

A number of useful and significant findings emerged from the study. They have implications for the use of, and transition toward, Internet-delivered tests.

Internet Delivery. Internet-based delivery of a test does not affect student performance. The Internet-delivered and paper-and-pencil versions of the tests were shown to be equivalent in the test-retest design. Additionally, Internet delivery of tests introduced no bias relative to gender or special educational needs (economic/educational disadvantaged or disability).

Student Preference and Attitudes. Students preferred Internet delivery to paper-and-pencil versions by a 3-to-1 margin. This finding is consistent with the use of the Internet for practice and self-assessment. In the diagnostic phase, approximately 50% of the students voluntarily practiced with the diagnostic tests, indicating that if practice is offered via the Internet, students will use it. Additional studies can be conducted to determine how much this practice improves learning and performance on tests.

The participants were relatively inexperienced with the Internet. Overall, 68.5% of participants had little or no prior Internet experience. Even so, participants did not have significant problems accessing the Web site, reading test questions on the computer screen, using a mouse to answer questions, or receiving their test scores. Students with greater computer literacy preferred the Internet testing more than students with little experience on computers, but both groups rated the Internet experience as positive.

Test Administrator Time. A questionnaire was also filled out by Internet test administrators at the end of each Internet session. Test administrators reported that the Internet-based testing required somewhat less preparation time, effort and class time, as well as substantially less effort for data analysis.

Internet Operational Problems and Results

Operationally, the project considered several institutional factors relative to Internet-based testing. These include: the organization of Internet support within the institution, the institution's Internet supplier, and the relative capability of the institution's Internet computers.

Most schools have followed one of three models in designing their Internet access -- a classroom of computers linked by a network to the Internet via a server; networked single computers linked by a LAN or other network in multiple classrooms; and single dial-up computers linked via phone lines to the Internet without an internal network. Clearly, the schools with classroom-style Internet configurations were able to process tests with much less administrative time than schools using other implementation models.

Internet access was provided to schools either through a local university, local commercial suppliers or the state/local government. The Internet is a dynamic growth area. Even large, sophisticated firms such as America Online (AOL) and Microsoft (MSN) have experienced significant growth problems. This project experienced significantly greater problems with schools whose Internet service was provided by governmental organizations than those serviced by commercial companies.

Most participating schools were equipped with early Pentium or 486-compatible hardware to access the Internet. These systems are now considered small-to-medium sized. Advances, such as Java, audio, video and enriched browsers, typically are configured for larger, more modern computers. The software used in this project was configured to operate with older systems and browsers under Netscape Navigator 2.0 or Microsoft Explorer 3.0 on an Intel 486 with 4MB of RAM or more.

Only a small percentage of tests were aborted due to external problems. Reasons ranged from Internet transmission problems to network problems at the schools. In one school for example, in the first session of testing, virtually none of the student tests were scored. These errors were traced to problems in the school's network. Only one significant Internet outage was experienced. The Internet service provider for one school was not operational during one morning of testing, so tests were rescheduled for later that day.

Communications and Teacher Training

E-mail links with each of the schools were established to make maximum use of e-mail and Internet-based training on test procedures and data collection. The aim was to limit costs and maximize utility of the Internet. However, with a few exceptions, teachers did not use these Internet training and communications facilities. Site visits and personal contacts were therefore needed.

Many administrative participants did not effectively utilize the online tutorials for either initial familiarization with the project or as reference material for problem solving. Most problems occurring during the scheduled test sessions resulted from an inability, or unwillingness, to access these online tutorials or to carefully follow the directions. Problems also occurred in test administration because teachers did not follow directions. Additionally, at least two test administrators did not grasp the approach for repeated measures testing. However, test administrator performance and overall Internet testing performance did improve during the second Internet testing session.

Internet-Based Testing Costs

Analyses focused on the cost differences between Internet-based testing and paper-and-pencil testing. The study assumed that no additional local site information systems resources (hardware, software or Internet access) would be allocated in order to implement Internet-based testing.

Non-recurring costs for modifying existing tests for Internet delivery and developing test templates were approximately $1,500 to $2,500 per test, based on a test of 100 items and excluding graphics. Recurring costs for disseminating, scoring and presenting results were $2 per participant; additional reports and analyses were $.50 per participant.

If one assumes an average cost per test for paper-and-pencil dissemination, administration and scoring, cost savings would be expected to accrue from Internet-based testing after 375 participants.
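The break-even figure above can be reproduced with a simple cost model. Note that the paper-and-pencil cost of $6 per test is an assumption chosen so the model matches the reported 375-participant break-even; the article does not state that figure.

```python
# Simple break-even model for the costs reported above. The $6 paper cost per
# participant is an assumption, not a figure from the study.

def break_even(fixed_internet_cost, internet_per_student, paper_per_student):
    """Participants needed before Internet testing becomes cheaper than paper."""
    savings_per_student = paper_per_student - internet_per_student
    return fixed_internet_cost / savings_per_student

# $1,500 one-time conversion cost, $2 recurring vs. an assumed $6 paper cost:
n = break_even(fixed_internet_cost=1500, internet_per_student=2.0,
               paper_per_student=6.0)
```

At the high end of the conversion range ($2,500), the same model yields a break-even of 625 participants, so the 375 figure corresponds to the low end.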

Conclusions and Discussion

This pilot project in Pennsylvania demonstrated that the Internet offers a viable, cost-effective alternative to paper-and-pencil testing. Internet reliability and performance are sufficient to support testing requirements for a large group of students concurrently.

The equivalency of the tests and the absence of bias associated with Internet-based delivery also suggest that school systems can transition to Internet-based delivery while using both types of administration. This should prove to be an important benefit as schools install additional computers and increase their access to Internet technology. Figure 2 summarizes the lessons learned from this project.
 

1. Each state needs an overall plan for implementing the Internet in local school districts, including hardware, software, communications, training, funding and support. 

2. Schools should implement Internet-wired classrooms/labs with multiple Internet workstations. 

3. While the computers used for Internet access do not have to be the latest or greatest, they must meet minimum hardware/software capabilities -- preferably Pentiums (or equivalent) with 16MB of RAM. 

4. The Internet is not an effective tool today to communicate with teachers. Teachers are not trained to routinely check their E-mail or to look to the Internet itself for training or instructional material. 

5. Key to Internet testing is the availability and validity of tests. 

6. Test administrator and teacher support is critical. Internet training is vital and Internet capabilities must be integrated with other curricula in teacher-preparation institutions. 

7. Internet testing severely strains the technical and management capabilities of a school's Internet resources. Pre-test trials at projected data volumes are needed to uncover hidden problems in each school's configurations. 

8. Talent to address technical support problems and issues is scarce and expensive. Technical support on LANs, Internet, Internet services and software is needed at local level. This can best be provided on a regional or area basis. 

Figure 2: Lessons Learned

The Internet offers advantages over other test-delivery systems, in that results can be aggregated as test sessions occur and summarized immediately. Also, modifications to instruments, item banks and procedures can be accomplished immediately and inexpensively.

The availability of immediate results, students' preference for Internet-delivered testing and their voluntary use of the Internet for practice, indicate that the Internet holds potential for educational benefit beyond cost savings. Early knowledge of results enables individuals and school systems to respond more rapidly and to address or prevent problems in a timely manner. Students' preference for the Internet can be used for improved learning and test-taking practices.

While the Internet may pose some concern about security, given proper safeguards and controls, Internet-based testing is no less secure than traditional testing methods. And student privacy can be safeguarded with relatively simple student identification procedures, such as those implemented in this study.

Although it is technically feasible to implement a large, multi-school Internet testing program, it may not be practical to do so at this time. This is based on our findings that most Internet usage in public schools today is aimed at ad hoc research or educational entertainment. It is not structured or coordinated between sites, nor managed with the required discipline. Large differences exist in support from regional Internet suppliers, in hardware/software configurations, and in site staffs' Internet expertise and expectations.

To implement a statewide Internet-based testing program, each school must have access to one or more Internet facilities with capacity to accommodate all of the students in one course in two consecutive sessions in one day. Two sessions in one day can be used if the students are separated into groups and kept apart until all complete the testing.

Lack of training and familiarity with the Internet on the part of teachers increases the cost and complexity of implementing an Internet-based testing program. This deficiency does not prevent effective deployment, but does increase the need for site visits, test administrator training and pre-testing trial sessions.

If the government is to provide Internet services directly to schools, then such services must be configured with the flexibility and management/technical skills needed to successfully support schools over the intermediate term. Technical support must extend beyond the boundary of school hours, to ensure integrity and availability of communications during school hours. And of course, the client schools must be treated as "prized customers."

Funding for this project was provided by the State of Pennsylvania, Department of Education, Bureau of Vocational-Technical Education, under the Carl D. Perkins Vocational and Applied Technology Education Act (P.L. 101-392, Title II-A, State Leadership Activity).

Eleanor Bicanich received her bachelor's degree from Indiana State Teachers College (Indiana University of Pennsylvania), a master's in education from the University of Pittsburgh and has completed doctoral work at the University of Pittsburgh. Since 1990, she has been Director of the Vocational Education Services Center at Penn State McKeesport. She has held numerous state and national offices in vocational special needs, including President of the National Association of Vocational Education Special Needs Personnel. E-mail: eeb6@psu.edu

Thomas Slivinski received his master's and doctoral degrees in Computer Sciences from University of Illinois. He has held significant management and technical positions in hi-tech organizations including IBM Research, Office of the Chief of Staff of the U.S. Army and Mandex, Inc. He is currently consultant to educational organizations on distance learning and the application of technology to rural vocational-technical education. E-mail: slivinski@webtester.com

Susan Hardwicke received her master's and doctoral degrees from George Washington University. Author of many papers on computerized testing and a book on reengineering, Hardwicke is currently president of EDUTEST, Inc., an online educational assessment company. E-mail: hardwicke@edutest.com

Jerome Kapes received masters and doctoral degrees from Penn State University. He has held faculty positions at Penn State, Lehigh and Texas A&M University where he currently is Professor of Educational Psychology and Distinguished Research Scholar. He has authored or co-authored numerous papers, articles, books and monographs, and is the lead editor of A Counselor's Guide to Career Assessment Instruments. E-mail: JKapes@tamu.edu

Products Mentioned in this Article:
WebTester - Internet testing software; WebTester, Inc. Selinsgrove, PA, (888)-924-8378, www.webtester.com.

This article originally appeared in the 09/01/1997 issue of THE Journal.
