Reassessing the Assessment of Distance Education Courses


 A Proposal for an Alternative Framework in Higher Education

 

Institutions that offer distance education programs to adult learners in any form -- from correspondence to Web-based courses -- have been honor bound to establish that such courses provide student learning and content equivalent to that found with campus-based instruction. Often, distance education programs that are in their initial or pilot stages must demonstrate equivalency of learning outcomes in order to gain a more permanent position on campus, and evidence is typically gathered through assessment.

Assessments and evaluations conducted over the past 15 years have attempted to shed light on the academic rigor of distance education courses by considering factors such as the nature of the mediating technology, different instructional approaches, and course content. Two characteristics can be discerned amid the wealth of investigations that have compared distance education with traditional instruction: an emphasis on student-based data and a preoccupation with examining student performance during the semester in which the distance education course was taken.

Assessing the Situation

The reliance on student-centered data is readily understood. Student data, specifically grades and demographics, serve as an accepted 'currency' in higher education. Distance education supporters recognize the value and necessity of employing the same yardstick to establish the legitimacy of their programs. The relationship between student grades and performance in distance education and traditional courses has undergone considerable scrutiny, with reassuring results. A number of investigators have reported the benign and consistent finding that there is 'no significant difference' in student learning associated with the course setting. Welcome news for distance education proponents.

Attention also has been paid to demographic information. Not all students participate or perform equally well in distance education settings, and high attrition rates have very real financial ramifications for institutions. Attending to students' individual background and experiences, it is thought, provides institutions, distance education programs, and faculty members with insight concerning why some students succeed while others fail to complete their coursework. Case studies, evaluations, and dissertation research have found that distance education students look a little different than their counterparts in traditional settings. Compared with the students in traditional classrooms, distance education students tend to be older and have more professional experience. They often have families and careers to juggle, and benefit from having an internal locus of control for learning.

Student performance in the here and now of a distance education course is typically at the center of assessment. That is, investigators usually attend to how well students score on tests, exams, and assignments within the context of the distance education course itself. If comparisons with student performance in traditional classroom settings are to be made, they generally involve courses taken contemporaneously with the distance education course. In this way, assessments of distance education programs have amassed results that resemble a series of snapshots looking at the program and student performance on a semester-by-semester basis.

Widening the Scope

Using student-level data within a particular time frame, institutions, distance education programs, and individual faculty have created a detailed portrait of distance education students and have established the comparability of student learning between distance education and traditional settings. This is a good beginning. Now that institutions have overcome the initial hurdles of establishing their first-generation distance education programs, the need arises for more elaborate, action-oriented information.

Focusing on student-level data tells only a limited tale. For example, generating a profile of the 'successful' distance education student does not provide institutions with practical information for program improvement or refinement. What can an institution do with this piece of data? It is anathema in higher education to deny students entry into a course based on their demographic profile. Indeed, pushing the envelope of students' abilities is at the core of instruction. Nor does the information really help individual faculty members interested in improving distance education students' performance. Faculty members simply do not have the power to age a student five years, produce several offspring for them, or boost their G.P.A. half a point. What else can be done to provide institutions and faculty members with action-oriented information about the quality of instruction in distance education programs compared with instruction in on-campus classes?

We propose a two-pronged shift in distance education investigations. The first shift removes the emphasis on distance education students and places it on the course itself. The second expands the scope of investigations to include distance education students' subsequent performance in other classes. Using these parameters, such an assessment would question how well distance education courses prepare students for further study. Moreover, such an approach would allow institutions to compare student preparation in distance education settings versus their preparation in traditional education settings. This is very useful information for institutions that are expanding their distance education offerings. This assessment model has been used successfully to consider transfer between community colleges and four-year institutions (Quanty et al., 1998).

To clarify our meaning, we will summarize some research we have conducted at Christopher Newport University (CNU). Although the data we draw upon are derived from CNU's online education program, the approach we suggest applies equally to any form of distance education technology.

 

Case Study: How well does online instruction prepare students for subsequent study?

Christopher Newport University (CNU) has offered online courses since the early 1990s, and therefore has the luxury of drawing upon the records of hundreds of students who have taken online courses over the years. It is important to note that, as at other institutions, there really is not a distinct 'online student' group at CNU. Instead, online students at CNU are subsumed in the larger, traditional student population, and most take a combination of online and traditional courses.

Every online course at Christopher Newport University has an on-campus counterpart, which offers the opportunity for comparison. Many of our online courses are 100- and 200-level courses that act as prerequisites for more advanced courses, some of which are also offered online. Using this information, we can describe four pathways to enrollment in advanced courses that are available to CNU students.

Students enrolled in traditionally offered advanced courses may have taken the prerequisite course either through traditional means or through the online program.1 Alternatively (but, in practice, much less frequently), students can enroll in an online version of an advanced course after having taken the prerequisite in either an online or a traditional form. In our research, we focused our attention on the former scenario, and considered how well students perform in a traditionally offered advanced course based on the kind of prerequisite involved. A single question guided our study: Do online courses prepare students for advanced study as well as traditionally accepted forms of prerequisites?

To answer our question, we began by examining the enrollment records from six departments at CNU that offer a majority of the lower-level online courses. We reviewed a total of nine courses, each offered between one and four times from the Fall 1994 through the Spring 1998 semesters. We then traced each online student's course of study after the lower-level online course to see whether they went on to enroll in a traditional advanced course. Using this approach, we located a total of 44 enrollments in which the online course acted as a prerequisite for a traditionally offered advanced course.
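To make the record-tracing step more concrete, the sketch below shows one way such a search could be run against a flat file of enrollment records. It is only an illustration of the bookkeeping involved, not CNU's actual system; the column names, course numbers, prerequisite map, and term encoding are hypothetical placeholders.

```python
# Minimal sketch: finding enrollments where an online lower-level course
# served as the prerequisite for a traditionally offered advanced course.
# All column names, course numbers, and the prerequisite map below are
# hypothetical placeholders, not CNU's actual records.
import pandas as pd

# Hypothetical map: advanced course -> its lower-level prerequisite
PREREQS = {"ADV 301": "INTRO 101"}

def find_online_prereq_enrollments(records: pd.DataFrame) -> pd.DataFrame:
    """Return traditional advanced-course enrollments whose prerequisite was taken online.

    `records` holds one row per enrollment with columns: student_id, course,
    format ("online" or "traditional"), term (a sortable integer such as
    199408 for Fall 1994), and grade.
    """
    cases = []
    for advanced, prereq in PREREQS.items():
        # Traditional sections of the advanced course
        adv = records[(records["course"] == advanced) & (records["format"] == "traditional")]
        # Online sections of its prerequisite
        pre = records[(records["course"] == prereq) & (records["format"] == "online")]
        # Match on student and keep only cases where the prerequisite came first
        merged = adv.merge(pre, on="student_id", suffixes=("_adv", "_pre"))
        cases.append(merged[merged["term_pre"] < merged["term_adv"]])
    return pd.concat(cases, ignore_index=True)

# Toy example: one student took the prerequisite online, then the advanced
# course in a traditional section.
records = pd.DataFrame([
    {"student_id": 1, "course": "INTRO 101", "format": "online",      "term": 199408, "grade": "B"},
    {"student_id": 1, "course": "ADV 301",   "format": "traditional", "term": 199501, "grade": "A"},
])
print(find_online_prereq_enrollments(records))
```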

We defined student success in the advanced course as obtaining a final grade of C or higher. To determine whether the online courses had prepared students as well as the traditional prerequisite courses, we compared the final grades of the 44 enrollments with the grades of their classmates in the advanced courses (Table 1).

Using Fisher's Exact test of significance, we found that there was no statistically significant difference in the students' final grades. Based on this finding, we conclude that the online courses included in the sample prepared students for advanced study at least as well as the traditionally accepted forms of prerequisites at CNU.
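For readers who wish to reproduce this kind of analysis, the sketch below applies Fisher's Exact test to a 2x2 table of the sort described above, using SciPy. The counts shown are placeholders only; they stand in for the actual tallies in Table 1 and should be replaced with an institution's own figures.

```python
# Minimal sketch of the significance test described above.
# The counts are illustrative placeholders, NOT the actual Table 1 data.
from scipy.stats import fisher_exact

# Rows: prerequisite pathway; columns: final grade of C or higher ("success")
# versus below C in the traditionally offered advanced course.
table = [
    [35, 9],    # online prerequisite:      successes, non-successes (placeholder)
    [260, 75],  # traditional prerequisite: successes, non-successes (placeholder)
]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")

# A p-value above the chosen significance level (e.g., 0.05) would be read as
# no detectable difference in preparation between the two pathways.
```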

Ramifications of the Assessment Approach

What is immediately apparent from the approach we employed is the practical information it provides. If a difference in student performance had been found that related to the nature of the prerequisite, institutions would be able to take a number of steps. For example, if it were shown that students taking online courses as prerequisites were at a disadvantage, then departments and faculty members could reconsider the format of online instruction, the course content, or the instructor's approach. They could apply similar scrutiny if it turned out that students with traditionally accepted forms of prerequisites were the ones at a disadvantage. Even when no statistically significant differences are detected, institutions can still act on the information. For example, since it appears that online students are not at a disadvantage when it comes to more advanced study, departments that have been hesitant to develop online courses may choose to begin offering them.

Reviewing the Assessment of Distance Education Courses

We have described a new, parsimonious model that investigators interested in distance education can use to ask meaningful questions about the relative quality of distance education courses. The approach we suggest removes the emphasis from student-level data and places it squarely on the course itself. Admittedly, it might be a while before institutions can employ this model. Although we considered hundreds of student records over a four-year period, we could only locate 44 enrollments that followed the sequence in question. We are optimistic, however, that if institutions pool their data (which are easily gathered and analyzed), a more complete understanding of distance education courses can emerge. As each semester passes, we will continue to add more cases to our study and look forward to contributing to an extended dialogue on the important topic of assessing the quality of distance education.

 

Paula Szulc Dominguez, Ed.D. manages the online education program at Christopher Newport University. Dr. Dominguez recently received her doctorate in Human Development and Psychology from the Harvard Graduate School of Education, where her research focused on social cognition and instructional technologies. In addition to her continuing efforts in distance education research and evaluation, Dr. Dominguez works in the field of telemedicine. E-mail: [email protected]

 

Dennis R. Ridley, Ph.D. is the Director of Institutional Research and Planning at Virginia Wesleyan College. He has acted as an officer in the Virginia Assessment Group and is a frequent contributor to professional literature in psychology and higher education assessment. E-mail:

 

 

References

Quanty, M.B., Dixon, R.W. and Ridley, D. 1998. 'Community college strategies: A new paradigm for evaluating transfer success.' Assessment Update, vol. 10, no. 2, March/April.

 

1 For the purpose of this study, "traditional" forms of prerequisites included CNU on-campus courses, courses transferred in from another college or a community college, or credit given for performance on an examination.
