Technology-Rich Standards-Based Statistics: Improving Introductory Statistics at the College Level


Reasons for universities to teach with a standards-based approach are increasing. High schools in many states, often by decree of the state legislature, have already gone to a system in which curricula are rewritten to align student outcomes, instructional delivery and assessment. Typically, specific goals of content standards are broken down into several indicators: objectives that can be measured by a performance-based assessment. The assessment is scored by a rubric, which specifies performance levels for each of the pre-specified criteria in a meaningful way (as opposed to a letter grade, which could vary greatly from teacher to teacher).

In 1993, the Colorado General Assembly enacted HB93-1313, requiring K-12 public schools to develop content standards and performance-based assessments by 1997. These students therefore expect more compatibility with universities, both in how admission criteria will be established and in how assessment is handled in university classes themselves. (A traditional “index” may no longer cover all schools, especially those with more integrated or interdisciplinary curricula.) This is even more critical for universities that serve preservice teachers, since those students will need to experience and have modeled the very structure they will be implementing when they graduate.

There is a great deal of research literature on technology in the mathematics classroom, from Heid[1] and others, and some scattered literature on standards-based education, for example from Loacker and Mentkowski[2]. There do not seem to be any research studies specifically involving the intersection of these two domains, however. Robson is an example of an emerging expository literature in this intersection that comes from the first states to mandate standards-based education[3]. Robson’s focus is on the systemic statewide educational community, while the author focused on a specific course.

The author was one of twelve faculty from a variety of curriculum areas selected to be part of a team in the first year of the university’s Educational Technology Improvement Project, funded by the Colorado Commission on Higher Education. The author redesigned the introductory (non-calculus-based) statistics course that serves both as a general education requirement and as a course specifically required by a variety of majors at the university. In this quasi-experimental study, data about the course were collected and analyzed by an independent project evaluator, by the author and by mathematics education doctoral students. The research discussed in this article is actually part of a larger study examining student preconceptions and attitudes in introductory statistics. This article focuses on possible interrelationships and effects of technology and a standards-based approach.

The Department of Mathematical Sciences typically offers about six 40-student sections of introductory statistics each semester, and before this project the department had not consistently or thoroughly integrated a technology-rich standards-based approach into the course. During the spring 1997 semester, the author taught one section of the redesigned version of the course while other instructors continued to teach in ways that were more traditional.

The three instructors who taught the sections from which data were collected were all experienced instructors with a history of solid teaching evaluations. (Encouraged by the results obtained, many of next year’s instructors will be incorporating or utilizing parts of the author’s approach and resources.) Students did not know at the time of registration whether or not their section was the “treatment” section. Various quantitative and qualitative data were collected from the treatment section and two of the “traditional” sections.

The Content Standards

With feedback from other Project members and the Project Director, the author developed the following two standards to drive the course and its assessments:

1) Students will be able to critically evaluate statistics in the media and in their major field of interest.

2) Students will be able to plan, implement and communicate the results of a real-world data analysis to investigate an appropriate hypothesis they have generated.

These standards turned out to be very similar to the ones recently articulated by Gal et al.[4] Since the material in the introductory course is very similar to what would be covered in a high school statistics course, it is also relevant to examine connections to standards for high schools. For example, the second standard is similar to the third of Colorado’s Model Mathematics Content Standards (http://www.cde.stat.co.us/math.htm) and to the National Council of Teachers of Mathematics’ call for students to be able to “design a statistical experiment to study a problem, conduct the experiment and interpret and communicate the outcomes”.[5] The NCTM also supports the first standard by various calls for students to have the tools, for example, “for rejecting such television advertising claims as one that portrays a series of people choosing the same commercial toothpaste.”

Opportunities related to Standard #1 included weekly class discussion/critique of recent statistics from local media as well as each individual doing Project #1. Various labs and other assignments gave students practice with “pieces” of Standard #2, and Project #2 put those pieces together. Both Projects gave students the opportunity to be analytical and creative and to gain a greater sense of ownership and relevancy of their knowledge by being able to tailor the assignments to their interests. This sentiment was echoed by 25 of 28 students (89.3%) surveyed by independent ETIP project evaluator Dr. Rose Shaw.

Breaking from Tradition

In practice, however, few statistics courses appear to aim for both standards, often viewing the first one as “off the subject” and viewing the second one as impossible to do on a realistic scale within the structure of available time and resources. Although things seem to be slowly changing, the status quo still seems consistent with Shaughnessy’s observation: “Most of the courses in probability and statistics that are offered at the university level continue to be either rule-bound recipe-type courses for calculating statistics, or overly mathematized introductions to statistical probability that were the norm a decade ago.”[6]

On a Web site (http://etip.unco.edu/lesser/home.htm) the author created/compiled (using Netscape Navigator Gold 3.0) the standards as well as indicators for each standard, serving as examples of ways students could demonstrate attainment of each standard. The four labs, two projects and final exam were specifically described in terms of which standard(s) they were geared toward, forcing greater clarity in both the instructor’s and the students’ minds about why they were doing what they were doing. The projects, and to a lesser extent the labs, required self-contained writeups from an open-ended task, allowing students to choose directions or topics meaningful to them. While the book’s chapters were still “covered” in sequence, time was spent on each section in proportion to how it supported the course standards, emphasizing building an understanding beyond chapter-dependent context. Memorization, derivations and longhand calculations were deemphasized, as all assignments allowed use of the book, by R. Johnson[7], and of technology.

Insertion Points for Technology

In addition to the course Web site, the other major technology component of the course was a weekly class meeting in a laboratory room equipped with 30 IBM 486 PCs, two laser printers, a portable MAC demonstration workstation cart, and marker boards. The textbook also came with a disk of datasets for all exercises, allowing students to easily access and explore datasets that were more realistic in terms of size and “messiness”, rather than being limited to datasets small enough to easily type in or analyze by hand. This feature was found useful by 24 out of 28 students (85.7%) surveyed by the independent project evaluator. The computers provided access to the Internet, a word processor and Excel, which was used on the students’ two projects and four lab assignments.

Using any software to facilitate calculations gives students a more realistic data analysis experience, allowing exploration of datasets of more realistic size and complexity and thus creating more opportunities to build conceptual understanding. Spreadsheet software has general advantages such as its low cost, widespread availability and easy data entry/editing. Moreover, students can modify or update an entry and immediately see the consequences, as well as produce self-contained reports that integrate text, graphics and tables. The specific package Excel was chosen because of its balance between power and ease for novices and its widespread availability.
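To illustrate the change-an-entry-and-see-the-consequences exploration described above: the course labs used Excel, but the same idea can be sketched in a few lines of present-day Python with pandas (the dataset and column names below are purely hypothetical, not from the course):

import pandas as pd

# A small hypothetical dataset of exam scores
scores = pd.DataFrame({"section": ["A", "A", "B", "B"],
                       "score": [72, 85, 90, 64]})
print(scores["score"].describe())  # summary statistics before the edit

# Change one entry and immediately recompute the summaries,
# much as a spreadsheet recalculates when a cell is edited
scores.loc[3, "score"] = 94
print(scores["score"].describe())  # summary statistics after the edit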

The Role of the Web Site

On the course Web site, students could access class announcements, course policies and rubrics, glossaries, statistical calculators, media examples, writing support, information about the Internet, statistics history, datasets, and exercises (with answers). They could even communicate with each other or the instructor (either via e-mail messages or by participating in discussion “chat rooms”). This accessibility helped students with nontraditional schedules or who lived out of town. (Some students who rarely spoke in the classroom readily joined online discussion.)

The author hoped that by using the Internet to display the course standards from the beginning, and to show students beforehand the specific rubrics by which their assignments would be assessed, the assessment standards of Openness and Coherence advocated by the NCTM would be addressed. In addition, the Web site supported the Equity Standard with its links to multiple representations, diverse datasets and modes of illustration, thus reaching out to a wide variety of students’ learning styles, backgrounds and interests. The Web site specifically supported both Content Standards. Standard #1 was supported with many external links to examples (both good and bad) of statistics in the media, so that students had the opportunity to examine and critique a variety of examples. A key example of the latter was the link to the CHANCE News database.

Project #2 required students to design their own study to investigate a question of interest to them, collect the data, analyze the data, and then report, interpret and discuss their findings in a self-contained writeup. To support this, the Web site included sample questions for student projects as well as links to past students’ complete project writeups of high quality. The author did not have time in the initial phase of the project to develop this latter feature as fully as Lavigne and Lajoie[8], whose library of exemplars included examples of “average” quality as well as “above average”.

Lavigne and Lajoie mention several general advantages of technology, including diverse learning opportunities (the author’s Web site included external links to simulations and animated sequences as well as simply text and graphics) and consistency in communicating criteria to all classes. The author’s Web site also included seven chat rooms, enabling students to post questions or points of view, including some they might not have had the desire or time to raise in class meetings. One of the chat rooms was called “Course Management Questions” and gave students the opportunity at any time to discuss or ask about assignments and rubrics. Unlike the chat rooms designated for postings on technology or content, this particular chat room did not get any postings during the semester, a possible indication that technology had indeed helped accomplish clear and coherent communication about the standards-based aspect of the course.

Attitude Survey Results

From the analysis of variance for each relevant (5-point Likert scale) item on the author’s attitude post-survey, the research hypothesis of a significant link between a technology-rich standards-based approach and student ability to critique and relate statistics to their everyday world was supported. In particular, treatment section students agreed more strongly than students in the other two sections, though not quite at a level of statistical significance (p = .096), with the statement: “In your chosen major, you will probably use statistics.” This difference is likely due to the opportunities treatment students received (via the Internet, open-ended projects, etc.) to make real connections to the subject matter. Treatment students disagreed significantly more strongly (p = .026) with the statement: “Statistics published in the media are accurate and unbiased.” This indicates the growth of critical thinking skills, facilitated by the examples available on the Internet and by doing one of their projects.
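As a rough illustration of the item-by-item comparison described above (the article does not report which software carried out the analysis, and the responses below are entirely hypothetical), a one-way analysis of variance on a single Likert item could be sketched in Python as:

from scipy.stats import f_oneway

# Hypothetical 5-point Likert responses to one survey item,
# one list per section (treatment plus two comparison sections)
treatment = [5, 4, 4, 5, 3, 4, 5, 4]
comparison_a = [3, 4, 2, 3, 4, 3, 3, 2]
comparison_b = [3, 3, 4, 2, 3, 4, 2, 3]

# One-way ANOVA across the three sections for this item
f_stat, p_value = f_oneway(treatment, comparison_a, comparison_b)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")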

Treatment students also agreed significantly more strongly (p = .014) that “A statistics class should provide opportunities for working in teams.” This attitude may have been reinforced by the teamwork required by technology usage (the number of available terminals in the lab required students to work in pairs) and by the way working in teams hones abilities to communicate knowledge.

When students in the three sections (the author’s “treatment” section and two comparison sections) were given the same attitude survey on the first day of the course, these three items showed no significant differences between sections. Although we could not do the pre-post comparisons we would have liked because students could not be matched (no names were collected so students would feel freer to be honest, and some students added or dropped between survey administrations), effect sizes were calculated for each item for the treatment section. Effect sizes larger than .33 are considered to have practical significance by educational researchers[9]. Of the six attitude items relevant to this part of the research, the smallest effect size (after items were keyed in the same direction) was .25, and the other five were between .42 and .64. This indicates that students came to prefer, to a practically significant degree, the various aspects of the treatment course, including: group work, the lab and project approach (as opposed to textbook homework problems), criterion-referenced (as opposed to norm-referenced) assessment, and using examples from the media/real-life.
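The article does not state which effect-size formula was used; a common choice for unmatched pre- and post-survey groups is a standardized mean difference (the difference in means divided by a pooled standard deviation), sketched below in Python with hypothetical summary values:

from math import sqrt

def pooled_sd(sd1, n1, sd2, n2):
    # Pooled standard deviation of two independent groups
    return sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))

def effect_size(mean_pre, sd_pre, n_pre, mean_post, sd_post, n_post):
    # Standardized mean difference (post minus pre) for unmatched groups
    return (mean_post - mean_pre) / pooled_sd(sd_pre, n_pre, sd_post, n_post)

# Hypothetical pre- and post-survey summaries for one attitude item
d = effect_size(mean_pre=3.2, sd_pre=1.0, n_pre=35,
                mean_post=3.7, sd_post=0.9, n_post=28)
print(round(d, 2))  # values above .33 would be read as practically significant[9]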

In a separate survey administered by ETIP’s independent project evaluator Dr. Rose Shaw during one of the last regular class meetings, 26 of the 28 (92.9%) treatment section students said they would not have enjoyed a traditional format (i.e., straight lecture and regular tests) more. At least three-quarters of the students agreed or strongly agreed that the course had in fact helped them meet the two standards. A common fear of educators who resist technology-rich courses is that students may use technology in a way that shortcuts conceptual understanding. Perhaps due to the deliberate support of the standards-based approach, 25 out of 28 (89.3%) of the students agreed or strongly agreed that the course “emphasized understanding of concepts rather than cookbook, mechanical ways of dealing with concepts.” Finally, there is strong evidence that the assessment standards of Openness, Coherence and Equity mentioned earlier were satisfied, as Dr. Shaw’s survey indicated that 27 out of 28 students agreed or strongly agreed with the statements “The instructor was fair” and “The instructor expected a reasonable level of performance.”

Analysis of Performance Data

A modest sample of student performance data was obtained by having a mathematics education doctoral candidate, Ron Wisniewski, grade three problems which were given to each of the three sections at the end of the course. The problems were scored according to a rubric the author and the other researcher refined together with some of the data to ensure reliability. The problems involved critically interpreting a pie graph from USA Today, computing a disjunctive probability (a probability of the form “at least one success”) and a straightforward two-population hypothesis test. From an analysis of variance on scores for the first problem, the treatment section scored significantly higher than the other sections (p = .02). Analyses of variance for the other two problems did not result in significant differences (p = .06 and p = .28), but the highest means were in one of the comparison sections. Thus, the treatment group did significantly better with the problem involving the kind of critical thinking more common to reform curricula (see course Standard #1) and pedagogy, yet did not do significantly worse on the two traditional problems.
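The exact exam item is not reproduced here, but a disjunctive probability of the “at least one success” form is typically computed through its complement, as in this small Python sketch with hypothetical numbers:

# P(at least one success in n independent trials) = 1 - P(no successes)
def prob_at_least_one(p_success, n_trials):
    return 1 - (1 - p_success) ** n_trials

# Hypothetical example: at least one six in four rolls of a fair die
print(round(prob_at_least_one(1 / 6, 4), 3))  # about 0.518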

While instructors were not surprised by the pattern of these results, a limitation should be stated. The author intended the problems to appear as “common problems” on all sections’ final exams, but the other two instructors instead gave them as a last-week-of-class quiz. The students who had the problems on their final perhaps had more incentive to “try harder”, but this effect may have been balanced by the fact that every student in that section was present, not just the students motivated enough to attend one of the final regular class sessions.

Some Final Thoughts

There is little question that the standards-based approach will be more widespread as calls for accountability and demonstration of specific performance increase.

There is not, however, a “standardized” means of attaining a particular set of standards, and that is where more exploration is needed. Indeed, even the author’s means are being refined and reevaluated each semester. Based on student feedback, the independent project evaluator recommended that the spreadsheet technology aspect of the course be supported better in the future, perhaps with more clearly detailed job aids and ideally with a lab assistant available.

Nevertheless, it seems there are many ways that technology, guided by the explicitness of intention forced by the standards-based approach, can greatly facilitate the development, communication and attainment of the course standards. This study suggests that students enjoyed and thrived in a technology-rich standards-based environment, with no significant evidence of deficits compared to their more traditionally-taught counterparts.

Dr. Lawrence Mark Lesser conducts research and teaches courses in mathematics, statistics and mathematics education at the University of Northern Colorado in Greeley, Colo. His dissertation for his Ph.D. in Mathematics Education concerned the introductory statistics curriculum. He is the co-author of ACT in Algebra, a new technology-rich college algebra textbook published by McGraw-Hill.
E-mail: lmlesse<@>unco.edu

References:
1. Heid, M. (1997), “What Have We Learned and What Do We Still Need to Investigate,” paper presented at the Tenth International Conference on Technology in Collegiate Mathematics, Chicago, IL.
2. Loacker, G. and Mentkowski, M. (1993), Making a Difference: Outcomes of a Decade of Assessment in Higher Education, San Francisco, CA: Banta and Associates.
3. Robson, R. (1997), “Taming the Tiger: The Role of Technology in Standards-Based Education,” paper presented at the Tenth International Conference on Technology in Collegiate Mathematics, Chicago, IL.
4. Gal, I., Ginsburg, L. and Schau, C. (1997), “Monitoring Attitudes and Beliefs in Statistics Education,” The Assessment Challenge in Statistics Education, Amsterdam: IOS Press.
5. National Council of Teachers of Mathematics (1995), Assessment Standards for School Mathematics, Reston, VA: NCTM.
6. Shaughnessy, J.M. (1992), “Research in Probability and Statistics: Reflections and Directions,” in D.A. Grouws (Ed.), Handbook of Research on Mathematics Teaching and Learning, New York, NY: Macmillan.
7. Johnson, R. (1996), Elementary Statistics (7th ed.), Belmont, CA: Wadsworth.
8. Lavigne, N.C. and Lajoie, S.P. (1996), “Communicating Performance Criteria to Students Through Technology,” Mathematics Teacher, 89(1), pp. 66-69.
9. Borg, W.R. and Gall, M.D. (1989), Educational Research (5th ed.), New York, NY: Longman.

Products mentioned:
Excel; Microsoft Corp., Redmond, WA, (800) 426-9400, www.microsoft.com.
Netscape Navigator Gold 3.0; Netscape Communications Corp., Mountain View, CA, (415) 937-3777, www.netscape.com.

This article originally appeared in the 02/01/1998 issue of THE Journal.
