All of the Above

Maryland's new multiple-choice-only state exam: a) will leave key skills untested; b) is a step backward to the 20th century; c) rejects technology; or d) is a poor gauge of learning?

Geoffrey H. Fletcher

I'M STEPPING OUT on a limb here, but either we have some very bizarre decision-making going on in Maryland, or some very poor reporting from The Washington Post. In light of the Post's article on the supposed ineffectiveness of educational software (see Policy & Advocacy, June), I would've banked on poor reporting. It turns out the Post is just the messenger this time.

In a story published Sept. 13, the paper reported that "Maryland plans to eliminate written-response questions from its high school exit exams to address long-standing complaints about how slowly test results are processed." Currently, Maryland's state test includes both "brief constructed responses" and "extended constructed responses." Constructed-response questions, whether brief or extended, require students to construct, that is, to write, their responses rather than simply choose from answers arranged for them by the test makers.

The reason cited for the change was speed. By eliminating written-response questions, the state can cut the time it takes to get results back to districts from nine weeks down to three weeks. Stephen Bedford, the chief school performance officer in Montgomery County, MD, believes the nine-week turnaround time is "too long, making it difficult to enroll students in appropriate courses or plan for interventions" for students who fail.

Ronald A. Peiffer, Maryland's deputy superintendent for academic policy, says, "The kinds of things we could only test with constructed-response items before now can be done…with selected-response items in a way that's just as good or better."

I have a hard time understanding that one. How can choosing a correct answer from a list be a better way to gauge learning than having the student create an original answer? On top of that, two key skills that higher education and employers consistently tell us are lacking in our high school graduates are critical thinking and problem solving. And those skills are much better judged by a student's written response than by what might be a lucky stab at a multiple-choice answer.

The article states, "Written-response questions take much longer to grade than multiple-choice questions because they have to be evaluated by humans, not computers." Says who? There have been significant technological advances in test scoring, including software that has been shown to grade written work as reliably as human graders (see "A Virtual Treasure Trove," April).

I wonder why we are so quick to discount technological fixes to some of our problems. Maybe Maryland officials considered the technological options and discarded them due to cost or some other factor. Maybe they don't trust the technology.

Either way, we cannot afford to keep using 20th-century methods to test for a 21st-century set of skills. When we are in desperate need of forward thinking, the Maryland decision is a significant step back.

-Geoffrey H. Fletcher, Editorial director

This article originally appeared in the 10/01/2007 issue of THE Journal.
