
Researchers Team Up with Lexia To Rewrite Literacy Testing

A research center at Florida State University dedicated to helping young students become better readers is working with a division of a company known for its foreign language instruction to develop a new line of PreK-12 reading assessments. The Florida Center for Reading Research is partnering with Lexia Learning, a subsidiary of Rosetta Stone, to create the new interactive and adaptive literacy tests.

A major goal for the assessment system is to reduce the time teachers spend administering assessments and connecting the resulting data to action plans. A recent Lexia survey of 7,000 educators found that they spent the equivalent of 17 days on assessment-related activities. By integrating multiple literacy assessment functions into one system, the new line is expected to drive that figure down.

The new assessments offer "quick screening tasks" to help identify where a student stands on norm-referenced reading achievement tests. With that information, students can then be given listening or reading comprehension passages and specific diagnostic tasks to develop an accurate profile of each student's literacy strengths and weaknesses. Lexia's assessments are intended to be given three times a year, in the fall, winter and spring. Progress monitoring on reading skills will be available monthly.

"What's particularly exciting about [this] new assessment system are the new oral and academic language tasks that tap the listening and speaking skills required by states' new academic standards and allow for measurement of student growth in language as well as in reading," said Barbara Foorman, director of the Center and a professor of education. "Equally exciting, the interactive nature of the assessments keep students engaged at a level of challenge where they can be successful, rather than frustrated and bored as is common in existing assessments."

Along with Foorman, the Center's research director, Yaacov Petscher, and Chris Schatschneider, developmental psychology area director in the university's Department of Psychology, helped write the new assessments. "The technology-based assessment will provide an adaptive experience that requires students to respond to fewer test items in order to reach a highly valid and reliable ability score for each student," they said in a statement. "As a result, the assessment process will become more time-efficient and keep students engaged at the proper level of challenge."

Following a pilot in fall 2014, the assessment will be available for state- and district-level implementation in fall 2015. Beginning in 2016, the assessment system will predict outcomes on the Common Core State Standards as measured by the PARCC and Smarter Balanced assessments.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
