Forget What You Think You Know about Summer Slide

Is the "summer slide" real? One education researcher recently questioned that assumption. The idea that students lose ground in their learning when they leave school for the summer break, suggested Paul von Hippel in an essay on Harvard's Education Next Institute website, can't be proved.

Von Hippel, an associate professor in the School of Public Affairs at the University of Texas at Austin, and his colleagues have tried, without success, to replicate the results of studies in the "summer learning literature." The best-known research on the subject, the "Beginning School Study," first appeared some 30 years ago, he wrote, using a test that "had problems." As he described it, the study began in the fall of 1982 with 838 first graders in the Baltimore City Public Schools. Students were tested twice yearly, in fall and spring, "so researchers could tell how quickly they were learning during the school year and during summer vacation."

The first graders in high-poverty schools scored 16 points below other first graders on the California Achievement Test. That gap had jumped to 56 points by the end of eighth grade, in the spring of 1990. "Remarkably," von Hippel wrote, "all of the gap growth took place during summer vacations; gaps did not grow during the school years." The result morphed into the "idea that more than two thirds of the eighth-grade achievement gap comes from summer learning loss."
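The arithmetic behind the two-thirds figure is easy to reconstruct from those numbers (an inference from the figures above, not a quotation from the study): the gap grew from 16 points to 56 points, a growth of 40 points, and if all of that growth happened during summer vacations, then summers account for 40 of the 56 points, roughly 71 percent of the final eighth-grade gap, or just over two thirds.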

Although the study is "old," von Hippel noted, it is still being pulled out and cited as evidence of the summer slide, as recently as 2018. Yet modern data contradict its findings.

He pointed to the national "Early Childhood Longitudinal Study" of the kindergarten class of 2010-2011, which used a sampling of children in both public and private schools across the country. That study estimated the gap in reading scores between students in high- and low-poverty schools. As von Hippel pointed out, "the gap barely changed between the start of kindergarten and the end of second grade. There is no sign of the gap growing during summer vacations."

A similar pattern appeared in data from the Measures of Academic Progress, a test given in 7,800 schools and districts. As von Hippel explained, that research found that the gap did grow between kindergarten and eighth grade, "but only by one third, and the gap grew no faster during the summer than it did during the school year."

Why the difference in findings? Part of it, von Hippel said, comes down to changes in how student performance is tested and scored now versus in the early 1980s. And as test questions have become more challenging, the measured test-score gaps have shifted along with them. "Depending on what questions you add, you can get any gap that you want," he stated.

In fact, when the 1980s tests were rescored using the more modern scoring method of "item response theory" instead of the old-style Thurstone scaling, the gaps didn't triple between grades 1 and 8, as the 30-year-old study showed. Just the opposite: The gaps shrank.
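Item response theory itself is a family of statistical scoring models. As a rough illustration of the idea, and not the specific model used to rescore the Baltimore data, the one-parameter Rasch model places students and test items on a common scale and treats the probability of a correct answer as a function of how far a student's ability sits above or below an item's difficulty. A minimal Python sketch, with hypothetical ability and difficulty values:

```python
import math

def rasch_p_correct(theta, b):
    """Probability of a correct answer under the one-parameter (Rasch)
    IRT model, where theta is student ability and b is item difficulty,
    both on the same logit scale."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A student slightly above an item's difficulty gets it right most of the
# time; a student well below it rarely does (values are illustrative).
print(rasch_p_correct(theta=0.5, b=0.0))   # ~0.62
print(rasch_p_correct(theta=-1.5, b=0.0))  # ~0.18
```

Because scores built this way depend on estimated item difficulties, rescoring the same answer sheets under a different model can change how large the measured gaps appear.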

Another aspect of testing affected the outcomes too, von Hippel said. From the 1980s up until about 1999, students mostly took "fixed-form" tests. Within each grade they faced the same set of questions twice, once in the fall and once in the spring, but the set of questions changed with each new grade. That meant the summer comparison spanned two different tests: spring scores on one grade's form were set against fall scores on the next grade's form, and those were the results used to calculate summer learning (or loss).

"Before 2000, pretty much all summer-learning studies used fixed-form tests that changed at the end of summer vacation," von Hippel wrote. "If we could go back in time and give those kids modern tests, it's quite possible that the results wouldn't replicate."

Modern tests, on the other hand, are adaptive: they ask harder questions after students respond correctly and easier questions when they respond incorrectly. That means students aren't facing the same questions each time, nor are the changes as abrupt at the start of each new school year. While that gives researchers the potential to do "a more accurate and reliable job of estimating summer learning," the more recent studies still aren't conclusive.
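The basic feedback loop of an adaptive test is easy to sketch. The Python snippet below is a deliberately simplified illustration of "harder after a correct answer, easier after a miss," not the algorithm behind any real testing product; the item bank, the one-step difficulty rule, and the function names are all hypothetical.

```python
# Hypothetical item bank: difficulty level -> (question, correct answer).
ITEM_BANK = {
    1: ("2 + 3 = ?", "5"),
    2: ("12 x 4 = ?", "48"),
    3: ("What is 15% of 80?", "12"),
    4: ("Solve 3x + 7 = 22 for x", "5"),
    5: ("Factor x^2 - 5x + 6", "(x-2)(x-3)"),
}

def run_adaptive_test(get_answer, num_items=5, start_level=3):
    """Ask harder questions after correct answers, easier ones after misses."""
    level = start_level
    score = 0
    for _ in range(num_items):
        question, correct = ITEM_BANK[level]
        if get_answer(question) == correct:
            score += 1
            level = min(level + 1, max(ITEM_BANK))  # step up in difficulty
        else:
            level = max(level - 1, min(ITEM_BANK))  # step down in difficulty
    return score, level

if __name__ == "__main__":
    # Simulate a student who knows only the three easiest items.
    known = {q: a for lvl, (q, a) in ITEM_BANK.items() if lvl <= 3}
    print(run_adaptive_test(lambda q: known.get(q, "?")))
```

Real adaptive tests estimate ability with item response theory rather than a fixed one-step rule, but the feedback loop is the same, which is why two testing sessions rarely present identical questions.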

Summer learning-loss estimates from the Early Childhood Longitudinal Study of 2010-2011, for example, show that on average students "lose just two weeks of reading and math skills their first summer vacation. During their second summer vacation, they lose two weeks of reading again, and they actually gain a little in math." Results from the Measures of Academic Progress, by contrast, indicate that children lose roughly a month of reading and math skills during their first summer vacation and even more during their second.

"So what do we know about summer learning loss?" asked von Hippel. "Less than we think. The problem could be serious, or it could be trivial." But the important point, he added, is that "it is almost surely not the case that summer learning loss accounts for two thirds of the achievement gap at the end of eighth grade." Most of that gap that's present by eighth-grade, for modern tests has already shown up at the start of kindergarten.

We should view summer vacation, he emphasized, as an opportunity to shrink the gap and help students who are behind "catch up," whether that's through summer learning programs, extended-year calendars or some other approach.

The full essay is openly available on the Education Next website.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.