A new look at the rankings
New state data suggest the toniest towns don’t necessarily have the best schools
Every fall Boston magazine releases a back-to-school issue ranking the state’s best high schools. And every fall the skeptics sneer. These critics regard the magazine’s rankings as little more than a list of the state’s most exclusive suburbs, where student test scores often simply reflect a community’s income level. It’s much harder to judge whether the schools are actually better or whether they are simply educating students who arrive with all the advantages of an affluent upbringing.
What makes a school good, after all, is not simply a demographic make-up that packs those with the greatest advantages into one building. Good schools have a rich curriculum and a high-caliber teaching staff that promote real growth in student learning, regardless of the level at which students arrive in the fall. New data being compiled by the state education department finally allow us to see how much schools help students progress relative to similar students attending other schools. These “student growth” data give critics of the Boston magazine rankings some ammunition, but we need to know much more about their reliability before parents can trust that their child might actually learn more at a school where the students aren’t all posting sky-high scores.
To rank the best schools, the magazine looks at a number of factors, including results on the MCAS and the SAT, the percentage of students graduating and going on to college, and the number of sports teams and clubs. But noticeably absent is any measure of how much the school itself contributes to student success. The state started addressing that deficiency last year with what the Department of Elementary and Secondary Education calls Student Growth Percentiles (SGP), a statistic that measures how much students progress from year to year compared with academically similar peers across the state.
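The mechanics are easier to see in code. The sketch below is a deliberately simplified illustration in Python: the state’s actual method is more sophisticated (it uses quantile regression over each student’s full testing history), and the score-banding shortcut, column names, and function name here are our own assumptions, not DESE’s implementation.

```python
import pandas as pd

def simple_growth_percentiles(df: pd.DataFrame) -> pd.Series:
    """Toy version of a student growth percentile.

    Expects two columns: 'prior_score' (last year's MCAS scaled
    score) and 'current_score' (this year's). Each student is
    compared only with academic peers -- students who started from
    a similar score -- and ranked on how far they moved.
    """
    df = df.copy()
    # Band students into ten groups with similar starting scores
    # (a crude stand-in for the state's quantile-regression model).
    df["peer_band"] = pd.qcut(df["prior_score"], q=10, duplicates="drop")
    df["gain"] = df["current_score"] - df["prior_score"]
    # Within each band, express each student's gain as a percentile.
    return (
        df.groupby("peer_band", observed=True)["gain"]
          .rank(pct=True)
          .mul(100)
          .round()
    )
```

A school’s reported SGP is then the median of its students’ percentiles, so a score of 50 means the school’s students grew about as much as similar students statewide.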
How did Boston magazine’s best high schools fare when success is measured from this angle? As a group, the top 10 schools outperform the state average, but only by a slight margin, and largely on the strength of a stellar performance by Dover-Sherborn (ranked second by the magazine). Newton North (ranked ninth) also posted a standout SGP score. Beyond these two notable exceptions, most of the other top 10 schools look fairly mediocre. Statewide, Lexington and Wellesley were middle of the pack, and Wayland (ranked eighth) posted an SGP well below the state average.
This variation among the top 10 schools could mean that some high schools really are much better at helping their students achieve than others. But it could also be a statistical fluke. Maybe the SGP is highly sensitive because schools post similar scores and cluster together around the middle, so that small differences in raw performance translate into large swings in percentile rank. Or perhaps the SGP for high schools just isn’t that useful. Theoretically, in the strongest districts, by the time students arrive at high school they are already maximizing their talent. Can teachers really help a group of high-achieving students continue to outperform? Now that we have two years of SGP data, we can learn a little more about their reliability.
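The clustering worry is easy to demonstrate with a quick simulation, offered purely as an illustration of the statistical point, not as a model of the actual MCAS data: give every school identical underlying quality, add a little year-to-year noise, and percentile ranks still swing wildly.

```python
import numpy as np

rng = np.random.default_rng(0)
n_schools = 300

# Every school has the same true quality; observed results differ
# only by noise, so all schools cluster around the middle.
score_2009 = rng.normal(0, 1, n_schools)
score_2010 = rng.normal(0, 1, n_schools)

def to_percentiles(x):
    # Rank each school against the others on a 1-99 scale.
    ranks = np.argsort(np.argsort(x))
    return ranks / (len(x) - 1) * 98 + 1

swing = np.abs(to_percentiles(score_2009) - to_percentiles(score_2010))
print(f"median year-to-year swing: {np.median(swing):.0f} points")
# Even with zero real differences between schools, the median
# school moves on the order of 30 percentile points between years.
```

Real schools do differ, of course. The question is how much of the movement in the rankings is signal and how much is this kind of noise, and that is where a second year of data helps.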
Across all public schools, the numbers seem relatively stable. The correlation between schools’ 2009 and 2010 scores is 0.73, which suggests a strong association. (Correlation coefficients range between -1 and 1. A value of 1 would mean 2010 scores track 2009 scores perfectly, while a value of zero would mean 2009 scores had no bearing on 2010 scores.) But just glancing through the results, there are many noticeable blips. Bedford High’s SGP, for example, dropped from 64 in 2009 to 55 in 2010. Did the quality of education really change that dramatically in one year?
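The stability check itself is easy to reproduce. Here is a sketch, assuming the school-level results have been downloaded into a CSV; the file name and column names are placeholders, not the state’s actual download format.

```python
import pandas as pd

# Hypothetical layout: one row per school, one column per year.
sgp = pd.read_csv("school_sgp.csv")  # columns: school, sgp_2009, sgp_2010

# Pearson correlation between the two years' school-level SGPs.
r = sgp["sgp_2009"].corr(sgp["sgp_2010"])
print(f"year-to-year correlation: {r:.2f}")  # 0.73 in the state data

# The same one-liner, run on math vs. ELA columns instead of two
# years, produces the .45 cross-subject figure discussed below.
```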
Comparisons between a school’s math SGP and English language arts (ELA) SGP also raise questions. Students at Wellesley High scored in the 56th percentile in math but just the 45th in ELA. That is ironic, given that Boston magazine actually spotlights Wellesley High’s standout English department. And Wellesley isn’t just a random case: many schools perform much differently in math than in English (across all public schools, the correlation between math and ELA scores is just .45). It’s hard to understand why excellent schools wouldn’t have success across core subjects.
Ben Forman is research director at MassINC. Kelsey Muraoka is a senior at Boston College and an intern at MassINC.