A new look at the rankings

New state data suggest the toniest towns don’t necessarily have the best schools

Every fall Boston magazine releases a back-to-school issue ranking the state’s best high schools. And every fall the skeptics sneer. These critics regard the magazine’s rankings as little more than a list of the state’s most exclusive suburbs, where student test scores often simply reflect a community’s income level. It’s much harder to judge whether these schools are actually better or whether they are simply educating students who arrive with all the advantages of a more affluent upbringing.

What makes a school good, after all, is not simply a demographic make-up that packs those with the greatest advantage into one building. Good schools have a rich curriculum and high-caliber teaching staff that promote real growth in student learning, regardless of what level students are at when they arrive in the fall. New data being compiled by the state education department finally allow us to see how much schools help students progress relative to similar students attending other schools. These “student growth” data give critics of the Boston magazine rankings some ammunition, but we need to know much more about their reliability before parents can trust that their child might actually learn more at schools where all the students aren’t posting sky-high scores.

To rank the best schools, the magazine looks at a number of factors, including results on the MCAS and the SATs, the percentage of students graduating and going on to college, and the number of sports teams and clubs. But noticeably absent is how much the school contributes to student success. The state started addressing that deficiency last year with what the Department of Elementary and Secondary Education calls Student Growth Percentiles (SGP), a statistic that measures how much students progress from year to year compared with academically similar peers across the state.

The concept behind the SGP is fairly simple: To calculate a score, the state groups students based on their MCAS results from the previous year. The education department then measures how much each student’s score improved the next year relative to other students with similar prior scores, and averages these gains for all students in the school. If a 6th grade student’s SGP in mathematics is in the 65th percentile, this means he did better than 65 percent of students statewide who had similar scores in 5th grade. So, if a school’s SGP is 65, its students’ gains, on average, outpaced those of 65 percent of comparable students across the state.
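
For readers who want to see the mechanics, here is a rough sketch of that logic in code. It is only a toy illustration: the school names and scores are invented, and the state’s actual SGP calculation relies on a more sophisticated statistical model than this simple grouping of students with identical prior-year scores.

```python
# Toy illustration of the student-growth idea; not the Department of Elementary
# and Secondary Education's actual methodology. All data below are made up.
from collections import defaultdict

# Hypothetical records: (student_id, school, prior_year_score, current_year_score)
students = [
    ("s1", "School A", 230, 242),
    ("s2", "School A", 230, 236),
    ("s3", "School B", 230, 250),
    ("s4", "School B", 252, 259),
    ("s5", "School A", 252, 266),
    ("s6", "School B", 252, 261),
]

# 1. Group students by their prior-year score (equal scores stand in for
#    "students with similar scores" in this simplified example).
peer_groups = defaultdict(list)
for sid, school, prior, current in students:
    peer_groups[prior].append((sid, school, current))

# 2. Within each peer group, rank each student's current-year score against
#    the group and convert that rank to a percentile.
student_sgp = {}
for prior, group in peer_groups.items():
    scores = sorted(c for _, _, c in group)
    for sid, school, current in group:
        beaten = sum(1 for s in scores if s < current)
        student_sgp[sid] = (school, 100 * beaten / len(scores))

# 3. Average the student percentiles by school to get a school-level figure.
school_totals = defaultdict(list)
for school, pct in student_sgp.values():
    school_totals[school].append(pct)

for school, pcts in sorted(school_totals.items()):
    print(school, round(sum(pcts) / len(pcts), 1))
```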

How did Boston magazine’s best high schools fare when success is measured from this angle? As a group, the top 10 schools outperform the state average, but only by a slight margin, and largely due to the stellar performance of Dover-Sherborn (ranked 2nd by the magazine). Newton North (ranked 9th) also posted a standout SGP. Beyond these two notable exceptions, most of the other top 10 schools look fairly mediocre. Statewide, Lexington and Wellesley were middle of the pack, and Wayland (ranked 8th by the magazine) posted an SGP well below the state average.

[Chart]

This variation among the top 10 schools could mean that some high schools really are much better at helping their students achieve than others. But it could also be a statistical fluke. Maybe the SGP is highly sensitive to small differences because students at these schools perform similarly and thus cluster together around the middle. Or perhaps the SGP just isn’t that useful for high schools: theoretically, in the strongest districts, students are already maximizing their talent by the time they arrive at high school. Can teachers really help a group of high-achieving students continue to outperform? Now that we have two years of SGP data, we can learn a little more about their reliability.

Across all public schools, the numbers seem relatively stable. The correlation between schools’ 2009 and 2010 scores is 0.73, which suggests a strong association. (Correlation coefficients range from -1 to 1; a value of 1 would mean 2009 scores perfectly predict 2010 scores, while a value of 0 would mean 2009 scores had no bearing on 2010 scores.) But just glancing through the results, there are many noticeable blips. Bedford High’s SGP, for example, went from 64 in 2009 to 55 in 2010. Did the quality of education really change that dramatically in one year?
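
For those who want to check this themselves, the year-to-year stability check is straightforward once each school’s two scores are lined up. The numbers below are invented placeholders; substituting the department’s published school-level SGPs should reproduce the 0.73 figure cited above.

```python
# A minimal sketch of the stability check, assuming parallel lists of each
# school's 2009 and 2010 SGP. The values below are invented placeholders.
from statistics import correlation  # Pearson correlation, Python 3.10+

sgp_2009 = [64, 50, 48, 72, 55]  # hypothetical 2009 school-level SGPs
sgp_2010 = [55, 52, 47, 70, 58]  # the same schools' 2010 SGPs

r = correlation(sgp_2009, sgp_2010)
print(f"Year-to-year correlation: {r:.2f}")
```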

Comparisons between a school’s math SGP and English language arts (ELA) SGP also raise questions. Students at Wellesley High scored in the 56th percentile in math but just the 45th in ELA. This is ironic given that Boston magazine actually spotlights Wellesley High’s standout English department. And Wellesley isn’t just a random case: many schools perform very differently in math than in English (across all public schools, the correlation between math and ELA scores is just 0.45). It’s hard to understand why excellent schools wouldn’t have success across core subjects.

[Chart]

Student growth data are central to the future of education reform in Massachusetts. They will be used to measure the performance of teachers and administrators. But they have potentially even deeper implications. If we really want more economically integrated schools, parents will need confidence that urban schools like Brighton High, which, with a combined SGP of 68, dramatically outperforms all but one of the magazine’s top 10, truly provide an excellent education even when their average MCAS scores are just average.

Clearly, we need to better understand and refine these data. Boston magazine’s annual rankings capture the region’s priciest communities. Less certain after an initial look at the state’s new student growth data, however, is whether they actually identify the best schools. With help from the Department of Elementary and Secondary Education, maybe the magazine could introduce the SGP data in the next issue, and ask and answer questions regarding their reliability.

Ben Forman is research director at MassINC. Kelsey Muraoka is a senior at Boston College and an intern at MassINC.