Time to fix the charter school cap
Use of growth scores deprives thousands of kids of better options
THE STATUTORY RED TAPE restricting the number (and percentage) of students who can attend charter public schools has the Swiftian feel of Lilliputians tying down one of the best solutions we have to rampant inequality in our urban districts.
There are, of course, statewide caps on charter schools. Add to that curbs on new charters not established by existing providers. And there is the 9 percent limit for local school spending on students who opt for charter schools, unless the affected district ranks in the bottom 10 percent statewide, in which case the percentage doubles to 18 percent.
So, as you might imagine, how the state defines the bottom 10 percent of school districts is a contentious issue. But just to give you a sense of the underperformance in bottom-performing districts, consider Fall River, where on the 2019 10th grade MCAS only 38 percent of students met or exceeded expectations in English language arts versus 61 percent statewide, and 29 percent did the same in math versus 59 percent statewide.
The state used to determine which districts rank in the bottom 10 percent based entirely on annual MCAS scores. Starting in 2014, however, the Department of Elementary and Secondary Education (DESE) began also considering a calculation called the student growth percentile (SGP), which measures improvement gains, not absolute achievement levels. By factoring in student improvement (it counts for one-quarter of the calculation), even when that improvement was from a very low level to a slightly higher level, DESE hoped to catalyze a spirit of improvement in districts.
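To make the weighting concrete, here is a minimal sketch of a composite like the one described above. The only detail taken from the article is that growth counts for one-quarter of the calculation; the 0–100 scales, variable names, and sample figures are illustrative assumptions, not DESE's actual formula.

```python
ACHIEVEMENT_WEIGHT = 0.75  # absolute MCAS performance (assumed share)
GROWTH_WEIGHT = 0.25       # student growth percentile (per the article)

def composite_score(achievement: float, sgp: float) -> float:
    """Blend an achievement score and an SGP, both on illustrative 0-100 scales."""
    return ACHIEVEMENT_WEIGHT * achievement + GROWTH_WEIGHT * sgp

# A hypothetical district with lower achievement (30) but high growth (SGP 70)
# can outscore one with higher achievement (35) and below-median growth (SGP 45):
print(composite_score(30, 70))  # 40.0
print(composite_score(35, 45))  # 37.5
```

The point of the sketch is simply that a strong growth score can lift a district's composite even when its absolute achievement remains very low.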
Why this debate matters now is that the MCAS was canceled last spring due to the COVID-19 pandemic. As a result, DESE has recently been trying to figure out how to carry out its SGP calculation without longitudinal data.
The problem DESE is not talking about is that the SGP is a flawed metric. First, it suffers from wide margins of error, especially in small districts. In districts with low enrollments, the aggregate of individual students' SGP scores is less likely to approximate a typical mean than in districts with more students (and therefore more data points).
According to Dr. Cara Candal, who authored a 2019 study on the topic for Pioneer, “if one school [in a small district] received an SGP of 30 while another got a score of 70, researchers couldn’t be confident that the school with the higher score actually helped its students progress more than the one with the lower score.”
Conversely, because SGP scores in districts with more students cluster around a typical mean, those districts often appear to be doing better than they actually are, especially when compared to smaller districts. In this way, the use of SGP can "hide" pockets of failure in larger districts.
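The small-district problem above is ordinary sampling variance, and a rough simulation illustrates it. Individual SGPs are modeled here as roughly uniform percentiles (1–99) and a district's SGP as the mean of its students' scores; the enrollment figures and number of simulated years are illustrative assumptions, not actual district data.

```python
import random

random.seed(0)

def district_sgp(n_students: int) -> float:
    """Mean SGP for a simulated district whose students show typical (median) growth."""
    return sum(random.randint(1, 99) for _ in range(n_students)) / n_students

# Spread of district-level SGP across 200 simulated years, by enrollment:
# the small district swings far more, even though every student body is
# drawn from the same underlying distribution.
for enrollment in (50, 2000):
    scores = [district_sgp(enrollment) for _ in range(200)]
    print(enrollment, round(min(scores), 1), round(max(scores), 1))
```

A small district can land well above or well below 50 purely by chance, which is why a single high or low SGP year says little about whether its schools actually helped students progress.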
What that means is that small districts with a year or two of atypically poor MCAS scores often displace larger districts in the state's calculation of lowest-performing districts. The result is that the use of SGP pushes smaller districts into the bottom 10 percent, and charter school seats become unavailable in the large, urban districts, where need and parent demand are greatest.
Consider Lynn. In the 2018-19 school year, Lynn moved out of the bottom 10 percent due to an increase in its SGP score, even though it remained solidly in the lowest 10 percent in terms of absolute proficiency. The change lowered the charter cap in Lynn from 18 percent to 9 percent of district school spending and meant none of the 1,920 students on charter waitlists in the city would gain access to charter schools, which generally outperform the district.
In 2018, a charter school application from Lynn “substantially met the criteria for approval,” according to then-acting commissioner of Elementary and Secondary Education Jeff Wulfson. But Wulfson was unable to recommend that the Board of Elementary and Secondary Education approve the school “because of upcoming changes in the [net school spending] cap for Lynn.”
The resulting reduction in the number of available charter school seats is contrary to the intent of the 2010 law that raised the charter cap in underperforming districts. It is useful to remember the purpose of student growth measures. They were introduced in the 2010 law entitled "An Act Relative to the Achievement Gap" to encourage districts to make substantial, sustained improvements in performance, not marginal or temporary gains.
On that score, SGP is a spectacular failure. Not one large urban district in the bottom 10 percent has ever improved enough to move out of the bottom 20 percent. That is a jarring statement considering how poorly a district has to perform in order to fall into the bottom 10 percent statewide and how many students are affected.
In the absence of MCAS scores for this year, the Department of Elementary and Secondary Education is recommending that the last two available years of scores, rather than just the last two years, be used to determine which school districts rank in the bottom 10 percent. In a narrow, technocratic sense, that is not unreasonable.

The bigger picture is that the state education department's technocratic calculation of student growth ignores the spirit and purpose of the 2010 Achievement Gap law. A better path would be to raise the bar on SGP; that is, to place more emphasis on annual MCAS scores unless districts demonstrate the sustained and substantial progress envisioned in the law.
Jim Stergios is executive director of Pioneer Institute, a Boston-based think tank.