Doing the math on education reform
Education reform has long been the favorite pastime of Massachusetts state government. Since 1888, more than 100 commissions and official studies have put public education under the microscope. Many of these examinations have been of specific aspects of schooling–teacher training, textbooks, immigrant education. About a dozen were more comprehensive and looked at education policy and practice more broadly. Some of these investigations came to naught, while others resulted in major legislation intended to correct the perceived shortcomings of Massachusetts public schools. But no education overhaul in the state’s history was more sweeping than the Education Reform Act of 1993.
How that education-reform law came about is a complicated tale of legislative politics, but also a straightforward saga of litigation. In 1978, a group of frustrated citizens hired lawyers and sued–in the name of 16 students enrolled in school districts ranging alphabetically from Brockton to Winchendon and geographically from Salisbury to Springfield–to end the disparities in educational opportunity between high-spending and low-spending districts. The lawsuit, which began as Webby v. Dukakis and was resolved in 1993 as McDuffy v. Robertson, was based on the theory that the major barrier to equal education was the difference in school finances between advantaged and disadvantaged districts.
The Supreme Judicial Court agreed, ruling on June 15, 1993, that Massachusetts had failed to meet its constitutional duty to provide an adequate education to all public school children. The court based this decision on the wide disparities in educational opportunities available to children attending public schools in the state’s richest communities compared with those in the poorest. Writing for the majority, Chief Justice Paul Liacos declared that the education clause of the state Constitution commanding state government to “cherish…the public schools” is “not merely hortatory or aspirational” but places an enforceable duty on the executive and legislative branches to provide an adequate public education to all children. The court left it to the governor and the Legislature to come up with a plan–and money–to close the gap.
The Legislature's answer, enacted within days of the ruling, was the Education Reform Act of 1993, and it did more than overhaul school finance. By requiring the development of statewide curriculum frameworks specifying what skills and course material students would be expected to master throughout their school careers, the law set new standards of achievement for all schools to strive for, and be accountable to. In this way, the Education Reform Act sought not only to guarantee students in every community the opportunity for an adequate education but to push all schools to raise their standards for educational adequacy.
Now, almost nine years later, it is possible to assess the efficacy of the Education Reform Act according to its two main goals: equalizing educational opportunity and improving school performance. To determine the impact of education reform, both financial and educational, in ways that make sense in relation to where each school district started out in 1993, I’ve grouped the Commonwealth’s communities into four broad categories by demographics:
ADVANTAGED communities are the 50 most affluent districts facing the least challenge in terms of student preparedness and support at home. These include Weston, Wayland, and Lexington.
MIDDLE Massachusetts covers the 140 or so districts in the middle of the state’s wealth and need distribution, places like Westborough, Canton, Dedham, Woburn, Stoughton, and Haverhill.
CHALLENGED communities include towns and smaller cities, such as Gloucester, Hull, Gardner, and Everett, that face serious demographic challenges in improving educational results.
MAJOR CITIES consist of the state’s 15 largest urban areas–including Boston, Worcester, Springfield, Chicopee, and Lawrence–which generally contain the largest concentrations of poor and underperforming students.
By 1999, the first goal of education reform–equalizing spending among districts–was largely accomplished. In 1994, there were sharp differences in the amount of money communities were spending on education, with demographically advantaged communities spending more. Everett, for instance, spent $4,400 per pupil, while Weston spent $8,900. Compared across demographic categories (see graph), average spending showed discrepancies that were less stark but still significant: nearly $6,000 per student in advantaged communities, just under $5,000 in challenged districts, and just over $5,000 in major cities and mid-range communities.
The second goal of education reform–and of all the new spending–was to improve student achievement across the board. The additional money–which went to all school districts, though in varying amounts–was supposed to buy better results, with higher standards and statewide accountability, via MCAS, driving improvement in all schools, whether in impoverished cities or affluent suburbs. So it is appropriate to ask: What did the new funding buy in terms of improved educational achievement?
The most important indicator of achievement is the Grade 10 MCAS, since that is the test that must be passed for graduation, and therefore the one that students–and their teachers–take the most seriously. This was especially true for last spring’s test-taking, which was the first time Grade 10 MCAS counted toward graduation. As reported last November, scores were up across the board, even in districts that face serious challenges in improving schools.
It’s impossible to know how much of that bounce in MCAS scores had to do with the stakes now attached to the test, but the stakes certainly had some effect. For example, Grade 10 students in Cambridge probably took the 2001 MCAS much more seriously than did their predecessors the year before. In 2000, Cambridge had an English Language Arts (ELA) Fail rate of 67 percent–37 percent who took the test and failed, plus 30 percent who were absent when the test was given and received a failing score (Fail Absent). In 2001, the ELA Fail rate dropped to 30 percent, none of it attributable to absenteeism. Lincoln-Sudbury had a 23 percent ELA Fail rate in 2000–very high for an Advantaged district; that dropped to 1 percent in 2001, while the district’s share of students scoring Advanced in ELA tripled between 2000 and 2001. Nothing can explain gains like these but students focusing on the assessment instead of blowing it off.
Still, it’s impossible to deny that much of the improvement in MCAS scores statewide–with the greatest progress in the cities facing the biggest educational challenges–is the result of educators working hard to boost student achievement. In addition, now that students are tested every year, schools have determined to end social promotion. Students are thus less likely to arrive in Grade 10 deficient in basic skills than in earlier years, which gave the 2001 test-takers a better chance of demonstrating competence on MCAS. This, too, is a sign of educational progress.
But rising pass rates are just one aspect of improvement, one indicator of whether the state’s investment in schools has paid off. After all, what counts as improvement for a superintendent in Winchester or Belmont, where more than 95 percent of students pass the Grade 10 MCAS on the first try, is very different from what we hope for from an urban superintendent facing a 40 percent failure rate. After four years of MCAS, we have enough data for some preliminary analysis of how well education reform is working in districts of various types. While overall scores improved between 2000 and 2001, some systems have made exemplary progress over the full four years of MCAS. I use an average of the 1998 and ’99 MCAS scores as a baseline against which to compare 2001 performance, keeping in mind that success in Lexington may look different from success in Taunton.
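For readers who want to replicate the district comparisons, the baseline arithmetic used throughout this analysis is simple enough to sketch in a few lines of Python. The function name and the sample figures below are illustrative only, not actual district data:

```python
# Sketch of the baseline-and-gain arithmetic used in this analysis:
# the 1998 and 1999 MCAS percentages are averaged into a single
# baseline, and a district's "gain" is its 2001 figure minus that
# baseline, expressed in percentage points.

def mcas_gain(pct_1998, pct_1999, pct_2001):
    """Return (baseline, gain in percentage points) for one district."""
    baseline = (pct_1998 + pct_1999) / 2
    return baseline, pct_2001 - baseline

# Hypothetical figures for a district's ELA Advanced rate:
baseline, gain = mcas_gain(12.0, 16.0, 53.0)
print(f"baseline {baseline:.0f}%, gain {gain:.0f} points")
# prints "baseline 14%, gain 39 points"
```

A district can thus post a large gain either by starting low and catching up or by starting high and pushing students into the top scoring category, which is why the analysis judges districts against demographic peers rather than against a single statewide yardstick.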
In more affluent school systems, where nearly every student already passes MCAS, moving student scores up to the Advanced category is an important measure of progress (see graph).
Not surprisingly, some communities improved much more than others, by this measure. Wayland increased its representation in English Language Arts Advanced by 39 points, from 14 percent in 1998/99 to 53 percent in 2001; Longmeadow improved by 26 points, from 7 percent to 33 percent in Advanced. Lexington and Hopkinton both increased Advanced scores in ELA by 27 points and in math by 27 points. Groton-Dunstable Regional and Wellesley gained 28 and 26 points, respectively, in math Advanced, much stronger performance than exhibited by similar communities.
What is interesting about these overperforming communities is that none received much new state aid–only a few hundred dollars per pupil, compared with the state average of $960 and the $2,000 to $3,000 received by urban systems. One could say the state got the biggest bang for its buck from these communities. But the dramatic increases in demographically advantaged districts more likely came about through old-fashioned competition. With MCAS publicizing performance indicators for the first time, people in Wellesley or Winchester certainly want their schools to do as well as, or better than, schools in Wayland or Westwood.
In typical communities–the 130 districts around the midpoint of the state’s demography–there also are systems that are ahead of the class in improving student achievement. In moving students into ELA Advanced, Nauset Regional improved 26 points over the 1998/99 base, from 9 percent to 35 percent; Braintree 24 points, from 12 percent to 36 percent; Norwood 23 points, from 4 percent to 27 percent; and Northampton 23 points, moving from 6 percent to 29 percent. In math Advanced, Newburyport increased by 33 points, up from 9 percent to 42 percent; West Bridgewater, up 24 points, from 6 percent to 30 percent; and Danvers 21 points, going from 4 percent to 25 percent.
But in many mid-level communities, expanding the ranks of “proficient” students is the more pressing educational goal. Overall, these districts increased the portion of their 10th-grade students in the Proficient category by 4 percentage points in English Language Arts and 12 percentage points in math (see graph). But once again, some communities made much more progress than the rest. Middleborough, East Bridgewater, Granby, Maynard, and Uxbridge sharply increased their students’ representation in ELA Proficient by 17, 20, 21, 22, and 26 points, respectively. In math Proficient, Canton, Foxborough, Tyngsborough, and Sutton did particularly well, improving by 23, 24, 25, and 27 points, respectively.
Some of these high-performing districts have received substantial amounts of education reform aid (East Bridgewater, Uxbridge, and Tyngsborough, for instance), but most received relatively modest aid, at or below average. It is likely that the additional funding helped boost achievement in the middle-rank communities that had been underspending prior to reform. But it is also important to identify policies and practices that have improved learning in those districts that made dramatic progress without a major infusion of state funding.
In challenged communities, the focus is appropriately on moving students out of Fail on the Grade 10 MCAS, while helping other students move into higher scoring categories. Quincy posted a 38-point improvement, reducing its failure rate from 58 percent in 1998/99 to 20 percent in 2001; Peabody improved by 36 points, from 56 percent Fail to 20 percent; Medford by 38 points, from 66 percent to 28 percent; and Everett by 36 points, from 61 percent to 25 percent. In moving students out of ELA Fail, Fairhaven, Ludlow, Holbrook, Medford, and Methuen posted gains of 20, 18, 17, 16, and 16 points, respectively.
Easthampton and Taunton demonstrated strength that was particularly broad, posting gains in both English and math, and all the way up the scoring ladder. Easthampton notched a 47-point improvement in moving students out of math Fail, improving from 55 percent Fail in ’98/99 to 8 percent in 2001, and a 21-point boost out of ELA Fail, down from 31 percent to 10 percent. Taunton improved by 38 and 23 points, respectively, in getting students out of Fail in math and ELA. In ELA Proficient, Taunton improved by 15 points, from 18 percent to 33 percent. But Easthampton led the field of challenged communities with a gain of 23 points in ELA Proficient, up from 24 percent to 47 percent between ’98/99 and 2001.
In terms of reform funding, many of the districts in this category received substantial amounts of aid. But some of the top performers (Easthampton, Medford, Holbrook, Quincy, and Maynard) received less than average education-reform funding.
Massachusetts’s urban districts face monumental challenges in improving educational achievement, yet some have posted impressive gains in helping more students pass MCAS. Brockton has made the most progress here, improving by 41 and 20 points, respectively, in moving students out of math and English Fail. In 2001, 35 percent of Brockton 10th-graders failed math, compared with 76 percent in the 1998/99 base; 24 percent did not pass ELA, compared with 44 percent. Brockton also led the major-city field with a 10-point gain in ELA Advanced, moving from 2 percent to 12 percent. Somerville made a 38-point gain in moving students out of math Fail, while Boston and New Bedford posted solid gains of 16 and 17 points, respectively, in getting students out of ELA Fail. In terms of new state funds, virtually all of the major cities receive $1,500 to $3,000 per pupil in school-reform aid–a major investment that, in some cities, is clearly paying off in improved achievement. (It is worth noting that Brockton was one of the original districts that brought the 1978 Webby legal action.) Brockton ranks among the top 10 districts in per-pupil education-reform aid, and given its progress in raising achievement, the money has apparently been well spent.
As we finish Year Eight of the current effort, several points are clear. Education reform has dramatically directed new funds to those districts that need them most for moving their students up to high standards. Achievement is improving all across the state, with the 2001 Grade 10 results reflecting solid gains from city to suburb. Some systems do better than others, even with similar resources and similar demography, and we don’t understand why. There is much analysis that needs to be done by nonprofits, higher education, and public agencies to figure out what’s working and how to spread educational success across the Commonwealth.
The Education Reform Act has been remarkably successful at equalizing funding and beginning the process of standards-based reform. Despite solid successes, however, the limits of the 1993 law are becoming clear.
After four years of MCAS we know the dimensions of the educational challenges we face in Massachusetts. Most students in the state are doing well, not only on MCAS but on the National Assessment of Educational Progress, the SATs, and most other measures of educational achievement. But performance continues to lag in certain areas and among certain groups. Many special education students are finding success on MCAS elusive. Students with limited ability to read and understand English continue to be at risk of failure. Vocational education students are more likely than other students to do poorly on MCAS. And many students in our cities still do not possess the basic skills they need to succeed in life.
Vocational students made solid gains on the 2001 MCAS, a positive portent for their ultimate success. Targeted policies can potentially address the needs of special education and language-challenged students. Using the permitted accommodations for specific special needs–and, in limited cases, alternative assessments–should help special education students demonstrate competence. Innovative language-acquisition programs that actually teach students English in a timely fashion will greatly boost both the learning and the MCAS scores of those whose first language is not English.
Major urban districts, however, present a more daunting challenge, since more than one-quarter of our students go to school in the cities. High schools especially need to be reconfigured for success. More flexibility and management power for principals; formal, productive relationships with community colleges and other community-based educational resources; and class sizes sharply reduced by bringing in specialists in teaching reading and math to older students are all parts of the prescription for urban-education success.

It is also time to incorporate additional years of instruction into the urban curriculum. All children can learn, but we know that all students do not learn at the same rate. We also know that students from advantaged backgrounds will have much greater access to learning-enrichment opportunities, including private tutors, than students of more modest means. For students in Wellesley and Woburn, K-to-12 makes sense. For many students in the cities, K-to-13 or K-to-14 should be an option.
Has Massachusetts gotten its money’s worth out of MCAS? The answer can only be yes. The billions of dollars distributed since 1993 have gone where they were most needed and produced measurable educational results. The good news is that most of our students are doing well, and nearly all of them are doing better. Now it’s time to focus on those students still in danger of being left behind.
Robert Gaudet is a senior research analyst at the Donahue Institute of the University of Massachusetts.