MCAS vs. PARCC: Dead-heat in predicting college readiness
Study finds two tests equally good at gauging whether students are prepared for higher ed
THE MOST CONCRETE evidence state officials will have available in deciding whether to switch to the new PARCC test or to stay with MCAS turns out to be little help in answering the question of which test is a better measure of readiness for college.
The PARCC test was developed as a “next generation” assessment to be aligned with the new Common Core curriculum standards that Massachusetts and more than 40 other states have adopted since 2010. The great promise of PARCC was that it would be a better gauge than existing state tests of students’ readiness for college. But the first study in the country to directly test PARCC against a state test, a comparison of PARCC with the MCAS exam taken by Massachusetts students, shows that it performs no better in predicting which students are ready to do college-level work.
The analysis, commissioned by the state Executive Office of Education and carried out by the Princeton, New Jersey-based firm Mathematica Policy Research, found that the tests “provide equally useful information about college readiness.” The study findings were publicly released today.
The results are a blow to those hoping PARCC would demonstrate a clear edge in providing a signal of college readiness. The study may only serve to cloud, not clarify, the issue as the state Board of Elementary and Secondary Education prepares to vote next month on which test to use going forward. (See main story, “State faces testing showdown.”)
Overall, PARCC and MCAS were equally predictive of college grades: Higher scores on either test were correlated with higher freshman-year grades. Performance on the two tests was also equally predictive of whether students would place into a remedial math or English course in college. The two tests also did as well as SAT scores at predicting first-year college performance.
There were differences when researchers compared students meeting the basic benchmarks set by the two tests. Those students meeting the PARCC math standard for “college and career readiness” had a higher first-year grade-point average in college math courses (2.81) than did those meeting the MCAS “proficient” standard (2.39). Those meeting PARCC’s college-ready benchmark for math also were 11 percentage points less likely to need math remediation in college than those who met the MCAS math proficiency standard.
For English language arts, there were no statistically significant differences in how students did in freshman college courses based on the thresholds on the two high school exams. There also were no statistically significant differences in their rate of placement into remedial English language arts courses.
Jim Peyser, the state education secretary, whose office commissioned the report, says the study is “an important piece of the puzzle,” though not “by any stretch” the only thing education board members will weigh.
Whether PARCC was a better indicator of college readiness, however, was perhaps the central question as state officials began reviewing whether to scrap the state-based MCAS for the new assessment, which was developed by a consortium of states. “This study settles that, as much as it can be settled, and they’re both equally good,” says Peyser.
The weakness of the MCAS proficiency benchmark in predicting college readiness, says Peyser, is something that could be addressed by raising the passing standard on the test, a step that he says the state should take if the decision is to retain MCAS.
The study authors from Mathematica make much the same point. “Because the underlying scores on the MCAS and PARCC assessments are equally predictive of college outcomes,” they write, “Massachusetts policymakers have more than one option to align high-school mathematics test standards with college readiness: either adopt the PARCC exam, or continue using MCAS while simply setting a higher score threshold for college readiness.”
Mitchell Chester, the state commissioner of elementary and secondary education, says the state assessment should do more than just provide information on college readiness. It should also “mirror the kind of curriculum and instruction we want to see in our classrooms.”
On that score, he says, the feedback from educators he has received “is suggesting that PARCC is clearly requiring students to use higher order skills, to reason, to do research, and to do higher-level kinds of performance.”
Advocates of the Common Core curriculum standards and the assessments such as PARCC that were developed to test them say they require more of the critical thinking and problem-solving skills that students need to succeed in college and the workplace.
Peyser agrees that there are features of the PARCC test that seem to go beyond MCAS in ways that might be worthwhile. “The other side of the coin is that it’s not as if that’s rocket science,” he says. “MCAS could be improved to mirror some of those advantages PARCC has in test content.”
The principal arguments in favor of keeping MCAS, says Peyser, relate to maintaining control over the test at the state level, including decisions over the public release of exam questions after a round of testing.
Chester, appointed by the state education board during Gov. Deval Patrick’s administration, is closely linked to the PARCC exam. He serves as chairman of the governing board of state education commissioners overseeing PARCC. He will make a recommendation to the state education board, but he does not have a vote on the 11-member panel.
Chester says he is still “very actively” studying the issue and the information available on the two tests and has not yet made a decision on what he’ll recommend.
Peyser, a longtime confidant of Gov. Charlie Baker’s who was appointed by the governor to the top education post in his cabinet, has seemed more skeptical of switching to the new Common Core-based PARCC test. Peyser is a voting member of the education board.
Baker has held off, to date, from weighing in on the test decision. However, he opposed the state’s adoption of the Common Core State Standards in 2010, arguing that it was a mistake to trade the highly regarded Massachusetts standards for a set of curriculum benchmarks over which the state would not have the same control.
The board is slated to vote on the test issue at its meeting on November 17. “It would be great if at least at a leadership level we can come to some consensus as we approach the final meeting as to what the recommendation should be,” Peyser says. As for whether such an agreement is likely, “we’ll see,” says Peyser. “I’m sure there’ll be a huddle.”