Seattle Education

For the news and views you might have missed

Study shows the Common Core PARCC test does not determine college readiness

Is the SBAC any different?

The PARCC tests have been criticized for being administered in high-stakes circumstances before they were validated. PARCC’s rejoinder is that the test has content validity, meaning it was built according to committee-reviewed specifications. But what is missing is predictive validity. That is, does the test validly measure the much-vaunted touchstone criterion of “College and Career Ready”? After all, that is the entire rationale for the testing emphasis in schools.

The following is an article posted in Wait! What? that breaks down the study.

Fail

The Common Core PARCC test gets an “F” for Failure

By Wendy Lecker and Jonathan Pelto

The entire premise behind the Common Core and the related Common Core PARCC and SBAC testing programs was that they would provide a clear-cut assessment of whether children were “college and career ready.”

In the most significant academic study to date, the answer appears to be that the PARCC version of this massive and expensive testing program is an utter failure.

William Mathis, Managing Director of the National Education Policy Center and member of the Vermont State Board of Education, has just published an astonishing piece in the Washington Post (Alice in PARCCland: Does ‘validity study’ really prove the Common Core test is valid?). In it, Mathis demonstrates that the PARCC test, one of two national Common Core tests (the other being the SBAC), cannot predict college readiness, and that a study commissioned by the Massachusetts Department of Education demonstrated the PARCC’s lack of validity.

This revelation is huge and needs to be repeated. PARCC, the common core standardized test sold as predicting college-readiness, cannot predict college readiness. The foundation upon which the Common Core and its standardized tests were imposed on this nation has just been revealed to be an artifice.

As Mathis wrote, the Massachusetts study found the following: the correlations between PARCC ELA tests and freshman GPA range from 0.13 to 0.26, and for PARCC math tests the range is 0.37 to 0.40. Mathis explains that correlation coefficients “run from zero (no relationship) to 1.0 (perfect relationship). How much one measure predicts another is the square of the correlation coefficient. For instance, taking the highest coefficient (0.40) and squaring it gives us .16.”

This means the variance in PARCC test scores, at their best, predicts only 16% of the variance in first-year college GPA. SIXTEEN PERCENT! And that was the most highly correlated aspect of PARCC. At the low end, PARCC’s ELA tests have a correlation coefficient of 0.13, which squared is roughly .02. This number means that the variance in PARCC ELA scores can predict only about 2% of the variance in freshman GPA!
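The arithmetic above is the standard coefficient-of-determination calculation: squaring a correlation coefficient r gives the share of variance in one measure that the other can predict. A minimal sketch of that calculation, using the figures reported from the Massachusetts study (the function name is ours, for illustration):

```python
def variance_explained(r: float) -> float:
    """Square a correlation coefficient r to get the percentage of
    variance in one measure predicted by the other (r-squared * 100)."""
    return round(r * r * 100, 1)

# Figures from the Massachusetts PARCC study as reported above.
print(variance_explained(0.40))  # best PARCC math correlation -> 16.0 (16%)
print(variance_explained(0.13))  # low end of PARCC ELA range  -> 1.7 (about 2%)
```

Even the strongest reported correlation, 0.40, leaves 84% of the variance in freshman GPA unexplained.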

Dr. Mathis notes that the PARCC test-takers in this study were college freshmen, not high school students. As he observes, the correlations for high school students taking the test would no doubt be even lower. (Dr. Mathis’ entire piece is a must-read: Alice in PARCCland: Does ‘validity study’ really prove the Common Core test is valid?)

Dr. Mathis is not an anti-testing advocate. He was Deputy Assistant Commissioner for the state of New Jersey and Director of its Educational Assessment program, and a design consultant for the National Assessment of Educational Progress (NAEP) and for six states. As managing director of the NEPC, Dr. Mathis produces and reviews research on a wide variety of educational policy issues. Previously, he was Vermont Superintendent of the Year and a National Superintendent of the Year finalist before being appointed to the state board of education. He brings expertise to the topic.

As Mathis points out, these invalid tests have human costs:

“With such low predictability, you have huge numbers of false positives and false negatives. When connected to consequences, these misses have a human price. This goes further than being a validity question. It misleads young adults, wastes resources and misjudges schools.  It’s not just a technical issue, it is a moral question. Until proven to be valid for the intended purpose, using these tests in a high stakes context should not be done.”

PARCC is used in New Jersey, Maryland and other states, but not in Connecticut. So why write about this here, where we use the SBAC?

The SBAC has yet to be subjected to a similar validity study. This raises several questions. First and most important: why has the SBAC not been subjected to a similar study? Why are our children being told to take an unvalidated test?

Second, do we have any doubt that the correlations between SBAC and freshman college GPA will be similarly low? No; it is more than likely that the SBAC is also a poor predictor of college readiness.

How do we know this? The authors of the PARCC study shrugged off the almost non-existent correlation between PARCC and college GPA by saying the literature shows that most standardized tests have low predictive validity.

This also bears repeating: it is common knowledge that most standardized tests cannot predict academic performance in college. Why, then, is our nation spending billions developing and administering new tests, replacing curricula, buying technology, textbooks and test materials, retraining teachers and administrators, and misleading the public by claiming that these changes will assure us that we are preparing our children for college?

And where is the accountability of these test makers, who have been raking in billions, knowing all the while that their “product” would never deliver what they promised, because they knew ahead of time that the tests would not be able to predict college-readiness?

When then-Secretary Arne Duncan was pushing the Common Core State Standards and their tests on the American public, he maligned our public schools by declaring: “For far too long, our school systems lied to kids, to families, and to communities. They said the kids were all right — that they were on track to being successful — when in reality they were not even close.” He proclaimed that with Common Core and the accompanying standardized tests, “Finally, we are holding ourselves accountable to giving our children a true college and career-ready education.”

Mr. Duncan made this accusation even though there was a mountain of evidence proving that the best predictor of college success, before the Common Core, was an American high school GPA.  In other words, high schools were already preparing kids for college quite well.

With the revelations in this PARCC study and the admissions of its authors, we know now that it was Mr. Duncan and his administration who were lying to parents, educators, children and taxpayers. Politicians shoved the Common Core down the throat of public schools with the false claim that this regime would improve education.  They forced teachers and schools to be judged and punished based on these tests.  They told millions of children they were academically unfit based on these tests. And now we have proof positive that these standardized tests are just as weak as their predecessors, and cannot in any way measure whether our children are “college-ready.”

The time is now for policymakers to stop wasting hundreds of millions of dollars, and thousands of school hours, on a useless standardized testing scheme, and to instead invest our scarce public dollars in programs that actually ensure that public schools have the capacity to support and prepare students for more fulfilling and successful lives.


3 comments on “Study shows the Common Core PARCC test does not determine college readiness”

  1. Pingback: From Seattle Education: Common Core PARCC test does not determine college readiness | The Forum of Washington State K14 Faculty Senate & Research Alliance

  2. skrashen
    May 31, 2016

    Studies showing grades are a good predictor of college success and that adding SAT scores does not increase the power of prediction:
    Bowen, W., Chingos, M., and McPherson, M. 2009. Crossing the Finish Line: Completing College at America’s Universities. Princeton: Princeton University Press.
    Geiser, S. and Santelices, M.V., 2007. Validity of high-school grades in predicting student success beyond the freshman year: High-school record vs. standardized tests as indicators of four-year college outcomes. Research and Occasional Papers Series: CSHE 6.07, University of California, Berkeley. http://cshe.berkeley.edu

  3. ciedie aech
    May 31, 2016

    After many years of working with low-income and mostly non-dominant-culture students, if anyone had simply asked, I could have told them that a student’s attendance and grades in high school, especially the last two years of high school, were a much better predictor of future college success. Test scores were often dead wrong.
