The dishonesty of the assessment debate

By Dale Chu

New Mexican students in grades 3 through 8 will be taking a new state assessment this upcoming spring. The “New Mexico Measure of Student Success and Achievement” — which also goes by the shorter though no more elucidating acronym “NM-MSSA” — replaces the PARCC test after it was scrapped with much fanfare.

These days, testing churn seems to be a competitive sport among states, and it’s certainly their prerogative to switch regardless of the suboptimal policy repercussions or the adverse effects on students and teachers. But the reasons generally offered for switching tests reek of disingenuousness, so much so that accepting them at face value requires a suspension of disbelief.

Consider the explanation recently given by a high-level state official for New Mexico’s decision to adopt NM-MSSA:

“It will be unique to New Mexico, and not a standardized consortium test. While still aligned to the Common Core State Standards, specifically, the test is different in two ways. First, the test has a significantly shorter time frame. Second, while leveraging some existing test items in order to allow for longitudinal data, the test will also incorporate New Mexico-developed test items that are culturally relevant and reflect the voices of our teachers and communities.”

First, it’s unclear what aspects of literacy and numeracy—the two areas assessed by PARCC and now NM-MSSA—are “unique” to New Mexico. Last time I checked, the ability to closely and attentively read texts or to grasp mathematical procedures was fairly universal in nature, with no state having a monopoly on either subject.

Second, what is a “standardized consortium test”? Do they mean a standardized test that’s administered in more than one state? Like the SAT, which New Mexico recently selected as its high school assessment?

Third, what’s “culturally relevant” and “reflective of New Mexico’s voices” when it comes to decoding, comprehension, multiplication, or fractions? Although New Mexico has the highest proportion of Latino residents of any state as well as a significant Native American population, there’s little about the assessment of reading and math that should differ as a result of the state’s demography.

But let’s take a look at the “significant” time savings. One of the excuses commonly proffered for switching tests is that it frees up more time for teaching and learning. (Never mind that assessment is integral to good instruction.) In New Mexico, students in grades K-6 are required to receive a minimum of 180 days or 990 hours of instruction annually. According to a state press release, the NM-MSSA is three hours shorter than PARCC (i.e., six hours versus nine). What’s an additional three hours in the context of an entire school year? A whopping 0.3% increase in instructional time.
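The back-of-the-envelope math is easy to check. A minimal sketch, using only the figures cited above (990 required hours, nine hours for PARCC, six for NM-MSSA):

```python
# Check the claimed instructional-time savings against the
# figures quoted above from the state press release.
annual_instruction_hours = 990  # minimum required, grades K-6
parcc_hours = 9                 # reported PARCC testing time
nm_mssa_hours = 6               # reported NM-MSSA testing time

hours_saved = parcc_hours - nm_mssa_hours
share_of_year = hours_saved / annual_instruction_hours

print(f"{hours_saved} hours saved = {share_of_year:.1%} of required instruction")
# prints "3 hours saved = 0.3% of required instruction"
```

Three hours against 990 rounds to three-tenths of one percent, which is where the 0.3% figure comes from.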

How about price? The per-student cost for PARCC in New Mexico was about $31; it’s about $39 for NM-MSSA. And this doesn’t take into account the time and effort expended by state personnel in service of the histrionics (e.g., “community conversations”) that are de rigueur whenever state agencies endeavor to make these sorts of changes.

State mandarins have jumped in with additional justifications, saying that PARCC was “high-pressure and counterproductive” while NM-MSSA will be “more meaningful,” “less burdensome,” and “better responsive to students’ needs.” This lavish trafficking in feel-good language masks an aversion to the accountability that high-quality assessments provide. To wit, in New Mexico and nationally, there has been a concerted effort to render accountability meaningless. Unless we’re honest about the underlying forces arrayed against testing, the outlook for a robust next-generation assessment system — one that remains muscular on equity and excellence — appears murky.

To ensure forward progress, New Mexico must focus on making NM-MSSA a high-quality assessment: one that continues to be aligned to state standards and provides critically important longitudinal data. The NM-MSSA item bank—which will be expanded over time—will be central to keeping the state’s assessment and accountability systems on track. Although the switch away from PARCC was less than ideal, there’s a real opportunity here to place an emphasis on test quality. Let’s hope state officials make the most of it.
