[Just days after this blog went “live,” the Calif. Dept. of Education released some of the teacher-related data that I cite as missing-in-action for the last five years. This is a good start.]

In the face of the damage the gleeful Trump administration has been wreaking on the world, I’ll confess that I’ve had a hard time keeping my focus on the mismeasurement of K-12 education. The harm to humans caused by Trump’s White House and MAGA Republicans in Congress is so vast that it makes harm to data and evidence seem tiny. But these two varieties of harm, one small and one large, are related. When the House passed the budget bill on July 3, it provoked me to get back to work. On July 14, the Supreme Court gave the Trump team a green light to begin laying off 1,400 Dept. of Education staff, even before lower courts had ruled on the legality of this executive action. My anger has sparked me back to action.

The larger trend: falsehoods are battling facts, as evidence is disappearing

The sheer volume of lies and half-truths has made it difficult for many to find facts at all. Others have stopped searching; rather than try to sort fact from fiction, they have come to distrust all sources of information. To thicken the fog, the Trump administration has defunded practicing scientists, many of whom gather data and build it into evidence. The disciplines affected range from astronomy to microbiology, from weather forecasting to economic statistics. On July 3, the Washington Post ran a story with this headline: “Why some fear government data on the U.S. economy is losing integrity.” The story revealed that Commerce Secretary Lutnick wants to change how economic growth is measured and to cut the Bureau of Labor Statistics by about 8 percent. When he who holds the ruler also holds the funding, watch out.

Data damage in the U.S. Dept. of Education

Education data has been damaged as well. The Institute of Education Sciences (IES) and the National Center for Education Statistics (NCES) are two of the organizations that have been red-penciled out. Almost all staff have been fired: as of March 14, NCES had shrunk from 100 staff to 3, and IES from 175 to 20, according to the Hechinger Report. Even the nation’s report card, the test dubbed the National Assessment of Educational Progress (NAEP), has been cut back. To learn more, read this news story from FedScoop, “What the Dismantling of the Education Department Means for Its Data” (April 22, 2025).

Consider one example: EdFacts. In May, the U.S. Education Department (ED) announced it would not return to pre-COVID gathering of students’ detailed scores for every state. This means no scale score data, only proficiency counts: the proportion of students meeting or exceeding each state’s proficiency threshold. One recently fired Education Department leader, Susan Newman, commented to FedScoop: “My guess is the system will be shut down at any given time — I don’t know how it could still operate without the folks that were working on it and without an actual [transition] plan in place.”

This prompted Sean Reardon’s team at Stanford, creators of the Stanford Education Data Archive and its Educational Opportunity Explorer, to fire off a yellow-flare alert email asking that comments in opposition to this policy be sent to the Federal Register. That email explains the harm that results from this policy.

“This is a five-alarm fire, burning statistics that we need to understand and improve education,” said Andrew Ho, a psychometrician at Harvard University and president of the National Council on Measurement in Education, on social media.

Calif. Dept. of Education (CDE) has been cutting back on evidence for years

Despite working within the blue bubble of California, the CDE has followed a parallel course, retreating from its responsibility for stewarding data about K-12 education. The last year for which rich data about teachers was published: 2019.* Course-taking and class-size data were also last published in 2019. College-going data is now available only through the class of 2022.

The testing program introduced in 2015 (CAASPP) tells us less about student learning than its predecessor, the California Standards Tests (CSTs). The CSTs reported grade 2 results, which made baseline changes at grade 3 visible. More consequential, the CST suite of high school assessments covered courses in all four core areas. These were end-of-course tests, matched to the curriculum: every science course and every math course had its corresponding CST. Participation rates were a clear measure of students’ appetite for demanding coursework. Today the only high school subjects tested are science (10th grade) and ELA and math (11th grade). The ELPAC adds information of a different sort about emerging bilingual students. But the SAT and ACT have fallen out of fashion, and the CDE no longer reports this data through DataQuest.

Why is the CDE’s new growth measure off to a weak start?

In 2014, the CST and its accountability instrument, the Academic Performance Index, were retired. In their place, prompted by federal legislation, came a new test, California’s version of the Smarter Balanced assessment (SBAC). Although that new test, called CAASPP, abandoned all the end-of-course high school tests in the CST portfolio, it made possible what the CST could not: the measurement of growth. CAASPP is based on vertically equated scale scores, so grade-level progress can be inferred. But this benefit remained undeveloped until 2019, when the CDE finally released the first results. COVID then hit in 2020, which led the CDE to halt the calculation of growth scores. (The calculation requires three consecutive years of test results, so the COVID interruption took an understandable toll.)
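The mechanics can be sketched in a few lines. The scores below are invented for illustration (they are not real CAASPP data): on a vertically equated scale, a student’s growth is simply the change in scale score from one grade to the next, and, as noted above, the model needs three consecutive years of results before it can say anything.

```python
def growth(scores_by_year):
    """Year-over-year scale-score gains for one student.

    Works only because the scores sit on a single vertically equated
    scale; requires at least three consecutive years of results,
    mirroring the CDE's requirement.
    """
    years = sorted(scores_by_year)
    if len(years) < 3:
        raise ValueError("growth model needs three consecutive years of scores")
    return {years[i]: scores_by_year[years[i]] - scores_by_year[years[i - 1]]
            for i in range(1, len(years))}

# Hypothetical student, grades 3-5:
student = {2022: 2430, 2023: 2480, 2024: 2520}
print(growth(student))  # {2023: 50, 2024: 40}
```

End-of-course tests like the CSTs, each tied to a different curriculum, put scores on incomparable scales, which is why this subtraction was impossible before CAASPP.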

But now the CDE has re-released the growth scores for 2024, based on results from 2022, 2023 and 2024. Rather than put them at center stage of its accountability system, though, the CDE has kept its growth scores in the wings. While they appear on the Dashboard, the results have no consequence when the CDE assigns schools or districts extra “help.” Instead, a fatally flawed combo of year-to-year change and score level prevails as the CDE’s indicator of which schools or districts are soaring or diving. That indicator is rendered as a color, and it is that color that is front and center for school board trustees, principals, reporters, realtors, parents and the public.

I will leave you with a question and a hunch. The question: why have the CDE and the State Board kept their long-awaited growth measure in the wings? The hunch: could the growth measures conflict with the Dashboard’s assigned colors frequently enough to call the Dashboard’s credibility into question?

Stay tuned. I will soon publish a follow-up post on this topic, looking at the high-stakes implications of this question for charter schools.

___________
* The Calif. Dept. of Education released data about teacher education and experience, plus student/staff ratio data, soon after this blog post was published.