High school principals and leaders of unified and high school districts are about to trek through the data desert again. Another cycle of LCAPs and Differentiated Assistance and Technical Support and charter renewal hearings and more. And all of these are dependent upon interpreting signals of varying quality about student learning using a Dashboard whose logic rules are broken.

The stakes are highest for charter high schools. Their survival may hang in the balance. But for all high schools, their reputations are on the line.

The trek for elementary and middle school leaders is bad enough. But high school leaders face a challenge of a different sort. As you know all too well, the CAASPP math, science, and ELA tests are simply not given to high school students in consecutive grade levels. Lacking that, high school leaders have no way to estimate the learning gain of the same students over consecutive years. All that remains is a collection of snapshots: one-time test events. No estimate of learning gain can follow.* That’s why the CDE growth measure is available only for schools serving students in grades 4-8.

Even worse, the snapshot measure of 11th grade math achievement is given to all students regardless of whether they’ve taken the courses that cover the standards that might appear in the CAASPP math test. This violates common sense, as well as the bible for assessment professionals, the Standards for Educational and Psychological Testing (2014).

It wasn’t this bad in the prior accountability era. During the era of the Academic Performance Index (API), the prevailing test was the California Standards Test (CST). It was an end-of-course exam, given only to the students who took that course. There were CSTs for almost all high school courses spanning all core subjects. The benefit of this design was clear. Although learning gains year-to-year could not be inferred (the tests were not vertically equated), the CSTs did what they were designed to do: measure student mastery of the courses students studied at each grade level.

But the absence of evidence in the Dashboard about high schools goes well beyond artifacts of academic accomplishment.

College Enrollment Data Is Two Years Behind

Given this era’s inability to estimate the year-to-year gains of high school students, you’d think that the Calif. Dept. of Education would license the most current available data about college enrollment from the National Student Clearinghouse. No such luck. The most current data available today (September 8, 2025) is for the graduating class of 2022. For comparison, New York and other states provide this information through their accountability systems for the graduating class of 2023. (Correction: as of September 21, the enrollment data for the grad class of 2023 was finally posted.)

Even better would be reporting the rate at which a high school’s grads who attend CSU or UC are required to take remedial courses. Illinois reports its grads’ remediation rates. Why does the CDE choose not to?

Course Participation Data Is Missing in Action

The rate at which students choose to enroll in courses reveals a great deal about the school’s climate. Is it a school where it’s cool to strive, to study foreign languages, to study music or art, to take challenging math or science courses? How many students are enrolled in remedial courses? That information used to be readily available. It stopped with the 2018-19 academic year.

Class-Size Data Is Missing, Too

A principal has little authority in most districts to determine how many teachers will be in their budget. But a principal usually has considerable discretion in deciding who gets to teach, and what subject they’ll teach. The key lever: the master schedule. One consequence of a master schedule is the rate of out-of-field teaching. Another is average class size. This used to be reported in DataQuest for core course subjects, but that stopped with the 2018-19 academic year. It has never appeared on the Dashboard. Because class size is a proxy for working conditions, I’m surprised that the teachers unions have been willing to let the CDE stop reporting it. Note that class-size data is reported in SARCs, but is not available through DataQuest, Ed-Data, or in research files.

Staffing

How does teacher staffing vary by core course area? What about counselors and psychologists and librarians? High school principals in most districts can bargain with their district’s HR department, swapping certificated positions for classified positions. This results in dramatic variations in the number of counselors and librarians working in high schools. Why aren’t these data reported in DataQuest? Why are they absent from the Dashboard? They are certainly collected by the CDE.

Student Attendance

Yes, the CDE possesses attendance data, but includes in the Dashboard only the data for chronic absences. Why? Much more could be learned if continuous, granular measures of attendance were visible. Because revenue to districts results from attendance, this has a direct impact on funding. More details are needed. I offer kudos to the CDE for reporting continuous enrollment and stability. The differences among schools reveal much.

High School Principals and District Leaders Deserve More Evidence

Let’s start telling the CDE higher-ups we expect more evidence for high schools. I think the CDE should return to the higher standard of the API era, and fulfill its duties as stewards of public data about public schools – and stop keeping key data inside its castle walls. With the state superintendent of public instruction on the ballot in November 2026, isn’t it a good moment to voice our higher expectations?

_______________

* Unified school districts are the one exception. They can relate the CAASPP results of students who were in their districts in 8th grade and who remained enrolled through 11th grade three years later, and look at the difference in scale scores. By comparing those results to those of districts where students are highly similar (in parent education and FRPL especially), these district leaders could assert: “Compared to districts with students very much like our own, our district is doing (better / about the same as / worse) in contributing to our students’ academic growth in (math / ELA / science).”
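The footnote’s comparison can be sketched as a small calculation. Everything below is hypothetical — the scale scores, the peer-district gains, and the ±5-point band for calling a result “about the same” are invented for illustration, and the matching of “similar” districts on parent education and FRPL is assumed to have been done beforehand.

```python
def mean(xs):
    return sum(xs) / len(xs)

def cohort_gain(grade8_scores, grade11_scores):
    """Mean scale-score change for matched students.

    Both lists must hold the same students in the same order:
    grade8_scores[i] and grade11_scores[i] belong to one student
    enrolled in the district in both grade 8 and grade 11.
    """
    return mean([g11 - g8 for g8, g11 in zip(grade8_scores, grade11_scores)])

# Our district's matched cohort (hypothetical CAASPP math scale scores).
ours = cohort_gain([2550, 2572, 2601, 2588], [2612, 2630, 2665, 2641])

# Mean gains of comparison districts already matched on parent
# education and FRPL (hypothetical values).
peers = [55.0, 61.5, 58.2]

# Hypothetical 5-point tolerance for "about the same as".
if ours > mean(peers) + 5:
    verdict = "better than"
elif ours < mean(peers) - 5:
    verdict = "worse than"
else:
    verdict = "about the same as"

print(f"Our gain: {ours:.1f} points; peer mean: {mean(peers):.1f}; "
      f"we are doing {verdict} similar districts.")
```

With these invented numbers, the district’s mean gain (59.2 points) sits within five points of the peer mean (58.2), so the sketch would report “about the same as.” The tolerance band is a placeholder; a real comparison would want a defensible statistical test rather than a fixed cutoff.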