Blog

Why would any scholar oppose interim assessments?

Times are strange. Scholars in this post-Enlightenment era are supposed to favor reason and champion scientific inquiry. Evidence is the raw material for science. One would hope that reasoned arguments also rest on evidence. That's what tests produce. They make student learning visible. So why would any scholar oppose tests, particularly the interim assessments designed to measure the progress of [...]

March 16th, 2022 | Categories: assessment, research

Combining multiple measures of reading so teachers refer the right kids to Tier 2

I’m eager to share a story of how we used two different measures of emerging readers’ skills to gauge whether teachers are referring the right kids to Tier 2 reading support. But not before I blow off some steam. I’m just tired of hearing the mantra of “multiple measures.” It’s usually chanted as an admonishment, at times with a [...]

September 17th, 2021 | Categories: analysts versus practitioners, analytic methods, assessment, reading

Dyslexia screening is a litmus test of a district’s commitment to teaching all students to read

What is the implied promise in the contract between school districts and the parents who bring them their children to educate? The first promise is that their children will learn how to read. In some districts, that may come to pass. But in too many districts, only 6 or 7 or 8 out of every 10 students learn to read. [...]

August 28th, 2021 | Categories: dyslexia, legislation, reading

School and district plans require a long view of the progress of learning

While California district planning teams sweat the last weeks of LCAP work, many are missing recent measures of learning. Without the CAASPP, districts that lack solid interim assessment results may feel lost at sea. But the missing test results provide the ideal incentive to gain some perspective and take the long view on how your district has been doing. [...]

June 8th, 2021 | Categories: analytic methods, assessment, LCAP, planning

Are too many ed researchers talking mainly to each other?

At this year’s American Educational Research Association (AERA) conference, education scholars presented 2,188 papers, symposia, and roundtables, all of them online, of course. I was happy to “attend” without the cost of travel. But I was especially eager to scout new findings on two topics, dyslexia and the identification of English learners, relevant to the California districts my firm serves as an analytic partner [...]

May 12th, 2021 | Categories: conference, research

How far from the evidence should you stand in order to make sense of it?

If analysis is what you’re doing, you face some of the same decisions as when you view a work of art. How close to a painting should you stand? If you’re looking at a sculpture, from what vantage points should you look? If you stand too close, what you see may form a pattern. The pattern may look pleasing. It may [...]

March 30th, 2021 | Categories: analytic methods, applied analytics

Napa CoE is taking new steps to help its districts build better LCAPs

Barbara Nemko, Napa CoE’s superintendent, wants districts’ plans to get smarter, and she’s ready to shoulder more responsibility for making that happen. Already she’s invested in better evidence for planning, and in analytic support and coaching for her LCAP chief. And she’s welcomed district leaders [...]

February 19th, 2021 | Categories: LCAP, legislation, planning

Why writing an LCAP in this pandemic is an opportunity to transcend the status quo

I have a hunch that you may not be looking forward to starting your district’s new three-year LCAP. You’re up to your eyeballs with short-term planning for getting instruction delivered. Who’s teaching remotely? Who’s teaching in classrooms? How is remote assessment working? Who’s the best teacher in our elementary team at delivering instruction via Zoom? It makes three-year planning seem [...]

January 22nd, 2021 | Categories: district management, LCAP, planning

Let’s stop trying to turn teachers into analysts

In ten years, will historians look back at the 30-year effort to get teachers to interpret test data as a failure? I’m impatient. I don’t want to wait ten years. Although I’m no historian, here’s my verdict. Yes, it was a failure. This long push to make teachers do “data-driven decision-making” has flopped. But of all those who share responsibility [...]

November 25th, 2020 | Categories: assessment, district management, mismeasurement

Who’s gathering diagnostic evidence when schools open?

This is a story of connecting the dots, of diagnostic riddles. This is a story about the value of practicing science rather than teaching science. Yogi Berra, Florence Nightingale, and a doctor who cares for the homeless have some guidance for you. Dr. Jim O’Connell, a doctor working at homeless shelters in Boston, noticed an odd pattern. By the [...]

August 22nd, 2020 | Categories: analytic methods, COVID-19, district management

Prepare for surprises when you assess students in the fall

I am weary of reading predictions of pandemic-related learning loss. They all share two assumptions I question: (1) that missing days of school must mean losing knowledge and skills, and (2) that when students eventually take interim assessments in the fall, they will all show declines relative to their prior scores. Many of those asserting this learning loss cite no [...]

July 28th, 2020 | Categories: analytic methods, assessment, COVID-19

Measuring gaps the right way: Stanford’s Educational Opportunity Project

Although the California Dept. of Education’s Dashboard continues to mismeasure gaps, a team of social scientists at Stanford is interpreting gaps wisely. Meet Sean Reardon and his talented colleagues at Stanford’s Center for Education Policy Analysis, the Stanford Education Data Archive, and the Educational Opportunity Project. I’m not simply applauding the quality and quantity of research their team is producing [...]

June 16th, 2020 | Categories: analytic methods, gap analysis

Gap measures in the Dashboard are wrong

As education leaders declare their support for equal justice, and repeat their commitment to reduce the achievement gap, they are hard-pressed to answer the most direct questions about that gap in their own districts. “How big is the math achievement gap, based on students’ ethnicity?” “Does it grow larger or smaller as students move from elementary through middle school?” I [...]

June 11th, 2020 | Categories: accountability, dashboard, gap analysis, mismeasurement

Three dimensions to parent and student engagement well suited to this COVID-19 era

The moment in the fall when school opens will be a moment of reckoning for districts. They will learn to what degree parents and students have confidence in the district’s ability to minimize the risk of COVID-19 infection and to maximize learning opportunities. It’s a measure of parent engagement districts will take seriously. Funding will hang in the balance. [...]

June 4th, 2020 | Categories: accountability, applied analytics, COVID-19, parent engagement

The root of “accountability” is counting

What do the CDE and CDC have in common? They share a problem counting people correctly. The Centers for Disease Control’s (CDC’s) counting problem was that it was pressed by the White House to boost its testing numbers. So a deferential functionary at the CDC had the not-so-bright idea of merging two utterly different tests: the swab [...]

May 23rd, 2020 | Categories: accountability, dashboard, Policy, State Board of Education

Norms, test results and Lake Wobegon: a cautionary tale

As you put the finishing touches on your district’s LCAP or your site’s SPSA, I hope you’re valuing your interim assessments more highly. With this April’s CAASPP/SBAC testing cycle scrubbed, they’re all you’ll have to work with next year when drawing conclusions about student learning. Making sense of those results at the school or district level requires the same knowledge [...]

May 8th, 2020 | Categories: accountability, assessment, planning

Free of the CAASPP/SBAC, and free to create new evidence of student learning

The appearance of the COVID-19 virus has led to the sudden disappearance of the CAASPP/SBAC tests here in California. Other states have also lost their spring standardized tests. Is this a moment to dread the loss of one estimate of students' academic progress? Or is this a moment to welcome the use of other measures to estimate students’ level of [...]

April 24th, 2020 | Categories: assessment, district management, LCAP, planning

From virus infection rates to students’ test scores … How interested in the numbers are we?

Now that I’m stuck indoors, I’m reading more, especially news stories that attempt to squeeze meaning from numbers about infection rates, projected deaths, hospital beds, respirators … Yes, it’s gruesome, but I am curious to learn how everyone is trying to make sense of the data. Most everyone is wondering: “What do the numbers mean, and what should I do [...]

April 10th, 2020 | Categories: analysts versus practitioners, Policy

The Legislative Analyst’s Office’s Smart Ideas About LCAPs

Not far from the Capitol Building where legislators make laws, and not far from the CDE’s tower where higher-ups make policy, sits the Legislative Analyst’s Office (LAO). That’s where more than 50 smart and rational souls critique sloppy bills and clumsy policies, with only one concern in mind: whether those bills and policies are best for Californians. They look at [...]

March 12th, 2020 | Categories: gap analysis, LCAP, legislation

Would you use flawed evidence as the foundation of your school and district plans?

Supt. Fred Navarro at Newport-Mesa USD has taken a stand on the quality of evidence. He has told his planning teams to use no bad evidence. That includes demoting the Dashboard, as needed. Why is such a prudent, reasonable step noteworthy? Unfortunately, because the CDE has built a Dashboard that produces flawed conclusions from [...]

March 3rd, 2020 | Categories: dashboard, LCAP, planning