I hear a lot of things about charter schools in Oakland, some good, some bad, some ugly. And while there are some hard numbers available from the OUSD Board’s engagement session on both student demographics and performance, we are missing critical data that ultimately informs how we think about what those numbers mean.
The key variable is choice, in one obvious way and one less obvious one.
Charter parents are different from non-charter parents in one key aspect—they made an educational choice. I am not saying that all charter parents are the most informed or most active or that there are not some very challenged and challenging parents in the charter sector. And it may be that they are more disadvantaged than the average Oakland parent, and not able to make a zip code choice to move to better schools. We just don’t know.
But that is a difference, and we need to analyze more clearly what this variable means for the overall equation.
The less obvious choice variable is in schools allegedly choosing students, “creaming” top students while returning more challenging ones to the District. I have covered this before in Getting Honest on Charter School Admissions and Catching Bad Actors, and I believe that this does happen; exactly why, and how much, are open questions.
It really does matter, though. If charters are taking top students and screening out more challenging ones, then it’s hard to argue that their superior academic outcomes are really school effects. And if the increased flexibility charters have is used for nefarious purposes, then their claim to that freedom is diminished, along with any apples-to-apples comparisons.
Public Data We Need
- Attrition rates for all schools, disaggregated by race and key subgroups (SpEd, EL, etc.)
- Suspension/expulsion rates for all schools, disaggregated
- Achievement gap data for all schools; this can be calculated from existing data
- Disproportionate discipline data for all schools; again, this can be calculated from existing data
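To make concrete the claim that the last two items can be calculated from existing data, here is a minimal sketch of the arithmetic involved. All of the numbers, group names, and field names below are invented for illustration; they are not Oakland figures.

```python
# Hypothetical sketch: an achievement gap (percentage-point difference in
# proficiency between two groups) and a discipline risk ratio (how many
# times more likely one group is to be suspended than another), computed
# from disaggregated school-level data. All values are made up.

def achievement_gap(proficiency_by_group, group_a, group_b):
    """Percentage-point gap in proficiency rates between two groups."""
    return proficiency_by_group[group_a] - proficiency_by_group[group_b]

def discipline_risk_ratio(suspensions, enrollment, group, reference):
    """Suspension rate of `group` divided by suspension rate of `reference`."""
    rate_group = suspensions[group] / enrollment[group]
    rate_reference = suspensions[reference] / enrollment[reference]
    return rate_group / rate_reference

# Invented example figures for one school:
proficiency = {"group_a": 0.62, "group_b": 0.24}
suspensions = {"group_a": 5, "group_b": 30}
enrollment = {"group_a": 200, "group_b": 300}

gap = achievement_gap(proficiency, "group_a", "group_b")
# 0.62 - 0.24 = 0.38, i.e. a 38-point proficiency gap

ratio = discipline_risk_ratio(suspensions, enrollment, "group_b", "group_a")
# (30/300) / (5/200) = 0.10 / 0.025 = 4.0: group_b students are suspended
# at four times the rate of group_a students in this invented example
```

Both quantities use only counts and rates that districts already report, which is the point: no new data collection is required, just consistent publication.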
Data I don’t think we have, but should create and publicize
- Individual student growth, year to year, disaggregated
- Uniform school culture survey data from students, staff, and families
- Peer groupings of schools based on student demographics for comparison (including charter and district schools)
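The peer-grouping idea above can be sketched simply: match each school to the schools whose student populations look most like its own, so charter and district schools are compared against similar peers rather than against the citywide average. The schools, profiles, and distance measure below are all hypothetical choices, not an actual methodology.

```python
# Hypothetical sketch of demographic peer grouping: find the k schools
# whose demographic profiles are closest (by Euclidean distance) to a
# target school's profile. All schools and numbers are invented.
import math

def peer_group(target, schools, k=2):
    """Return the k schools demographically closest to `target`."""
    others = [s for s in schools if s["name"] != target["name"]]
    return sorted(
        others,
        key=lambda s: math.dist(target["profile"], s["profile"]),
    )[:k]

# Profiles here are (% free/reduced lunch, % English learners, % with IEPs)
schools = [
    {"name": "School A (district)", "profile": (0.85, 0.40, 0.12)},
    {"name": "School B (charter)",  "profile": (0.82, 0.38, 0.08)},
    {"name": "School C (district)", "profile": (0.30, 0.05, 0.10)},
]

peers = peer_group(schools[0], schools, k=1)
# School B, not School C, is School A's closest demographic peer,
# so A's outcomes would be compared against B's
```

The design choice that matters is the profile: whichever subgroup percentages go into it determine what “similar schools” means, which is exactly why the groupings should be published and debated rather than computed privately.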
I came from NYC, where we had over 1,200 schools and 1.1 million students. Every school gave annual, high-quality surveys to staff, students, and families. Individual student growth was tracked year to year, and schools were rated on absolute performance, student growth, and student subgroup growth. Here is the NYC school report card, which has relatively robust information for families about both school performance and culture. Maybe not the most digestible format, but good data. Oakland could take a lesson.
The data here will not answer all of our questions, but it will get us talking the same language, and allow us to dig deeper, uncovering root causes, bad actors, and misaligned incentives.
Much of the debate in Oakland around charters is based on rumors—they may be true rumors, but it’s very hard to have substantive conversations based on second- or third-hand anecdotes. Good policy depends on public deliberation with good data. I think we could raise and focus the debate, as well as make better policy decisions, with better data.
So here is the latest demographic and performance data for Oakland’s sectors. What do these numbers really mean? I am not sure. Next year, let’s have better data, analysis, and answers.
|Race/ethnicity/group|Charter-run schools|District-run schools|
|---|---|---|
|Students with IEPs|7.8%|11.6%|
Student performance on the Smarter Balanced Assessments