Follow up on Ed Waivers, Junk Rating Systems & Misplaced Blame – New York City’s “Failing” Schools

About a week ago, I put up a post explaining a multitude of concerns I have with the current NCLB waiver process and how it is playing out at the state level. To summarize, what we have here is the executive branch of the federal government coercing state officials to simply ignore existing federal statutes, by granting waivers to state officials who adopt the current administration’s preferred education reform strategies. Setting aside the legal/governance concerns, which are huge, few if any of these preferred strategies are informed by any sound research/analysis.

Equally if not more disturbing is how this waiver process is playing out at ground level, and the message it sends.  Once again (as in Race to the Top) the administration has encouraged the adoption of ill-conceived homogenized policy frameworks across states. States are encouraged, through the waiver application process, to propose how they will abuse data yielded by their generally inadequate data systems to inappropriately classify local public schools as a) in good standing, b) focus or c) priority.  States have been granted some flexibility as to how they will abuse their own data to gerrymander local schools into these categories.

Once schools are placed into these categories – regardless of any validity check on the meaning of those categories – those schools become subject to a prescribed set of largely unproven state intervention options, with yet another layer of complete disregard for whether states even have the statutory and constitutional authority to take such action.

In short, what we have is the federal executive branch  using authority it doesn’t have to grant states authority they don’t necessarily have, to unilaterally impose substantive changes on individual local public schools.

But I digress, yet again.

So, what about those categories? And how do they break down? In other words, which districts and which children are most likely to be subjected to these interventions/experimentation?

I started last week with the state of New York, pointing out that on average, New York State has (ab)used its currently available data to characterize as priority schools and focus schools, primarily those schools that are a) high in poverty, b) high in minority concentration, c) low in taxable property wealth and d) low in aggregate household income per capita.

To rub salt in the wound, even though back in 2003 New York State was ordered to correct deficiencies in school funding across districts, and even though the state itself proposed a relatively inadequate formula to address those disparities, the state has continued to ignore its own formula – shorting by the largest amounts those districts that are home to the most priority schools. (For a thorough analysis, see: https://schoolfinance101.com/wp-content/uploads/2010/01/ny-aid-policy-brief_fall2011_draft6.pdf)

This is where we last left off:

But, my analysis last week largely left out New York City schools. Bear in mind that New York City like many other high need districts around the state continues to be substantially shorted under the state’s own proposed foundation funding formula.

Demographics in New York City

First, let’s do a walk-through of the demographic characteristics of New York City schools by their classification under the new state rating system. Bear in mind that the degree of variation in demography across schools in New York City is somewhat more limited than across the state as a whole. On average, New York City schools have higher minority concentrations and higher low-income concentrations than schools in the state as a whole.

The following figures play out largely as one might expect – with priority schools having a) the highest concentrations of low income students, b) elevated concentrations of black and Hispanic students, and c) the highest concentrations of LEP/ELL and special education students.

In other words, it would certainly seem that while reformy-rhetoric dictates that demography should not determine destiny, demography remains a pretty strong determinant of a school’s post-nclb-waivery-classification-status.

When I run statistical tests of the relationship between demographic factors and the likelihood that a school is identified as a “priority” school, I find:

  1. A 1% increase in % Free Lunch (controlling for grade level) is associated with a 4.6% (p<.01) increase in the likelihood of being classified as a “priority” school.
  2. A 1% increase in % Black (controlling for grade level) is associated with a 1.2% (p<.01) increase in the likelihood of being classified as a “priority” school.
  3. A 1% increase in % Hispanic (controlling for grade level) is associated with a 2.1% (p<.01) increase in the likelihood of being classified as a “priority” school.
  4. A 1% increase in % Special Education (controlling for grade level) is associated with a 13% (p<.01) increase in the likelihood of being classified as a “priority” school.
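Percent changes like those in the list above are what fall out of exponentiating logistic-regression coefficients. Here is a minimal sketch of that conversion; the coefficients below are hypothetical values chosen to roughly reproduce the reported effects, not the actual estimates from these models.

```python
import math

def pct_change_in_odds(coef: float) -> float:
    """Percent change in the odds of "priority" status per one-unit
    increase in a predictor, given a logistic-regression coefficient."""
    return (math.exp(coef) - 1.0) * 100.0

# Hypothetical log-odds coefficients (NOT the actual model estimates),
# picked so the converted effects match those reported above.
coefs = {
    "pct_free_lunch": 0.045,    # -> ~4.6% per 1-point increase
    "pct_black": 0.0119,        # -> ~1.2%
    "pct_hispanic": 0.0208,     # -> ~2.1%
    "pct_special_ed": 0.1222,   # -> ~13%
}
for name, b in coefs.items():
    print(f"{name}: {pct_change_in_odds(b):+.1f}% change in odds")
```

The same arithmetic applies to any of the logistic models discussed in this post.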

Resources by Demographics in New York City

My next cut at the NYC data explores the position of priority schools with respect to a) low income students, b) special education students and c) per pupil spending (school level).

In the first two figures, we see that priority secondary schools (or at least those serving students through the secondary grades) are somewhat spread out by % free or reduced lunch. Note that bubble/diamond/triangle size indicates school size. However, there don’t appear to be any priority high schools among the lowest poverty schools, and a relatively large share appear among the higher poverty schools. They are relatively average, compared to schools with similar % free or reduced lunch, in terms of their school site spending.

In the second figure, we can see that those high schools identified as priority all have at least a minimum threshold of children with disabilities. But, all high schools with few or no children with disabilities are in good standing. That includes numerous relatively small high schools that also have much higher per pupil spending even though they have far fewer children in special education.

Priority middle schools also have relatively average per pupil spending compared to schools with similar concentrations of low income or special education students – but they all tend to have relatively high concentrations of low income children and children with disabilities.

None of the middle schools with lower concentration of low income children or children in special education are classified as priority schools.

For elementary schools, priority schools (again in red… but somewhat hidden behind others) tend to be very high in concentrations of low income students and moderately high in concentrations of children with disabilities. Again, they are relatively average in their per pupil spending compared to similar schools.

Outcomes in New York City

Now, the findings in the previous section might… might… on the outside chance suggest that there really is something about these priority schools that warrants additional investigation. After all, they do have similar resources to other schools serving high need populations. And while the priority schools tend to have high need populations, other schools with similarly high need populations and similar resource levels are either “focus” schools or “in good standing.”

But, it is still important to remember that the state has NOT identified as priority schools any schools that serve low need populations and do less well in terms of outcomes compared to other schools serving low need populations and with comparable resource levels.

For this next graph, I took the NYC teacher level value-added data and averaged them to the school level for all teachers in each school, as in my previous post on NYC charter school teachers (caveats included in that post). Note that I’ve removed quite a bit of the variation in these value-added scores by aggregating them to the school level prior to constructing this graph.
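Averaging to the school level is simple but consequential, since it shrinks the spread of the plotted scores. A stdlib-only sketch of the aggregation step, with made-up school IDs and scores:

```python
from collections import defaultdict

# Hypothetical teacher-level records: (school_id, value_added_score).
teacher_vam = [
    ("S001", 0.12), ("S001", -0.05), ("S001", 0.30),
    ("S002", -0.40), ("S002", -0.10),
    ("S003", 0.02),
]

def school_means(records):
    """Average teacher value-added to the school level, removing much of
    the teacher-to-teacher variation before plotting."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for school, score in records:
        sums[school] += score
        counts[school] += 1
    return {s: sums[s] / counts[s] for s in sums}

print(school_means(teacher_vam))
```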

While the differences in mean teacher value-added do fall in the right rank order – highest mean for good standing, second for focus and lowest for priority, the variations among schools around these means certainly muddy the waters a bit. Yes, the means are different, but the boxes overlap quite substantially!  And the consequences of being in one group versus another are quite substantial.

Shouldn’t it be the case that no priority schools have “better” average teachers (by the limited noisy measure used here) than schools in “good standing?” How does it make sense that there are schools in “good standing” that fall well below the average for teacher value-added of “priority” schools, even if those cases are relatively rare?

For one last statistical shot at teasing out what’s going on here, I ran a logistic regression to figure out a) whether and to what extent these differences in value-added are significant predictors of landing in priority status and b) whether a school that has more low income and minority students is more likely to land in priority status – even if it has the same teacher value-added scores?  That is, even if those statistically “bad” teachers aren’t to blame!

In other words, is there racial/socio-economic bias in the school ratings among schools with similar teacher value added?

Here it is:

For interpretation, I used the average value-added percentile of teachers here (in place of the standardized value-added score).

What this output shows is that for a 1 percentile rank increase in the school average teacher value added, a school is about 8.4% less likely (91.6% as likely) to be classified as priority. Having a higher aggregate value-added teacher percentile rank significantly reduces the likelihood that a school is identified as priority. That makes sense… but certainly isn’t the whole story… and as the graph above shows, there’s a fair amount of variation within each category.
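For intuition, that 8.4% drop corresponds to an odds ratio of about 0.916 per percentile rank, and odds ratios compound multiplicatively across ranks. A quick back-of-the-envelope sketch (the 0.916 figure is from the model output above; everything derived from it here is illustrative):

```python
import math

# Each 1-percentile-rank increase in school-average teacher value added
# makes "priority" status about 91.6% as likely.
odds_ratio_per_rank = 0.916
logit_coef = math.log(odds_ratio_per_rank)   # implied coefficient, ~ -0.088

# Odds ratios compound, so a 10-percentile gap between two schools implies
# roughly a 58% reduction in the odds of priority status.
ten_rank_gap = odds_ratio_per_rank ** 10     # ~ 0.42
print(f"implied coef {logit_coef:.3f}; 10-rank odds ratio {ten_rank_gap:.2f}")
```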

Here’s the problem. Even among schools that have the same aggregate teacher value added percentile:

  1. schools with 1% higher free or reduced price lunch are 4.4% more likely to be classified as priority
  2. schools with 1% more black children are 8.7% more likely to be classified as priority
  3. schools with 1% more Hispanic children are nearly 8% more likely to be classified as priority
  4. schools with 1% more children with disabilities are 9% more likely to be classified as priority.

And each of these biases is significant, and of non-trivial magnitude, as well.

Let’s review what this means in simple, blunt terms.

These findings mean that even in schools where the teachers have the same average value added rank/percentile, schools with more low income, minority and special education children are more likely to face anti-democratic intervention!!!!!

Now that doesn’t seem to make a whole lot of sense when the supposed reason for the need for these interventions is that these poor, minority schools simply have all of the “bad teachers,” and when a central strategy to be employed is to replace/displace the school staff.

Closing Thought

I can hear the reformy outcry that this whole multi-level coercive illegal power grab to impose reformy intervention is in fact a critical step toward guaranteeing that demography isn’t destiny… to make widely known the fact that we’ve continued to provide minority and low income students the worst teachers and the worst schools. And that those teachers and schools must go [even if many of them outperform teachers in schools in “good standing” on their noisy/biased VAM estimate?]! Even if that means complete disregard for our current system of government. And even if that means that the parents, children and employees in these schools have to be forced to forgo additional constitutional and statutory protections (depending on the imposed reform).

It would be one thing if the measurement systems we were using to classify these schools as “failing” were valid for making such decisive declarations. That is, valid for making the assertion that it is the quality of the school and its teachers – and other controllable factors – that are primarily responsible for the performance.

Such arguments would be more reasonable if the disparate impact shown here were actually a disparate impact of school quality variation rather than a disparate impact of the biased application of school ratings, constructed from inadequate measures and inappropriate analysis (utter disregard for error, bias, etc., etc., etc.).

Ed Waivers, Junk Rating Systems & Misplaced Blame: Case 1 – New York State

I hope over the next several months to compile a series of posts where I look at what states have done to achieve their executive-granted waivers from federal legislation. Yeah… let’s be clear here that all of this starts with an executive decision to outright ignore, and intentionally and explicitly undermine, federal legislation. Yeah… that legislation may have some significant issues. It might just suck entirely. Nonetheless, this precedent is a scary one both in concept and in practice. Even when I don’t like the legislation in question, I’m really uncomfortable having someone unilaterally override or undermine it. It makes me all the more uncomfortable when that unilateral disregard for existing law is being used in a coercive manner – using access to federal funding to coerce states to adopt reform strategies that the current administration happens to prefer. The precedent at the federal level that legislation perceived as inconvenient can and should simply be ignored seems to encourage state departments of education to ignore statutory and constitutional provisions within their states that might be perceived as similarly inconvenient.

Setting all of those really important civics issues aside – WHICH WE CERTAINLY SHOULD NOT BE DOING – the policies being adopted under this illegal (technical term – since it’s in direct contradiction to a statute, with full recognition that this statute exists) coercive framework are toxic, racially disparate and yet another example of misplaced blame.

States receiving waivers have generally followed through by using their assessment data in contorted and entirely inappropriate ways to create designations of schools and districts, where those designations then permit state officials to step in and take immediate actions to change the governance, management and whatever else they see fit to change in these schools (and whether they have such legal authority or not).

Priority schools are the bottom of the heap, or bottom 5%, and are subject to the most aggressive and most immediate unilateral interventions (seemingly with complete disregard for existing state statutory or constitutional rights of attending children, their parents or local taxpayers, as well as explicit disregard for existing federal law).

Implicit in these classifications – and the proposed response interventions – is the assumption that priority schools are simply poorly run schools – schools with crummy leaders and lots of bad, lazy, pathetic and uncaring teachers… who have thus caused their school to achieve priority status. They clearly must go… or at least deserve one heck of a shaking up! Couldn’t possibly be anyone else’s fault. After all, the state must have clearly already done its part to provide sufficient financial resources, etc. etc. etc. It must be the bad teachers and crappy principals. That’s all it can be! Therefore, we must have immediate wide-reaching latitude to step in and kick out the bums – and heck – just close those schools and send those kids elsewhere, or convert those schools to “limited public access, privately governed and managed institutions” (privately managed charters) where layers of constitutional rights for employees and students may be sacrificed.

New York State’s Waiver Hit List

New York State Education Department released their hit list of schools recently.

http://www.p12.nysed.gov/accountability/ESEADesignations.html

Here’s a quick rundown of some key characteristics of schools and districts under each designation.

Demographics of Hit List Schools Statewide

I did a quick merge of the above classification data with data from the 2011 NYSED school report cards to generate the graph below, weighted by school enrollment.

https://reportcards.nysed.gov/
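The merge itself is mechanical: join the designation list to the report-card file on a school identifier, then take enrollment-weighted averages within each designation. A stdlib sketch with made-up records standing in for both files:

```python
# Made-up records standing in for the NYSED designation list and the
# 2011 report-card file; real data would be keyed by a school ID code.
designations = {"S1": "priority", "S2": "good standing", "S3": "priority"}
report_cards = {  # school -> (enrollment, proportion free lunch)
    "S1": (500, 0.92),
    "S2": (800, 0.35),
    "S3": (300, 0.88),
}

def weighted_pct_free_lunch(status: str) -> float:
    """Enrollment-weighted free-lunch share among schools with a status."""
    rows = [(enr, pfl) for school, (enr, pfl) in report_cards.items()
            if designations.get(school) == status]
    total = sum(enr for enr, _ in rows)
    return sum(enr * pfl for enr, pfl in rows) / total

print(round(weighted_pct_free_lunch("priority"), 3))       # -> 0.905
print(round(weighted_pct_free_lunch("good standing"), 3))  # -> 0.35
```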

Notably, schools in “good standing” are lowest BY FAR in percent of children qualified for free lunch and percent of children who are black or Hispanic, and are also generally lower in percent of children who are limited in their English language proficiency. Race and poverty differences are particularly striking!

In short, the Obama/Duncan administration has given NY State officials license to experiment disproportionately on low income and minority children – or for that matter – simply close their schools. No attempt to actually legitimately parse “blame” or consider the possibility that the state itself might share in that blame.

AFTER ALL, NEW YORK STATE CONTINUES TO MAINTAIN ONE OF THE MOST REGRESSIVE STATE SCHOOL FINANCE SYSTEMS IN THE COUNTRY! 

And the underlying disparities in that system are quite striking!

But perhaps, if we just require all of these priority schools to be turned over to charter operators from New York City, they can work their miracles statewide – for less money – with the same kids – and generating decisively better outcomes????? And of course, dramatically reducing labor costs??? Okay… so the data don’t really support any of that.

Economics of Hit List School Districts Statewide

Here’s another perspective on the districts that house schools across these categories. New York State’s school funding model derives a combined wealth ratio based on two key factors that seem to be strong determinants of the local revenue those districts can raise on their own – Income per Pupil (aggregate income of residents divided by district enrollment) and Taxable Assessed Property Values per Pupil. Here’s how these two measures play out across categories (excluding New York City schools, whose income and property wealth a) are difficult to compare accurately with the rest of the state and b) weigh disproportionately on the comparisons).

Say it ain’t so? Really, are the districts of schools that are in “good standing” that much higher income and that much higher in taxable property wealth than the schools that are identified as priority schools? Couldn’t be. The economics of these local communities clearly has nothing to do with it? Does it? It’s those damn lazy, apathetic teachers and their greedy overpaid administrators!

State Funding of Hit List School Districts Statewide

A while back, I wrote this brief:  NY Aid Policy Brief_Fall2011_DRAFT6

In the brief, I explain the layers of problems of the New York State foundation aid formula. I also wrote this blog post: https://schoolfinance101.wordpress.com/2011/11/08/more-inexcusable-inequalities-new-york-state-in-the-post-funding-equity-era/

In the brief, I explain how the current New York State school foundation aid formula is hardly equitable or adequate for meeting the needs of children attending the state’s highest need districts. But to rub salt in the wound – FOR THE PAST SEVERAL YEARS, THE GOVERNOR AND LEGISLATURE HAVE CHOSEN TO DISREGARD ENTIRELY THEIR OWN WOEFULLY INADEQUATE STATE AID FORMULA.

Even worse, when the Governor and Legislature have levied CUTS TO THAT FORMULA, they have levied those cuts such that they disproportionately cut more state aid per pupil from the higher need districts. As of 2011-12, some high need districts including the city of Albany had shortfalls in state funding (from what would be expected if the foundation formula was actually funded) that were greater than the total foundation aid they were actually receiving.

So, here’s one last graph for my statewide analysis – in which I summarize the foundation formula shortfalls per pupil by district, for the schools in each class. The foundation formula shortfall compares:

  1. What should be: full foundation formula aid that would be received per total aidable foundation pupil unit, using the 2011-12 Foundation Level (on Page 21, here: http://www.oms.nysed.gov/faru/PDFDocuments/Primer12-13A.pdf) and multiplying that foundation level times the regional cost index and pupil need index (and then determining the state share per the formula as described in the above linked document). To be clear, this is the “what should be” target, but it is based on last year’s target. So I’m being generous to the state, because the foundation level should have gone up again from 2011-12 to 2012-13 (see p. 21)
  2. What is: actual state foundation aid to be received from 2012-13 Aid Runs, with foundation aid adjusted for Gap Elimination Adjustment and partial restoration of GEA.
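Put together, the per-pupil shortfall is “what should be” minus “what is.” A simplified sketch of that computation follows; all inputs below are hypothetical, and the real formula has additional adjustments, phase-ins, and caps.

```python
def foundation_shortfall_per_pupil(
    foundation_level: float,      # statewide per-pupil foundation amount
    regional_cost_index: float,   # RCI adjustment
    pupil_need_index: float,      # PNI adjustment
    state_share_ratio: float,     # state's share of the adjusted amount
    actual_aid_per_pupil: float,  # aid received after GEA adjustments
) -> float:
    """'What should be' minus 'what is', per aidable pupil unit."""
    should_be = (foundation_level * regional_cost_index
                 * pupil_need_index * state_share_ratio)
    return should_be - actual_aid_per_pupil

# A hypothetical high-need district: large gap relative to aid received.
gap = foundation_shortfall_per_pupil(6200.0, 1.2, 1.6, 0.80, 6000.0)
print(round(gap, 2))  # prints 3523.2
```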

So… here are the funding gaps by status, with respect to the state’s own inadequate funding targets:

So, with respect to its own formula, the state is pretty much underfunding everyone. Of course, as I’ve noted on many previous occasions, the state is also dumping disproportionate unneeded tax relief aid into the wealthiest districts, which also happen to have the schools in “good standing.”

What we have here is that the state is most substantially underfunding – WITH RESPECT TO ITS OWN FUNDING FORMULA – the districts that house the majority of children enrolled in schools that the state has identified as “priority” schools.

Hey… here’s an idea for ya’… WHY NOT MAKE IT A PRIORITY TO ACTUALLY FUND THESE SCHOOL DISTRICTS AT AT LEAST THE LEVEL THAT YOU, THE STATE, HAVE DECLARED ADEQUATE FOR THEM?

Is it reasonable to play this subversive “blame the teachers and administrators” game and kick them out and close their school when it is you… the STATE that has shorted them disproportionately on funding – FOR YEARS – with respect to your own funding formula benchmarks?

We can discuss the adequacy of those funding benchmarks another day. (read the brief: NY Aid Policy Brief_Fall2011_DRAFT6)

Additional analysis of New York City schools hopefully in the near future!

What do the available data tell us about NYC charter school teachers & their jobs?

This post is about rolling out some of the leftover data I have from my various endeavors this summer. These include data from New York State personnel master files (PMFs) linked to New York City public schools and charter schools, NYC teacher value-added scores, and various bits of data on New York City charter and district schools, including school site budget/annual financial report information.

Here, I use these data combined with some of my previous stuff, to take a first, cursory shot at characterizing the teaching workforce of charter school teachers in New York City. All findings use data from 2008 to 2010.

To summarize the following figures, New York City charter school teachers:

  1. Are relatively inexperienced (but not all in their first 3 years)
  2. Are young (but not all 22 or 23 years old)
  3. Have longer contract years
  4. Are paid well for their experience, with a portion of the additional pay covering the additional time
  5. Have smaller classes to teach
  6. Work in schools that spend much more than surrounding district schools
  7. Work in schools that serve much less needy student populations than surrounding district schools
  8. Have 4th grade students with relatively “average” to below average scale score outcomes compared to schools serving similar populations
  9. In some cases, have 8th grade students with high average scale score outcomes compared to schools serving similar populations
  10. Where data were available, have value-added scores which vary from the citywide average in both directions, with KIPP being the lowest and Uncommon schools the highest (in the aggregate). Notably, Uncommon schools also have consistently smaller class sizes and the fewest low income students.

That’s it. Nothing particularly surprising. Nothing astounding. No miracles, but, a subset of schools that are in some ways different from district schools. Further, they are different in ways that perhaps aren’t that sexy… aren’t that “reformy.” Salaries rise with experience and they just happen to rise faster than district school salaries in some cases. Class sizes are small… especially at the middle school level… even though we are often told that class size reduction is only important in early grades. And, well, the outcomes are kind of mixed.

Experience & Age

The first figure shows that the typical NYC charter school teacher had about 6 years of experience in 2010 (some anomalies occur in the 2008 data). NYC district school teachers have about double that. But, it should be noted that some of this difference is likely explained by the average age of the schools themselves, in addition to higher turnover (a topic for a later date).

One would expect that as the schools mature, their staff will also mature somewhat – unless they actually make a concerted effort to excess teachers beyond a specific experience level. If they do begin to accumulate more experienced staff, they will also accumulate the higher expenses associated with a more experienced staff and accumulated retirees (assuming any stay long enough to retire from these schools).

The average charter school teacher is between 25 and 30 years old, compared to the average district school teacher being around 40.

I’ll admit, I’m having some trouble reconciling average years of experience being around 6 and age around 25, but it would appear for the most part that the experience and age line up such that most went directly into teaching from their undergraduate studies. That is, there aren’t likely many later career changers here.

Contracts & Compensation

Personnel master file data report that teachers in Harlem Village Academies, Achievement First, KIPP, Uncommon and Lighthouse schools are on 11- and 12-month contracts. Harlem Children’s Zone was similar but did not report 2010 data. By contrast, BOE contracts were 10 months (according to averages taken from full time individual teachers in the data).

Average salaries were highest in 2010 in KIPP schools, with BOE schools second, and other high profile charter chains not far behind. However, these averages are a function of degree levels, experience levels and contract months.

So, we can use a regression model to isolate those effects and compare “otherwise similar” teachers, and to determine the wage differential associated with the additional contract months (to the extent that contract months vary within the charters).

This next figure uses a regression model to compare the salaries of “otherwise similar” teachers by job classification, full time status, degree and experience level. On average, a teacher with all of the same attributes can make a salary that is $8,000 higher in an Uncommon school and nearly $6,000 higher in an Achievement First school. Teachers in HCZ and KIPP schools make an annual salary about $4,000 higher than those in BOE schools. Controlling for months worked, KIPP and HCZ salaries are comparable to district salaries, while Uncommon and Achievement First salaries remain higher.
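A rough feel for the contract-month control can be had without the full regression: just compare salary per contract month. The salaries and contract lengths below are illustrative only; the actual comparison is a regression that also controls for degree, experience and job classification.

```python
# Illustrative (made-up) annual salaries and contract months by sector.
teachers = [
    ("BOE",      62000, 10),
    ("KIPP",     68200, 11),
    ("HCZ",      68200, 11),
    ("Uncommon", 72000, 11),
]
per_month = {sector: salary / months for sector, salary, months in teachers}
for sector, rate in per_month.items():
    print(f"{sector}: ${rate:,.0f} per contract month")
```

With these made-up numbers, KIPP and HCZ come out even with BOE on a per-month basis while Uncommon retains a premium, mirroring the pattern described above.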

I’ve often asserted that many teachers might be willing to/interested in working a longer year for a higher annual salary. After all, it might be desirable to earn more doing what you do best and are professionally trained to do, rather than searching for other part time Summer work that does not always take advantage of your expertise.  That may not be the case for all teachers, but would likely be the case for at least some (as it is for these charter teachers). More pay for more time. Makes sense.

The following figure uses the regression model to lay out the predicted salaries for a teacher in 2009 across experience levels. Notably, much like BOE school salaries, NYC charter teacher salaries do increase with experience. In particular, consistent with what I’ve shown previously for salaries in Texas, New Jersey and Connecticut high profile charters, Achievement First and Uncommon schools pay what appears to be a significant salary premium. None pay significantly less than district schools, and none are flat with respect to experience (like Gulen school salaries in NJ or TX). KIPP and others track for the most part with district salaries.

Resources & Working Conditions

Now, not to beat an issue into the ground, but the following figure summarizes the per pupil operating expenditure differences between NYC charters and district schools (with the data becoming cleaner, more accurate and precise each time I take a swipe at them). These margins of difference are ever so slightly higher than those in my previous study, but the model is similar. High profile charters in NYC substantially outspend district schools serving similar student populations.

I’ve addressed critiques of these figures previously and have provided sensitivity analysis and discussion here: https://schoolfinance101.wordpress.com/2012/05/07/no-excuses-really-another-look-at-our-nepc-charter-spending-figures/

The full length report on these and related figures is here:

  • Baker, B.D., Libby, K., & Wiley, K. (2012). Spending by the Major Charter Management Organizations: Comparing charter school and local public district financial resources in New York, Ohio, and Texas. Boulder, CO: National Education Policy Center. Retrieved [date] from http://nepc.colorado.edu/publication/spending-major-charter.

Now, for this next figure, I use a regression model to compare the demographics of NYC charter schools to district schools, where the dependent measure is the population characteristic, and the independent measures include a) grade range served, b) year of data and c) borough of location of the school. That is, these comparisons are of the student population compared to same grade level schools in the same borough.

Uncommon schools in particular have uncommonly low rates of the lowest income children. This is typical in New Jersey as well (for North Star Academy in Newark). They also have low LEP/ELL rates, but so do they all. Achievement First and KIPP schools also have low rates of the lowest income children – quite a bit lower than district schools. Success Academies in particular have very low rates of LEP/ELL children. All have relatively low rates of children with disabilities.

These differences – when combined – make for very different school environments than for district schools in the same borough.

Now, despite serving much less needy student populations, these charter schools tend to have much smaller class sizes than district schools serving the same grade levels. HCZ schools have small elementary and middle school class sizes – from six to eight fewer students in a class than district schools. Uncommon schools have very small middle grades classes, as do Village Academies and Achievement First schools. Interestingly, despite their financial dominance, KIPP schools have smaller class sizes than district schools, but not to the extent of others (note that some of KIPP’s financial resources are allocated to such endeavors as KIPP to College).

A really interesting twist here is that the emphasis on smaller classes appears to be at the middle school level.  My gut instinct as a former middle school science teacher tells me that this makes sense. But, if one sticks strictly to the strongest findings in research literature, one would expect targeting at the elementary level and would have less justification for targeting small classes to the middle level. But perhaps small middle school class sizes are actually part  of the secret sauce of successful charter schools?

NYSED School Report Cards: https://reportcards.nysed.gov/

Outcomes

Now, with smaller class sizes, higher paid teachers working longer years, and less needy students, we would not only expect a higher average performance level, but we'd also likely expect some higher average rates of gain in these schools. And lottery based studies of NYC charters have revealed some positive findings of differences in performance between lotteried-in and lotteried-out students. It's critically important to understand that while the sorting process may have been randomized in these studies, the contexts – peer groups, etc. – into which students have been sorted are anything but. Again, different peers, different levels of resources – class sizes, length of year, teacher salaries, etc. And some of this stuff is the stuff of difference that is of particular interest for setting policy.

The figures I offer below are merely descriptive. Again I use regression models to compare the outcomes of schools that are similar in terms of students served. But regression models in this case are used for descriptive purposes (as they often are). I’m really just describing the average difference in outcomes between the charters and schools serving similar populations. In the third slide below, I do use the average teacher value-added outcomes for those schools – but note that very few teachers in any given NYC school actually have sufficient numbers of students in tested grade levels for generating the value added estimates.

In the first figure here, we see that 4th grade assessment performance in many NYC charters is, on average, lower than district schools. The vertical axis indicates the number of points (around a mean near 680) above or below the mean of similar district schools. Most differences are well under one standard deviation (which is about 17 points – and a full standard deviation would be a pretty big difference; again, these are levels, not gains). 4th grade math scores are higher in Uncommon and Achievement First Schools. Most other 4th grade scores are similar to or lower than district schools. In v1 (version 1) models, I use % free or reduced price lunch (which varies little across many of these schools), and in Version 2, I use percent free lunch only. Charter schools compare less favorably when I evaluate them on the basis of their free lunch populations alone.
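To put those point differences in perspective, here's the conversion implied above – points divided by the roughly 17-point standard deviation (illustrative gap values, not estimates from the figures):

```python
# Convert scale-score point gaps to standard deviation units, using
# the roughly 17-point SD noted in the text (gap values illustrative).
sd = 17.0
for gap_pts in [2.0, 5.0, 10.0]:
    effect_size = gap_pts / sd
    print(f"{gap_pts:+.0f} pts = {effect_size:+.2f} SD")
```

So even a 10-point gap – large relative to most of the differences in the figure – is only about 0.6 standard deviations.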

Uncommon Schools and Village Academies appear somewhat stronger on their eighth grade performance, while others are a mixed bag. Again, these are level – scale – scores, relative to schools serving similar populations. Now, there has been at least some discussion of attrition as a factor in Village Academy performance. I've personally found attrition to be a major issue in Uncommon's Newark school, North Star. But that's an issue for another day. Matt Di Carlo over at Shankerblog did a nice explanatory piece on the role of attrition the other day.

Here, what we have are average scale scores that are quite a bit higher than “demographically similar” schools. That said, when I account for free lunch instead of free or reduced lunch, those differences are somewhat muted. That is, when I compare against schools with comparable shares of the lowest income children, positive charter performance differences are muted and negative ones larger.

Again, Harlem Chidlrens’ Zone scores are relatively low, especially when comparing on the basis of fee lunch (v2 models). KIPP scores are pretty much comparable to demographically similar schools and Achievement First and others a mixed bag.

Finally, this figure shows the average value-added scores for teachers in these schools. Uncommon Schools are the only ones that appear to have noticeably above average teachers. One might stretch the data here beyond their capacity to argue that perhaps Uncommon schools are getting results for the premium they pay their teachers… and the fact that they generally provide those teachers with smaller classes. Not an entirely unreasonable story, though a) the underlying dynamic is likely far more complex than this and b) the value added metrics used here are anything but stable and/or decisive.

NYC Teacher Value Added Scores: http://www.ny1.com/content/top_stories/156599/now-available–2007-2010-nyc-teacher-performance-data#doereports

Implications

So, what does this all tell us? Perhaps not much… while at the same time, quite a bit more than we may have already known. At the very least, these data should serve to clear up common misconceptions & reformy misrepresentations of the NYC charter sector.

No… these schools are not staffed by Peace Corps-like, minimum wage missionaries (not that I've really heard this one all that much). They're getting paid reasonably, and getting paid a premium for at least some of the extra time and effort. Is it enough to sustain the model? Can these schools afford this down the road? Who knows?

No… these schools are NOT proving that money and class size don’t matter.  That would be difficult to prove with a subset of schools that for the most part spend more than district schools and provide smaller class sizes.

And No… these schools are not totally and invariably kicking butt on all student outcome measures, be they performance level measures or performance gains.

And finally… No… these schools are not doing it all… with the same kids!

But this isn’t just a laundry list of stuff to try to hold against these schools. Rather, it’s an attempt to lay out more clearly with publicly available data, what’s going on here, in an effort to move the conversation forward, beyond the usual talking points and into stuff that may really matter!

Helicopters can improve minority college attendance & other misguided policy implications: Comments on the Brookings “Voucher” Study

Here’s my quick response to the Brookings report released yesterday on the long term effects of vouchers on a randomized pool of participants in New York City.

Let’s say I conducted a study in which I rented a fleet of helicopters and used those helicopters to, on a daily basis, transport a group of randomly selected students from Camden, NJ to elite private day schools around NJ and Philadelphia. I then compared the college attendance patterns of the kids participating in the helicopter program to 100 other kids from Camden who also signed up for the program but were not selected and stayed in Camden public schools. It turns out that I find that the helicopter kids were more likely to attend college – therefore I conclude logically that “helicopters improve college attendance among poor, minority kids.” The simple policy solution then is to rent more helicopters and use them to send kids, well, wherever. After all, it’s the helicopters that matter????? Clearly, that would be a ridiculous assertion.

Similarly, the “major” findings of the new Brookings study were that, in particular, black participants in the voucher program seemed to have an increased likelihood of attending college. The study involved a randomized pool of individuals who qualified, applied and received the vouchers (and attended a private school) and those who qualified, applied and didn’t receive the vouchers.

The study purports to find [or at least the media spin on it] that "vouchers" as a treatment, worked especially for black students. I won't spend my time quibbling over design and statistical issues here, because I think the simplest issue to address – the big one – is the definition of the treatment itself.

This is not necessarily a study of whether “vouchers” as a treatment affect long run outcomes. Just like my hypothetical above had little to do with helicopters! Rather, it is a study of using “vouchers” as a funding mechanism to place a relatively small sample of low income minority children into a set of schools with fewer low income and minority peers. Schools that happen to be private schools. So it’s not really a study of whether “private schools” are more (or less) effective either. As such, the study really has little or no policy implications for “voucher systems” themselves, or private schooling.

Personally, I was struck to find that the only reference to peer group or peer composition was in a single sentence at the end of the report – but this sentence really says it all:

To the extent that student learning is dependent on peer quality the impacts reported here could easily change.

Yeah… that’s no throwaway line here! In fact it has the potential to completely re-frame the entire paper.

So what is the treatment?

Well, the use of a “voucher” system to alter the educational setting for a group of kids is most certainly not the treatment. Voucher is merely the mechanism used here to achieve the treatment.  It may be a policy mechanism that is useful under limited circumstances to achieve changes in educational setting. But the “voucher” is NOT the treatment.

And the use of vouchers in this narrow context may have few if any policy implications for new voucher programs in Indiana or Louisiana if they do not result in low income minority students being better integrated with higher income peers predisposed to have college aspirations.

The sector of schooling is most certainly not the treatment either – public or private school – catholic or non-religious school. While the sector of schooling is a variable in this analysis, it also may or may not have anything to do with the characteristics of the educational setting that most influenced college going behaviors. There’s a whole separate body of literature on that topic that is notably absent in this report. And it’s quite possible that we could find a policy mechanism which has nothing to do with either vouchers or private, or catholic schools which shifts more low income minority students into educational settings that promote college going behaviors.

So before we get in some huge tizzy about "vouchers" and "private school" effects, let's go back and define the treatment in this study for what it actually is, and then try to figure out what it means in terms of effective policies for increasing college attendance among low income and minority students.

Tangent: I found this paragraph particularly interesting:

The voucher offer also has a much larger impact than does exposure to a more effective teacher. Elementary school teachers who are one standard deviation more effective than the average teacher are estimated to lift their students’ probability of going to college by 0.49 percentage points at age 20, relative to a mean of 38 percent, an increment of 1.25 percent (Chetty et al. 2011b). If one extrapolates that finding (as the researchers do not) to three years of effective teaching, the impact is 3.75 percent. The impacts identified here for African American students—an increase of 24 percent—are many times as large.

Basically, what this paragraph/extrapolation boils down to is the distinct possibility that the variations in setting (largely peer group) achieved in this voucher study yield what appear to be stronger effects than the measured (really noisily measured – which may matter here) variations in teacher effect in the Chetty study. In other words, quite possibly… peer composition is actually the strongest in-school effect on long term student outcomes!?? [note that this is speculative, based on the somewhat questionable comparison made in the above paragraph].
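The extrapolation in that quoted paragraph is simple enough to redo directly (all numbers are taken from the quote; note the report rounds the one-year increment to 1.25%, a touch below the raw ratio):

```python
# Redoing the report's back-of-the-envelope teacher comparison,
# using the figures given in the quoted paragraph.
base_rate_pct = 38.0        # mean probability of attending college at 20
one_sd_teacher_pp = 0.49    # effect of a 1 SD more effective teacher (pp)
voucher_effect_pct = 24.0   # reported relative increase for black students

one_year_pct = one_sd_teacher_pp / base_rate_pct * 100  # ~1.29% relative
three_year_pct = 3 * one_year_pct                       # naive 3-year extrapolation
ratio = voucher_effect_pct / three_year_pct             # "many times as large"
print(round(one_year_pct, 2), round(three_year_pct, 2), round(ratio, 1))
```

Even with this crude extrapolation, the voucher estimate comes out roughly six times the three-year teacher effect – which is exactly why the peer-composition caveat matters so much.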

Second Tangent: Quite honestly, the cost comparison comments in the paragraph below are so shoddy and poorly documented that they do much to undermine the report, and should be cut.

These impacts are somewhat larger than the long-term impacts of the much more costly class-size intervention in Tennessee. Dynarski et al. (2011) estimate that being assigned to a smaller class in the early elementary grades increased college enrollment rates among African Americans by 19 percent (5.8 percentage points on a base of 31 percent). Reduction of class size in Tennessee was estimated to cost $12,000 per student (Dynarski et al. 2011), whereas the social cost of the SCSF intervention was about $4,200 per student to the foundation and reduced costs to the taxpayer by reducing the number of students who would require instruction within the public sector. If the government had paid for the voucher, the expenditure could have taken the form of a simple transfer from the public sector to the private sector, which in the long run need not add to the per-pupil cost of education. In fact, it could decrease costs because Catholic schools spend less on average than public schools. Around the time of the SCSF evaluation, New York City public schools spent more than $5,000 per student, as compared to $2,400 at Catholic schools (Howell and Peterson 2006, 92).

First, this paragraph and the sources it cites provide little if any solid evidence regarding "costs" and "expenditures," citing only that the Archdiocese of NY at the time said so (p. ii) regarding Catholic school costs being about $2,400 per pupil (and ballparking out of nowhere the $5k public district figure). This paragraph also compares estimated expenditures on one strategy (class size reduction) in one context (TN) and point in time to partial subsidies for another strategy in another context at another point in time. A while back, I criticized another report, also by Matt Chingos (nothin' personal, I generally like his work), in which he referred to Class Size Reduction as the "Most Expensive School Reform" without legitimately comparing the costs of CSR to any other reform. The above paragraph is strikingly similar in its gaping holes of logic and evidence. I could go on and on. The authors also make no attempt to provide reasonable assumptions & estimates for the full cost of operating and scaling up a large scale voucher system. As such, this stuff really has no place in this paper. For a more thorough discussion/analysis of public/private school spending, see: http://nepc.colorado.edu/publication/private-schooling-US

Since the authors didn’t actually conduct any real analysis of schooling resources/finances, they really shouldn’t have gone there in their conclusions. This kind of back of the napkin, half-baked cost savings assertion really cheapens a study that does have some interesting findings to offer.

Parsing Poverty: Charter Market Segmentation across & Within U.S. Cities

Late Thursday, I posted a follow up on the distribution of children with disabilities by disability classification across charter and district schools in New Jersey and Pennsylvania. This post explores the distribution of children who qualify for free lunch in charter schools and district schools within the city limits of major cities. Note that the unit of analysis – the charter market, per se – that I am using here is the "city limits," and all schools – charter and district – that lie within specific large central cities (urban centric locale code 11). Why does this matter? And why do I do it this way? One reason is data convenience. Another is that it's important to recognize that many U.S. cities are carved into multiple school districts, oftentimes relating to a long history of housing and school segregation. I'll provide some Kansas City examples below. In other words, the traditional district system already creates some artificial boundaries of segregation, onto which the charter system is now being superimposed.

Again, in this post, I’m focusing on low income children. When it comes to low income children, Charter schools in many settings do tend to serve smaller shares of the lowest income children, leaving larger shares behind in district schools. Here, my interest is in evaluating major urban centers across the country to determine the extent to which these patterns are systematic. And also, the extent to which these patterns vary by charter market share. I believe I’ve mentioned on previous posts that it seems most likely that charter schools can maintain a cream-skimmed population when charters generally have smaller market share. That is, when there’s enough cream to go around.

So, here goes… using data from the 2009 Common Core of Data from NCES (there were more gaps in free lunch and enrollment data in 2010).

Visualizing Market Segmentation

This first figure conveys my strategy for my first cut comparisons of charter market segregation by student low income status. Imagine for any given city, there are a certain number of total enrolled students and a certain number of low income enrolled students. If charter schools in the aggregate in that city are serving an equitable share of low income children, the share of low income children served should match the overall market share. That is, if charter schools serve 10% of all children enrolled in schools in the city, then charter schools should also be serving 10% of the low income children. One can graph cities accordingly.

When charters – or the citywide charter sector as a whole – are operating at parity, they will fall along the red dotted diagonal line. But when charters are serving a lower share of low income children, they will fall below the line. When they are serving a higher share than the city schools (which may include multiple districts/entities), they will fall above the line.
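That parity check reduces to two ratios computed from school-level counts; here's a minimal sketch with made-up numbers for a single hypothetical city:

```python
import pandas as pd

# Toy school-level data for one hypothetical city (made-up counts).
schools = pd.DataFrame({
    "charter":    [1, 1, 0, 0, 0],
    "enrollment": [300, 200, 900, 800, 800],
    "free_lunch": [150, 80, 600, 560, 560],
})

# Charter share of all enrolled students vs. charter share of all
# free-lunch students in the city.
charter_share = (schools.loc[schools.charter == 1, "enrollment"].sum()
                 / schools["enrollment"].sum())
fl_share = (schools.loc[schools.charter == 1, "free_lunch"].sum()
            / schools["free_lunch"].sum())

# At parity the two shares match (the diagonal in the figures); here
# the city plots below the diagonal, since fl_share < charter_share.
print(charter_share, fl_share, fl_share < charter_share)
```

In this toy city, charters enroll about 17% of all students but only about 12% of free-lunch students, so the city would plot below the parity line.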

Major U.S. Markets

Here, I review the charter market share and free lunch shares for small market share cities, medium market share cities, and large market share cities.

An important clarification is in order with regard to certain cities like New York and Boston. In the NCES Common Core of Data, these cities are actually broken up into many separate cities. Boston is broken up into Boston (central city), Dorchester, East Boston, etc. New York is broken up into Manhattan (New York), the Bronx, Brooklyn and Queens as separate cities (essentially neighborhoods). As such, they each appear as smaller markets than one might expect.

Small Market Share

This first figure shows markets with relatively small charter market share – less than 5% of children enrolled in charters. Circle size indicates the aggregate enrollment across charter and district schools – or market size. In most of these markets – including the Bronx, New York (Manhattan) and Philadelphia – the charter free lunch share is below parity. This means that these charters are leaving behind more lower income kids than they are taking in.

Medium Market Share

In lower middle-market cities, things seem to even out a bit, but many are still below the parity line, including Newark, which I have pointed out on many occasions. San Antonio charters have a higher free lunch share than the districts serving San Antonio.

Large Market Share

In larger market share cities there seems to be even greater parity, and in Minneapolis and Kansas City it would appear that the charters on average are serving more lower income children than the district schools within the city limits. But, some caveats are in order here…

Kansas City provides an interesting case where these descriptions are perhaps a bit deceiving. Kansas City – through strategic housing planning and school district boundary planning – was carefully carved into racially identifiable neighborhoods throughout the first half of the 20th century (and then some – actually as late as 2007). If we look at all zip codes within the broad city limits (below), we include several school districts. Among these, we include the central city district KCMSD, but we also include the predominantly white, higher income communities north of the river. Notably, as a function of the state's original charter statute, there are no charters north of the river (as charters were originally permitted only in KCMSD and St. Louis). So, this comparison needs some tweaking and more fine grained local analysis – like that which follows.

Finally, here’s a snapshot of the largest market share cities. Notably, even in Columbus Ohio and Washington DC, with relatively large total market size and large market share, charters seem to be under-serving the lowest income students. With such large market share, one can expect that this has a significant adverse effect on district schools.

Digging into the Cities

One can take the same approach and bring it down to the zip code level within cities. Now, because of data constraints, all of these analyses focus on enrollments of schools that happen to be located in a particular zip code. I'm not able to look, for example, at the zip codes from which the students actually come to these schools. So, one must assume there to be some fuzziness around the zip code boundaries – that is, some lower income students do travel outside of their zip code to attend charter schools.
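The same parity shares used in the citywide figures can be computed within zip codes; a sketch with made-up school records (hypothetical zips and counts):

```python
import pandas as pd

# Toy school records tagged by zip code (made-up numbers).
schools = pd.DataFrame({
    "zip":        ["19104", "19104", "19104", "19121", "19121"],
    "charter":    [1, 0, 0, 1, 0],
    "enrollment": [400, 600, 500, 250, 750],
    "free_lunch": [200, 450, 400, 200, 600],
})

# Aggregate charter and total counts within each zip, then compute the
# charter share of enrollment and of free-lunch students per zip.
agg = schools.assign(
    ch_enr=schools.enrollment * schools.charter,
    ch_fl=schools.free_lunch * schools.charter,
).groupby("zip").sum(numeric_only=True)

agg["charter_share"] = agg.ch_enr / agg.enrollment
agg["fl_share"] = agg.ch_fl / agg.free_lunch
print(agg[["charter_share", "fl_share"]])
```

In this toy example, zip 19121 sits at parity (both shares 25%), while 19104 falls below it – charters there enroll about 27% of students but only about 19% of free-lunch students.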

Los Angeles

Across all zip codes in Los Angeles, many operate at parity, but in some cases, charters are clearly underserving low income students at least compared to district schools in the same zip code.

Zooming in on zip codes with charter market share below 10%, we can see that there are actually quite a few where low income populations are underserved, and not so many where they are significantly overserved, per se.

Philadelphia

Philadelphia displays a similar pattern across all zip codes, with low income children significantly underserved in some markets including those with relatively high charter market share (over 40%).

Focusing on zip codes with smaller market share, we see that some particularly large markets have charter schools that are underserving low income children.

This next figure paints an alternative picture of charter market segmentation in Philadelphia. Here, I plot the more typical relationship between total zip code % free lunch and charter % free lunch. But I also include all of the zip codes that don't have any charters. AND, I plot as squares the overall market size (total enrollment) and as circles the charter market size (charter enrollment). Yeah… way too much for any one graph… but bear with me.

So, if there is a large square, with a small circle in it, that’s a high total enrollment zip code, with relatively low charter enrollment. If there’s a smaller square with large circle, that’s a smaller zip code market with a large charter enrollment. All of those squares along the bottom are zip codes with no charter market share at all. In Philadelphia, there exist several high poverty zip codes with no charters. And there are some large high poverty zip codes with large charter market shares – BUT… where those charters are underserving low income children.

Perhaps even more interesting but not surprising is that in this predominantly low income city, there are three relatively large markets, with very large charter market share, that are NOT zip codes with high low income share. That is, the charter sector has grown and taken large market share in what might generally be considered more advantaged zip codes.

Baltimore

Here’s Baltimore (I always get asked to include Baltimore… so here it is). And Baltimore’s low charter market share zip codes display what seems to be a typical pattern, with most having charter sectors that serve fewer than expected low income children (again, those qualified for free lunch).

Larger market share zip codes are more of a mixed bag in Baltimore, as one might expect.

As with Philadelphia, we see that some charters have established themselves in generally lower poverty zip codes, but in this case, the charters are serving a lower income population than the other schools around them. But in the higher poverty zip codes, charters tend to be serving lower poverty populations. And this includes several of the larger markets, including those with particularly large charter market share. Again, my policy concern here is the effect that this has on the district schools which must serve those not siphoned off into charters. And this does not include the special education or ELL sorting, which in most cases I've evaluated thus far tends to be more extreme.

Kansas City

I close this one out with Kansas City. Recall that it appeared that charter schools in Kansas City actually serve a larger share of low income kids than district schools in the city. But that finding was complicated by the fact that the city limits actually include what many might consider to be the equivalent of suburban districts (the ones presently fighting in court against accepting KCMSD students on interdistrict transfer – another story for another day). For the most part, across all KC zip codes that have charters, charter enrollment share and free lunch share are in line – but for 64113.

Citywide, charter market shares are concentrated in higher poverty zip codes, but for one, again, as a function of district organization and the original charter statute.

But even Kansas City has some issues regarding the extent to which charter schools actually reach the lowest income neighborhoods in the city. In fact, a really cool Kauffman Foundation (charter advocates, for the most part) report about a year ago pointed out how charters had largely established around the edges of the poorest neighborhoods but were not embedded in those communities. I discussed that report here: https://schoolfinance101.wordpress.com/2010/10/30/biddle-me-this-or-flunkout-nation/

The maps above more or less reinforce the Kauffman report findings. On the left are zip code free lunch shares, showing the core of higher poverty up through the center of the city (blue box). Charter schools are green, district schools black. Note that charters are mostly lined up along the western edge of the blue box, and to the north toward downtown. And on the right we see that charter market shares are largest to the west of the low income core, but not within it. This is one case, however, where I’d love to have the locations of residence for the students, to see how many cross over from the poorer zip codes to attend the charters. That said, the previous figures indicate that the low income concentrations in those charters resemble their surrounding schools in the same zip code – not the low income concentrations of the poorer zip codes to the east.

Closing Thoughts

So, this brings me back to the point I've been reiterating over and over these past several weeks. We need to figure out how all of this stuff fits together. From a policy perspective, we need to concern ourselves with the overgrowth of schools that do not serve representative populations, in part because of the effect those schools have on the others around them. Indeed, it may not be wise to force some of these schools to take on students that they simply aren't capable of handling. As I've said before, perhaps some of them do well with their select population because of their select population, and perhaps they've learned to work well with that population.

Allowing or even encouraging an unfettered parasitic decomposition of urban schooling is not a reasonable policy solution as it will ultimately harm large numbers of the lowest income, disabled and non-English speaking students disproportionately left behind. Some market segmentation is tolerable, and existed long before charter expansion became a primary cause.

Finally, there’s also that pesky concern I have about shifting larger and larger shares of children into schools where they and their parents may be unknowingly compromising constitutional and statutory rights. I am especially concerned given that charter school market shares tend to be largest and expanding most quickly in urban settings serving low income and minority children. As such, we are moving toward a system where low income and minority children are more likely than their white suburban peers to be attending a school that calls itself public, but one in which those students may be sacrificing constitutional and statutory rights they would otherwise have in a real public institution.

Again, this is not to say that charter operators are generally out to deprive kids of rights. But rather, that legal protections for these children are being quietly eroded by the emerging ambiguities of the “public until it’s legally convenient to argue otherwise” charter sector. We need to pay attention to this erosion of rights, and its disproportionate impact on low income and minority children.

The bright spot (he opined cynically) in the figures above is that in most charter markets – which do tend to be low income cities – the lowest income children are generally being under-served by these schools.

To the extent that we wish to make charters a significant player in the mix of providing “public” schooling, these issues must be resolved (and are resolved to a greater extent in some states than others).  If charters are going to be major players here, their various layers of governance must either be explicitly considered public (board members considered public officials) with all of the statutory and constitutional obligations (and protections) of public officials (open meetings, open records, bidding, etc.), or they must be bound by the same statutory and constitutional requirements (whether labeled as public officials or not). And it must be explicitly guaranteed in state charter laws that all statutory and constitutional rights pertaining to students and employees in traditional public/government (state actors) agencies also apply to charter schools. In other words, the rhetoric of “public” must be accompanied by the legal obligations of being “public.” Alternatively, if they’re simply NOT public, then just admit that fact (and dump the whole deceptive “public” charter label and call it a voucher system instead). In which case, a true public option must be made available to all children. Simply closing down all true public options – New Orleans style – for large portions of cities, leaving only pseudo-public and true private options is also likely to disparately deprive low income and minority students of constitutional and statutory rights.

More on this topic another day….

Parsing Charter School Disability Enrollments in PA and NJ

Here are a few quick figures that parse the disability classifications of children with disabilities served by charter schools in Pennsylvania and New Jersey.

Two previous posts set the stage for this comparison. In one, I explained how charter schools in the city of Newark, NJ – by taking on fewer low income students, far fewer LEP/ELL students, and very few children with disabilities other than the mildest/lowest cost classifications (specific learning disability and speech/language impairment) – are leaving behind a much higher need, higher cost population for the district schools to serve.

Effects of Charter Enrollment on District Enrollment in Newark:

https://schoolfinance101.wordpress.com/2012/08/06/effects-of-charter-enrollment-on-newark-district-enrollment/

In another post, I walked through the financial implications of Pennsylvania’s special education funding formula and specifically the charter school special education funding formula on districts where large shares of low need disability students are siphoned off by charters and where high need disability students are left behind to be served by districts with depleted resources.

The Commonwealth Triple-screw:

https://schoolfinance101.wordpress.com/2012/06/05/the-commonwealth-triple-screw-special-education-funding-charter-school-payments-in-pennsylvania/

In short, under the Pennsylvania charter school funding formula, for each child classified as having a disability and choosing to attend a charter school, the sending district must pay the “average special education expenditure” of the district – regardless of the actual IEP needs of that student. So, there’s a strong financial incentive to serve large numbers of low need special education students in PA charters. But this, of course, leaves a mess behind for local districts, who then have a far higher need special education population and have lost substantial shares of their available funding (due to a completely arbitrary and wrongheaded calculation of the sending tuition rate).
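The incentive is easy to see with stylized numbers (illustrative amounts, not actual PA figures):

```python
# Stylized version of the incentive described above (made-up amounts).
district_sped_spending = 30_000_000   # district's total special ed expenditure
district_sped_students = 2_000
avg_sped_tuition = district_sped_spending / district_sped_students  # 15,000

# The charter receives the flat district average per classified
# student, regardless of the student's actual IEP costs.
low_need_cost = 4_000    # e.g., speech/language services only
high_need_cost = 40_000  # e.g., intensive support services

margin_low = avg_sped_tuition - low_need_cost    # surplus on low need students
margin_high = avg_sped_tuition - high_need_cost  # loss on high need students
print(margin_low, margin_high)
```

Under these illustrative numbers, a charter clears $11,000 per low need student but would lose $25,000 per high need student – hence the sorting pattern described in this post.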

This post merely provides a few more comprehensive follow-up figures on the issue of higher versus lower need disability students and charter school enrollments.

First, in New Jersey, here’s the statewide breakout of charter special education enrollments and market shares based on data from 2010 (the same data used in the Newark post).

  • In short, charter schools in NJ serve about 1.7% of the student population.
  • They serve about 1.05% of the population of children with disabilities.
  • AND… they serve only about 0.23% of the population of children with disabilities other than Specific Learning Disability or Speech/Language Impairment!
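The market-share arithmetic behind these bullets is simply the ratio of charter enrollment to statewide enrollment within each group. A quick sketch with hypothetical counts (not the actual NJ data), chosen only to mirror the pattern above:

```python
# Hypothetical counts (in charters, statewide); the real figures come
# from the 2010 NJ enrollment data used in the Newark post.

def pct_share(charter_count, statewide_count):
    return 100 * charter_count / statewide_count

all_students        = (24_000, 1_400_000)
all_disabilities    = (2_300, 220_000)
severe_disabilities = (230, 100_000)   # disabilities other than SLD/SLI

print(round(pct_share(*all_students), 2))         # 1.71
print(round(pct_share(*all_disabilities), 2))     # 1.05
print(round(pct_share(*severe_disabilities), 2))  # 0.23
```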

That’s a big deal! It’s a big deal because this leaves behind significant numbers of high need disability children to be served by districts. And, to the extent that charter expansion follows the same trend, this will lead to even greater concentration of children with disabilities in general in district schools and children with more severe disabilities in particular.

Here’s the average disability classification profile for NJ public districts and for NJ charter schools.

Now consider Pennsylvania, where there exists a significant incentive for charter schools to boost their special education populations while avoiding children with more severe disabilities. Here are the counts for counties with at least 500 students in charter schools:

Here are the enrollment shares within counties:

And finally, here are the population shares served:

So, for example, in Philadelphia county, which is the city:

  • Charter schools serve 16.2% of the student population
  • Charter schools serve about 14.6% of the children with disabilities
  • BUT… charter schools serve only about 6.3% of children with disabilities other than SLD or SLI!

Even in those counties where charters serve a larger share of the county-wide total special education population, they only occasionally serve an equitable share of children with more severe disabilities (often in specialized schools).

In Delaware County, charters do serve a higher overall special education population share than districts in the county, but serve a much smaller share of non-SLI/SLD disabilities. And Chester-Upland in particular bears the fiscal brunt of this practice!

That said, PA charter schools are clearly serving more comparable aggregate shares of children with disabilities than NJ charter schools, and perhaps the financial incentive plays a role.

Again, a critical issue here is the nature of the population left behind in district schools.

These figures also dispel a common assertion of charter advocates/pundits who, when challenged as to why special education rates tend to be generally low in charters, often argue that it’s because the charters are implementing better early interventions and thus avoiding classifying children in marginal categories like “specific learning disability.” To begin with, there’s absolutely no evidence to support this claim. That aside, these figures show that, in fact, many charters do seem to have plenty of students in these marginal categories. What they don’t have is students in the more severe disability categories, such as mental retardation and traumatic brain injury, and it is certainly unlikely that charter school early interventions are what prevent children from later being classified into these categories.

Statewide, of 724 children with TBI, only 7 were in charters. Of 21,987 children classified with mental retardation, only 396 (1.8%) were in charters. But about 4.1% of all enrollments were in charters.

I’ll admit… I am losing my patience on some of these issues. Excuse me for a moment while I vent. I’m losing my patience in large part because of the ridiculous responses/reactions I get every time I simply post some data relating to either charter school enrollments or finances. I seem only to get a flood of ridiculous responses when I’m presenting information on charter schools. Not when I criticize value-added estimates, or point out misuse of SGPs. Pretty much exclusively when I present data on charter schools.

It’s time to cut the crap and start digging into what’s really going on here, and how to move toward a system that best serves all of the children rather than ignoring and brushing aside these issues and pushing forward with what appears to be an emerging parasitic model.

Let’s evaluate the incentives. And instead of protecting perverse, damaging financial incentives like those in PA simply because they drive more money to charters, let’s do the right thing. Hey, it may be the case that charter allocations are otherwise too low, but raising them for the wrong reasons, with the wrong mechanism and with bad incentives is still wrong, wrong and bad.

It may also be the case that the data we are using for making comparisons – total free and reduced-price lunch counts rather than parsed income categories, total special education rates instead of rates by classification – are encouraging charter operators to boost their enrollment subgroups by focusing on the margins. In which case, we need to make it absolutely clear, by increasing data reporting precision and availability, that serving kids just under the threshold (or in marginal categories) isn’t enough. More fine-grained comparisons are necessary!

I’ve said before that I don’t really believe that every school – every magnet school, every charter school, every traditional public school – can or should try to serve exactly the same population. I do believe there’s room for specialization in the system. I also believe that many charters that “succeed,” so to speak, do so because they’ve figured out how to serve their non-representative populations well. And many would likely fail miserably at trying to serve children with more severe disabilities (as many district schools have).

BUT… accepting that there’s room for some specialization within the system and some uneven distribution of students is a far cry from what is now emerging, as charter market shares increase significantly in some cities and in some zip codes. And that must stop!

Still Searching for Miracle Schools and Superguy: Updates on Houston and New York City

I was following a conversation on Twitter a short while back in which one student activist – Stephanie Rivera of Rutgers asked another – Alexis Morin from Students for Education Reform – why SFER chooses to focus almost exclusively on charter schools as beacons of “success” and thus a significant part of the “solutions” for urban education moving forward. Observing this interaction brought me back once again to the astounding gaps in logic which are so pervasive in the current reform rhetoric which seeks to find policy solutions almost exclusively in charter schools and in changing teacher compensation and dismissal policies.  The reformy solutions are pretty much a given regardless of the original question or what the analyses yield.

Too often, the following faulty reasoning is applied in search for solutions in the education reform debate:

  1. Scan the horizon for successful charter schools (even though charters are no more successful, on average, than their more dominant counterparts – district schools). [Charter schools, on average, are average. Some are above average and others, well, not so much. Because charters are still much smaller in total numbers, if one were simply to look for “good” schools, the odds of picking a charter would in fact be smaller.]
  2. Assume that better-than-average charter schools – successful ones – are better than traditional public schools. [ignoring that, by virtue of being the upper half of a similar distribution, they are really no different from the upper half of traditional public schools].
  3. Assume that because the better-than-average charter schools are better than the average traditional public school, that being a charter school – bearing the label/classification “charter” – has something to do with it. [even though a comparable – or even larger – share of other schools bearing the label “charter” are actually doing worse than the average traditional public school].
  4. Assume that bearing the label – “charter” – necessarily means that these schools have and use creatively and inventively the substantially greater autonomy granted to them. [That is, they certainly don’t waste their time on stuff like spending more money, providing smaller class sizes and paying teachers more].

I have seen this utterly ridiculous stream of contorted logic rolled out on numerous occasions in the past few years, and even in the past few weeks.

A while back, I posted two separate entries called “Searching for Superguy” – one for New York City and one for New Jersey – in order to display the distribution of performance, corrected for demographics, for New York City and New Jersey charter schools.

Since that time, I’ve compiled quite a bit more data on charter (and other) schools in a variety of settings. I’ve also developed a clearer vision of exactly what constitutes one of those “miracle” schools we’re all searching for. A miracle school is characterized by at least the following four factors:

  1. Serves the same kids (poverty, language proficiency, disability)
  2. Spends less than other schools serving similar kids
  3. Has high average outcomes compared to schools serving similar kids
  4. Achieves better value-added on measured student outcomes than other schools

For today’s post, I offer you a tour of charter schools in New York City and in Houston, Texas – two cities with significant concentrations of charter schools and significant numbers of charter schools affiliated with major charter management organizations.

A Tour of Houston and New York City Charter Schools

Serve the Same Kids?

The first question, of course, is: do charter schools in these cities serve the “same kids” as traditional public schools in the same city/borough and at the same grade level? To answer this question, I fit regression models to three years of data (2008 to 2010) where the population characteristic of interest is the dependent variable, and a) year of data, b) location of school (city) and c) grade level/range are the independent variables. Houston is treated as a single city (uh… because it is) and New York is carved into boroughs for this analysis. That is, schools are compared with same grade level schools in their borough. Charter schools are grouped by CMO, with schools not belonging to major CMOs lumped together for this analysis (they are indeed a very heterogeneous group). The analysis is weighted by the enrollment of the schools.
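For readers who want the mechanics, here is a rough sketch of an enrollment-weighted model of this kind, using numpy and entirely fabricated data. The variable names, dummy coding and software choices are my own illustration, not necessarily those used for the figures:

```python
import numpy as np

# Toy data: one row per school-year; all columns are fabricated.
rng = np.random.default_rng(0)
n = 200
year = rng.choice([2008, 2009, 2010], n)
boro = rng.choice(["Bronx", "Brooklyn", "Manhattan"], n)
grade = rng.choice(["elem", "middle"], n)
enroll = rng.integers(200, 1200, n).astype(float)
pct_free_lunch = rng.uniform(20, 95, n)  # the population characteristic

def dummies(x):
    """One-hot encode a categorical array, dropping the first level."""
    levels = np.unique(x)
    return np.column_stack([(x == lv).astype(float) for lv in levels[1:]])

# Design matrix: intercept + year, borough and grade-level dummies.
X = np.column_stack([np.ones(n), dummies(year), dummies(boro), dummies(grade)])
y = pct_free_lunch

# Weighted least squares, weighting each school by enrollment:
# solve (X' W X) b = X' W y
W = enroll
beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * y))
resid = y - X @ beta  # each school's deviation from its year/borough/grade peers
```

In an analysis like this, a charter network’s “deficit” in, say, LEP/ELL enrollment would show up by adding network dummies to X; the residuals above play that role for individual schools.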

Figure 1 shows that in New York City, charter schools serve a) far fewer children who qualify for free lunch, b) far fewer LEP/ELL children and c) far fewer children with disabilities than schools serving the same grade level in their borough. Uncommon Schools are indeed the least common. But Success Academies have particularly large deficits in LEP/ELL children. These schools simply aren’t comparable in terms of student populations.

Figure 1. New York City Demographics

Figure 2 shows the Houston schools by CMO. With my present data I was unable to parse free lunch from free or reduced-price lunch, which may be important. But this figure and some of my previous analyses confirm that, at least relative to NYC charters, Houston charters are doing a better job of serving low income kids. However, most charter CMOs still seem to have a significant aversion to children with disabilities and, in most cases, also to children who are LEP/ELL.

 

Figure 2. Houston Demographics

Spend Less?

The spending analysis is conducted similarly, using 3 years of data and a regression model where spending per pupil is the dependent variable and student characteristics, grade level and year are the independent variables. This analysis plays off our recent NEPC report, using the same data (expanded to include all NYC charters, and cleaned and merged with additional measures) and the same methods. I have stayed with my original expenditure measure for BOE schools (defended here) and have used Annual Financial Report (not IRS 990) data for charters.

Figure 3 shows that in New York City, charters are generally outspending traditional public schools serving similar student populations. KIPP and Uncommon Schools are outspending BOE schools by over $4,000 and $3,500 respectively, and Harlem Children’s Zone schools are spending similarly (and that’s not even counting all of the additional money flowing to/through the parent organization – it’s just the annual financial report data!).

Figure 3. New York City Spending

Figure 4 shows more of a mixed bag in Houston. Non-major-CMO charters spend less than district schools. KIPP’s elementary schools spend less, but not consistently/significantly so (some do). Cosmos/Harmony schools also spend less, but through different means. Others are indistinguishable, with some network schools spending more and others less (some Yes Prep schools outspend district schools serving similar students).

Figure 4. Houston Spending

Have High Average Outcomes?

The following several graphs explore the distribution of spending and outcomes for individual schools. Note that these graphs don’t include all of the other stuff that might need to be included to parse whether spending differences actually help produce outcome differences. That’s not the point here. Rather, these are just descriptive graphs of the relative spending and outcomes of these schools – focusing on schools serving middle grades.

Figure 5 shows that each of the charters spends quite a bit more than otherwise similar district schools. Each charter also has higher average performance (except St. Hope) than district schools. But, as shown above, they also have less needy students.  In other words, no miracles by these measures. Higher average outcomes yes. Lower spending? NO. Same kids? NO!

Figure 5. New York City Average Outcomes

Figure 6 shows the average spending and outcomes for schools in Houston. Among KIPPs, all spend much more than district schools and two have higher than average outcomes and the other two have average outcomes. The Yes Prep school in the sample spends more and has higher than average outcomes. Meanwhile, some other charters spend less and do less well, and one spends less and has somewhat higher than average outcomes. Notably, however, the population characteristics of these schools were also mixed. It may be the case that these KIPP schools have more needy populations than the average district school, and on average are doing average to better. That’s not bad. However, we must acknowledge they are doing this at a much higher price! Similar kids? Mixed (more low income, fewer ELL or special ed). Better outcomes? Also mixed, but okay. Less money? No, actually more… a lot more.

Figure 6. Houston Average Outcomes (standardized)

Have Greater Gains?

Okay, here’s one last shot. Let’s look at achievement gains instead of just level of performance. I’ve constructed school level average gains for NYC schools by aggregating the teacher value added data for teachers in each school (weighted for the number of students who contribute to their scores in English Language Arts or Math). In other words, in this analysis a school is only as good as its teachers (consistent with reformy wisdom) and, for that matter, the children served by (linked in data records to) those teachers.
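That aggregation step can be sketched as a simple student-weighted average of teacher estimates (the names and numbers below are hypothetical; only the weighting scheme is from the description above):

```python
# School value-added as the student-weighted mean of its teachers'
# value-added estimates.

def school_value_added(teachers):
    """teachers: list of (value_added, n_students_linked) pairs."""
    total = sum(n for _, n in teachers)
    return sum(va * n for va, n in teachers) / total

# A hypothetical school with three linked teachers:
school_va = school_value_added([(0.20, 60), (-0.10, 30), (0.05, 90)])
# (0.20*60 - 0.10*30 + 0.05*90) / 180 = 13.5 / 180 = 0.075
```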

Figure 7 again shows that in New York City, charters tend to significantly outspend district schools with similar populations – well, except Equality Charter, which is somewhat closer. On average, the gains are indeed higher in these higher spending charters – actually moving upward in sort of a pattern. But remember, the peer groups in these schools also aren’t particularly comparable. KIPP AMP and Brooklyn Prospect, however, don’t do so hot. But if there’s any case to be made here with these charters, resources just might matter. Not the same kids. More money. Some reasonable outcomes.

Clearly, some deeper investigation is warranted. But, in each case there are also district schools, including lower spending district schools that outperform most of the charter schools.

Figure 7. New York City Value-Added

Finally, we’ve got Figure 8, showing the distribution of school level value-added ratings from the Texas FAST system. Here, the KIPPs in particular are more of a mixed bag. Some have higher and others lower value-added. Note that in Texas these “progress” metrics seem to be associated with student characteristics. All of the KIPP schools and the Yes Prep school spend more than district schools. Clearly, some deeper investigation is warranted. But in each case there are also district schools, including lower spending district schools, that outperform all of the charter schools.

 

Figure 8. Houston Value-Added

As a bonus, I also have this slide on class sizes (8th grade math) for NYC schools serving 8th grade. I’ve related class size to spending here, showing that each of these higher spending charter chains (except for Democracy Prep) seems to be leveraging at least some of that funding to provide much smaller class sizes.

Bonus Slide: Class Size and Relative Spending in NYC

Closing Thoughts

In closing, we all really need to step away from the misguided logic I laid out at the beginning of this post – that “charter” in and of itself is meaningful, and that the only answers for the future of education must be found among successful charter schools – especially miracle charters under the watchful eye of Superguy. If there’s anything we know by now about Superguy, it’s that he’s got some pretty nice financial backing!

Superschools & Miracle-Guy… uh… wait… Superguy and Miracle Schools, if they do exist, are a rarity. Further, if they do exist, they are as likely if not more likely (by sheer numbers) to be traditional public schools and not charter schools. But regardless of the governance of the schools, when looking for ideas (not “solutions”) for how to improve educational opportunities, we should be focused on the following:

  1. Attempting to learn what we can from all types of schools, and not predetermining that we should look at successful “charter” schools. This is especially true since charters are proportionately no more successful than other types of schools and, given that they are smaller in total numbers, they are, in total, less numerous among all successful schools.
  2. Being really, really careful about parsing out reasons for success before declaring schools to be miracles (and we should avoid the whole notion of “miracle schools”). We must look closely at population characteristics and population dynamics (mobility/attrition/neighborhood changes).
  3. When we do find schools – charter or other – that appear to be making unexpected gains, we should explore what it is that they are doing. We should explore their resource use. We should explore the strategies they employ and we should figure out not just what it is costing them to adopt these approaches but what it would cost to adopt these approaches in other settings and more broadly.

Finally, publicness and true public access matter. While we may analyze and compare schools thoughtfully regardless of their governance, I would now argue that when we consider the policy path forward we should give serious consideration to that governance.

All else equal, I am increasingly uneasy with the notion of creating larger numbers and shares of schools that are LIMITED PUBLIC ACCESS as I described in a previous post on charter schools. Intended or not, this is what has become of large segments of charter schooling.

I am concerned with the effect of expanded limited public access schools on those truly public schools around them. Intentional or not, this is how charter expansion seems to be playing out.

I am equally if not more concerned with the idea of shifting larger shares of children into schools where those children and their parents may forgo their constitutional and statutory protections, except where explicitly laid out in state charter statutes. Intended or not in the letter of state charter statutes, it is the charter operators themselves who invariably invoke their “private” status when defending the stifling of teachers’ free expression, or teachers’ claims seeking damages (under federal law) for mistreatment by a state actor, or students/parents claims regarding strict enforcement of discipline codes.

The maintenance of constitutional protections and true public access – non-exclusion – MUST play a significant role in the determination of the path forward. We should not be too quick to trade away constitutional protections of employee or student free speech, privacy rights, protections from unreasonable searches, and various statutory rights for a few additional points on state assessments or for a few dollars cut from school spending.

That said, the figures laid out above suggest that we likely aren’t even getting systematic or sizable bang for the buck when/if we do trade these constitutional protections. So perhaps that point is moot.

Related Resources

Baker, B.D., Libby, K., & Wiley, K. (2012). Spending by the Major Charter Management Organizations: Comparing charter school and local public district financial resources in New York, Ohio, and Texas. Boulder, CO: National Education Policy Center. Retrieved [date] from http://nepc.colorado.edu/publication/spending-major-charter.

(Follow up: https://schoolfinance101.wordpress.com/2012/05/07/no-excuses-really-another-look-at-our-nepc-charter-spending-figures/)

Baker, B.D. & Ferris, R. (2011). Adding Up the Spending: Fiscal Disparities and Philanthropy among New York City Charter Schools. Boulder, CO: National Education Policy Center, 33. Retrieved April 24, 2012, from http://nepc.colorado.edu/publication/NYC-charter-disparities.

The Gulen Charter School Teacher Supply Problem

There’s been some increased interest in recent months in what are often referred to as Gulen charter schools, or those schools affiliated with Fethullah Gulen. I’ve tried to stay off of this topic for the most part because I don’t like to write about “conspiracy theories” or even potentially inflammatory religious/cultural issues – at least on this blog. Here are a few recent video clips/news stories:

From Ohio: http://www.youtube.com/watch?v=4qDbELO12uo

From 60 Minutes: http://www.youtube.com/watch?v=O4OtHpUCqy0&feature=related

New York Times article on Texas Gulen Schools: http://www.nytimes.com/2011/06/07/education/07charter.html

There are also a handful of websites that provide additional highly critical information on these schools.

What has intrigued me when I’ve watched these news clips and when I’ve read other news stories, is that when these schools’ leaders are challenged as to why they hire so many teachers on visas from Turkey, their standard response is that there just aren’t enough qualified applicants for their schools from U.S. resident citizens.

Yes, teacher supply can be an issue, especially in math and science. And economic research on the topic suggests that wages – especially the competitiveness of wages with other career alternatives – may play a role. See this report for a related analysis of teacher wages in the State of Washington (and the relative competitiveness of science/math teacher wages).

Now, I’ve been conducting several analyses of teacher salary structure over time, trying to see how charter schools pay their teachers compared to other charter schools and public districts. It’s really important to understand that wages, wage growth expectations and job security expectations all may have significant influence on the supply of quality applicants for teaching positions.

It strikes me, after looking at salary structure data on Gulen schools that therein lies the problem.

Check this out. First, there’s the graph I made for our recent report on charter school expenditures. This graph appears in an appendix to the report, and represents exploratory analysis of what’s behind some of the spending differences between charter schools and between charters and public districts.

For this graph and a following graph on NJ Gulen schools, I use teacher level data from multiple years to estimate a model of teacher salaries as a function of experience and degree level. Then I project out the predicted salary for teachers at each experience level holding degree levels constant. This gives me a picture of how teacher wages compare between schools for comparably educated teachers and at different experience levels.
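Here is a rough sketch of that two-step procedure (fit, then predict along an experience grid while holding degree constant), using numpy and fabricated salaries; the squared-experience term is my own simplification of “a function of experience and degree level,” not necessarily the specification behind the figures:

```python
import numpy as np

# Fabricated teacher-level data.
rng = np.random.default_rng(1)
n = 300
exper = rng.integers(0, 25, n).astype(float)
has_ma = rng.integers(0, 2, n).astype(float)  # 1 = holds a master's degree
salary = (42_000 + 1_800 * exper - 25 * exper**2
          + 3_000 * has_ma + rng.normal(0, 1_500, n))

# Step 1: OLS fit of salary ~ 1 + experience + experience^2 + MA dummy.
X = np.column_stack([np.ones(n), exper, exper**2, has_ma])
beta, *_ = np.linalg.lstsq(X, salary, rcond=None)

# Step 2: predicted salary across experience, holding degree at BA (=0).
grid = np.arange(0, 26, dtype=float)
pred = beta[0] + beta[1] * grid + beta[2] * grid**2
```

Plotting `pred` against `grid` for each school or network yields comparison curves like those in the figures below.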

Harmony (Cosmos/Gulen) schools in Texas are relatively low spending schools and have particularly low labor expenses. Notably, this network of Texas charter schools is large enough to drag down average spending and average labor costs for charters statewide.

In the Houston area in particular, not only do the Gulen schools pay very low starting salaries, but salaries don’t appear to grow over the first few years of experience. Notably, the Harmony/Cosmos/Gulen schools really don’t have any teachers with more than a few years of experience. Now, this could be in part because no one would really want to stick around if there’s no outlook for wage growth over time, or because no one who would have intended to stick around ever applied to begin with, leading the schools to make extensive use of temporary imported staff.

Figure 1. Houston Area Wages for Charter & District Schools

Source: http://nepc.colorado.edu/publication/spending-major-charter

Figures 2 through 5 show the average school level wages for teachers in Texas district and charter schools in Houston and Austin. Notably, Harmony schools have very low average experience levels and also have very low average salaries – low even given their low average experience levels. Is it any wonder they suffer a teacher supply problem? Especially with a curricular emphasis on math and science, and especially in tech-heavy urban centers?

Figure 2. Houston at all experience levels

Figure 3. Houston for teachers w/less than 5 years

Figure 4. Austin at all experience levels

Figure 5. Austin for teachers w/less than 5 years

Now, this salary structure anomaly for Gulen affiliated schools in Texas really isn’t just a Texas thing. That’s what struck me, and eventually led me to write this post – which at this point is still incomplete. Here’s what Gulen salaries look like in New Jersey, when compared with other charter schools and when compared with three major urban districts. Now, New Jersey’s urban districts have a quirky salary structure that I could quite honestly do without. As described by one NJ charter school leader, the urban districts in NJ often have a “hockey stick” salary schedule that stays relatively flat for the first several years/steps and then jumps way up around the 13th year. That actually permits some charter schools to gain a recruitment/retention edge by scaling up salaries more quickly on the front end. Notably, these charter schools to the best of my knowledge are not recruiting large shares of temporary staff from foreign countries!

But the Gulen affiliated schools – in this case Paterson Science and Technology and Central Jersey College Prep – have a compensation strategy strikingly similar to that of the Harmony schools in Texas, and quite different from other major charter schools (notably, there are other minor charter schools that pay quite poorly, similar to the Gulen schools at the front end, but with more growth in pay for accumulated experience). Again, one might expect these LOW and FLAT salaries to be a major barrier to generating a supply of high quality domestic applicants.

Figure 6. New Jersey Charter and District Salaries

Data Source: Based on a regression model estimated on salary data from the annual NJ fall staffing file. Salaries estimated as a function of a) total experience, b) degree level, c) year (3 years of data, 2008 to 2010) and d) FTE status.

In a sense, these Gulen salary structures and claims of insufficient teacher supply especially in math and science may be providing us with some insights as to what happens when we choose to pay teachers so poorly and when we strip them of any expectation of increased wages with experience. Maybe they do really have a domestic teacher supply problem. But their solution to that problem is not a scalable solution for American public schooling at large (cheap imported and temporary labor).

Quite honestly, any school that persists in offering this low a wage with no growth over time, while complaining about lack of supply of worthy U.S. applicants really isn’t even trying! Clearly, they are operating exactly how they want to operate – and have little if any interest in attracting the best and brightest science and math graduates from the U.S.

While it may be the case that some of these schools are producing reasonable average outcomes – and doing so at substantially reduced labor costs – this is clearly a model with serious limitations to its scalability. Further, there exist significant concerns that much of the apparent “high” performance in these schools is a function of student selection & attrition. (see also: http://www.texastribune.org/texas-education/texas-education-agency/what-drives-high-achievement-at-harmony-charters-/)

Just pondering. More to come, no doubt. Cheers!

For more on salary competitiveness and teacher quality/supply, see: http://www.shankerinstitute.org/images/doesmoneymatter_final.pdf

Note: There is also some evidence, like this: http://www.charterschoolwatchdog.com/tuzuk—a-contract-to-steal.html which suggests that for Turkish teachers, Gulen schools receive a sizable kickback on the salaries and levy numerous fees against those salaries. That would, of course, generate a substantial amount of money for the Gulen organization. My Texas teacher level data set has over 800 teachers in Harmony schools in 2009-10, with cumulative reported base salaries near $30 million. KIPP, Yes Prep and IDEA each have fewer than half that number of teachers. I’d appreciate any documentation readers might have regarding current contractual agreements, fees, etc. for non-U.S. employees of these schools. Thanks!

Friday Finance 101: On Parfaits & Property Taxes

Public preference for property taxes stands in perfect inverse relation to the public taste for parfaits. Everybody loves parfaits[i] and everybody hates property taxes.[ii] No, I don’t plan to spend this blog post bashing parfaits. I do like a good parfait. But, even more blasphemous, I intend to shed light on some of the virtues of much maligned property taxes.

I often hear school funding equity advocates argue that if we could only get rid of property taxes as a basis for funding public schools, we could dramatically improve funding equity. The solution, from their standpoint, is to fund schools entirely from state general funds – based on rationally designed state school finance formulas – where state general fund revenues are derived primarily from income and sales taxes. In theory, if the state controls the distribution of all resources to schools and none are raised locally through property taxes, the system can be made much fairer, even more progressive with respect to student needs and cost variation (as I discuss in this post).

Property Taxes are Less Volatile and their Decline (when/if there is one) Lags

The most recent cries that the property tax is problematic came about because property tax revenues in some locations continue to decline marginally even as state revenues are rebounding! Damn those property tax revenues! Why don’t they rebound too?

As reported in The Atlantic, the question was: why are we still firing teachers even as the economy is rebounding (albeit slowly)? The answer- declining home values and thus declining revenues from taxes on those home values. Mind you, residential properties are only a share of taxable property values, and a much smaller share in some districts/locations than others. But more on that later.

The article in The Atlantic pointed to U.S. Census data on tax collections, which are also summarized in recent reports from the Rockefeller Institute (www.rockinst.org), one of my most trusted sources for information on state and local tax policy. Consistently good stuff!  Figure 1 shows the fluctuations in state and local tax revenue sources over time. And yes, Figure 1 does show that property tax revenues continue to decline modestly. But Figure 1 also shows that property tax revenues a) have much more consistently stayed in the positive zone since the early 1980s and b) have generally been much less volatile than income or sales tax revenues. Look at the most recent two downturns, in 2002-03 and 2009 to present.

Personal income tax revenues are the most volatile, especially in states where larger shares of income are non-wage income.  You certainly don’t want all your eggs in that basket. And it was, in fact, state budgets that got slammed hardest, especially in states most reliant on the personal income tax. Now, this is not a case for eliminating the personal income tax. What goes down also goes up. And the personal income tax can be structured progressively, and revenues from it can be distributed to improve equity. In education – or really any public service financed by a mix of income, sales and property taxes – property taxes serve as a buffer to the system – a stabilizer. While that buffer is not fairly distributed – wealthy communities having greater ability to buffer their aid losses than poorer ones – states could responsibly redistribute remaining state aid to those who need it most, or at least levy cuts in a pattern least harmful to the most needy, accounting for those with the strongest buffer (which is not to say that states do, but rather that they could).
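For readers who like the portfolio analogy made concrete, here’s a toy calculation. The volatility figures below are made up for illustration – they are not the Rockefeller Institute’s estimates – and the sketch assumes the revenue streams move independently, which they don’t fully. But the arithmetic shows why blending in a low-volatility source damps swings in the total:

```python
import math

# Hypothetical annual growth-rate volatilities (standard deviations)
# for each revenue source -- illustrative numbers only.
volatility = {"income": 0.08, "sales": 0.05, "property": 0.02}

def portfolio_volatility(shares, vols):
    """Std. dev. of total revenue growth, assuming independent sources."""
    return math.sqrt(sum((shares[s] * vols[s]) ** 2 for s in shares))

# A mix that keeps property taxes vs. one that drops them entirely
# and leans harder on income and sales taxes.
balanced = {"income": 0.35, "sales": 0.30, "property": 0.35}
no_property = {"income": 0.55, "sales": 0.45, "property": 0.00}

print(round(portfolio_volatility(balanced, volatility), 4))
print(round(portfolio_volatility(no_property, volatility), 4))
```

Under these (hypothetical) numbers, the all-income-and-sales mix swings roughly half again as hard as the mix that keeps property taxes in the portfolio.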

Figure 1. Rockefeller Institute Analysis of Revenue Volatility

 http://www.rockinst.org/pdf/government_finance/2012-07-16-Recession_Local_%20Property_Tax.pdf

The other point The Atlantic latched onto is that property tax revenue decline, if and when it happens, occurs on a lag: property tax revenues decline after other revenues have started going back up. So yes, while it does suck that property tax revenues haven’t rebounded right away, it sucks a lot less than if property tax revenues had tanked to the same degree and at the same time as sales and income tax revenues. Further, the blow of declining property tax revenues could now be softened with the political will to tap rebounding incomes and sales. (yes… could…)

Having these revenue cycles be, well… off cycle with one another is helpful… or at least less harmful.

In short, for all their other faults, property taxes help balance the revenue portfolio for public services. They are the stable, safer investment in that portfolio. Shifting too dramatically away from property taxes places a greater burden on the state to provide additional state aid for property-wealthy and property-poor districts alike. And we know who’s likely to win out in that tug of war.

Overall Funding Equity is NOT a Function of Whether the State Pays a Larger Share

Now here’s the real kicker. While school funding equity advocates like to believe that fully state funded systems would somehow be magically more equitable because state share has increased, it is not generally the case that states where school districts are more reliant on local revenue have less equitable school funding systems. In fact, as we found in our school funding fairness report, there exists no relationship at all between overall school funding fairness and the percent of money that comes from state versus local resources. Figure 2 illustrates this stunning lack of relationship.

Figure 2. Relationship between State Share and Funding Fairness

http://schoolfundingfairness.org/downloads_popup.htm# (2010, p. 35)

How can that be? Why, if more money comes from the state, wouldn’t that improve funding equity? Clearly the ability to raise funds from local property taxes is inequitable, right? Thus, state aid is needed to counterbalance that inequity. More state aid, more equity! Voilà.

But alas, as I’ve discussed many times previously on this blog, state aid formulas are the output of an ugly political process – and that’s just how it has to be. I’d love to substitute my infinite wisdom for the political meat-grinder – but that would be about as arrogant and disrespectful to our American political system as… well…  executive waivers from Federal statutory obligations. In fact, it would be equivalent to trying to use executive or regulatory (departmental) authority to rewrite components of a state school finance system that are spelled out in statute. Nah… that would be just wrong!

Sometimes, as in Figure 3, the political meat-grinder, under some influence of that other branch of state government – the judicial branch – produces a state school finance system that, while heavily reliant on local property taxes, still manages to achieve pretty darn good relative equity. See Figure 3 – New Jersey.

Figure 3. Equitable State with Heavy Property Tax Reliance

But other times, even when a state school finance system is heavily state funded, with relatively small local share, that system still ends up incredibly inequitable and regressive with respect to student needs. See Figure 4 – North Carolina.

Figure 4. Inequitable State with Low Property Tax Reliance

 

And yes, sometimes, you do have a state that is very heavily reliant on local property taxes and has a very inequitable distribution of resources, like New York! But hey, at least it’s a stable inequity over time, right? Nothin’ like dramatic disparities that stand the test of time!

Figure 5. Inequitable State with Higher Property Tax Reliance

 

And every state’s a bit different from every other – creating its own brand of state endorsed inequities – seemingly regardless of how dependent, or not, that state system is on local property taxes.

General State Equalization Aid IS Property Tax Relief Aid to those who need it most!

Now, it’s not entirely the fault of the property tax that we have these disparities in states like North Carolina and New York. Clearly something else is going on. I won’t go too deep on that here, because I’ve got a really fun paper coming out this September in which I go painfully deep on this topic. The paper will be rolled out in a public event in DC – more on that later. But clearly, if states more reliant on property taxes are not generally less equitable, there’s other stuff going on. Hey, just look at all of that state aid in North Carolina going to the lowest poverty districts, even when higher poverty districts could likely use a bit more. And what about the aid going to the wealthiest New York districts, a topic I’ve written about many times here? How does state aid get so screwed up as to not help?

One common argument among state legislators, especially those from property-wealthy communities, is that their communities need property tax relief. It’s easy to hold up the tax bill on a $2 million home in New York or New Jersey and get an eye-popping response. What? You mean you pay over $30,000 a year in property tax? Clearly you need tax relief! Uh… but wait: despite the eye-popping tax bill, the effective tax rate on that house might just be lower than the effective tax rate on the $200,000 home in Newark, NJ or Utica, NY! The tax bill is high because of the value of the house, not because of some unfair tax policy or aid distribution scheme.

The bottom line is that general state aid to schools – the equity-enhancing aid – is already designed to promote tax equity. State aid to schools and property tax relief are, to a large extent, flip sides of the same coin. When a community gets more in aid, it needs to raise less locally to achieve the desired quality of service. If it gets less in state aid, it needs to raise more. Communities with greater capacity to raise more should, in turn, get less.  There’s no reason to then turn around and say that those with greater capacity to raise more all of a sudden need a break… and need additional aid… that could otherwise have gone to those with less capacity. Such an argument presumes that the state has already over-corrected tax inequities between rich and poor communities. Highly unlikely! Even New Jersey, which has corrected more than many by providing aggressively targeted state aid, hasn’t gone that far. See this post!  Effective tax rates remain lower in NJ districts with higher property values (at the peak of aid targeting).
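The flip-side logic can be made concrete with a back-of-the-envelope sketch. All the numbers below are hypothetical: fix a per-pupil spending target, subtract state aid, and the remainder is the local levy, which translates into a tax rate against the local property base.

```python
def required_local_rate(target_per_pupil, aid_per_pupil, value_per_pupil):
    """Tax rate a district must levy on its property base to reach a
    per-pupil spending target after state aid is counted."""
    levy = max(target_per_pupil - aid_per_pupil, 0)
    return levy / value_per_pupil

TARGET = 15_000  # hypothetical adequate spending level per pupil

# Low-wealth district: small tax base, heavily aided.
poor = required_local_rate(TARGET, aid_per_pupil=10_000, value_per_pupil=250_000)
# High-wealth district: large tax base, lightly aided.
rich = required_local_rate(TARGET, aid_per_pupil=2_000, value_per_pupil=1_300_000)

print(f"low-wealth rate: {poor:.2%}, high-wealth rate: {rich:.2%}")
```

Notice that even with five times the aid, the low-wealth district in this sketch still has to levy double the effective rate – which is exactly the persistent pattern in the New Jersey data noted above.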

New York certainly hasn’t “over-corrected” the property tax burden across districts, warranting the counterbalancing distribution of un-equalization aid. Figure 6 shows the relationship between taxable property values per pupil and local effort rates. Local effort rates remain systematically higher in lower-wealth communities.

Figure 6. Insufficient General Aid and Persistent Tax Inequities

 

But, as I’ve discussed on a number of occasions on this blog previously, New York still goes out of its way to operate a completely separate property tax relief subsidy program, which, on average, spends more state resources each year to buy down property taxes in richer rather than poorer communities. See Figure 7.

Figure 7. Un-equalizing Distribution of Unnecessary Aid

 

So the point of this seemingly tangential portion of this post is that states like New York, and others, find ways to consume state resources in ways that make school funding less equitable. It’s not entirely the property tax that’s at fault here. Rather, it’s the use of state resources to buy down property taxes in wealthy communities – actually encouraging even more spending in communities that already have greater capacity and are exerting less effort. Go figure.

And that’s ONE OF MANY REASONS why simply increasing the level of support coming from the state doesn’t always improve school funding equity. Heck, North Carolina barely even tries to equalize (adjust) general school aid for differences in local capacity. You get more or less the same state aid per pupil no matter how wealthy or poor your district. Thus, whatever disparities exist in local revenue simply carry through once state aid is layered on top.
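The North Carolina pattern described here reduces to a few lines of arithmetic (hypothetical numbers): flat per-pupil aid raises everyone’s total but leaves the dollar gap between districts untouched.

```python
# Hypothetical per-pupil local revenue for a wealthy and a poor district.
local = {"wealthy": 8_000, "poor": 3_000}
FLAT_AID = 6_000  # flat per-pupil state aid, regardless of local wealth

# Flat aid shifts every district's total up by the same amount...
total = {district: revenue + FLAT_AID for district, revenue in local.items()}

# ...so the dollar disparity carries through unchanged.
gap_local = local["wealthy"] - local["poor"]
gap_total = total["wealthy"] - total["poor"]
print(gap_local, gap_total)
```

Only aid that varies inversely with local capacity can shrink that gap; flat aid merely preserves it at a higher spending level.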

Closing Thoughts

There are lots of ways to make property taxes “better” and “fairer.” But even in their current form, property taxes play an important role in stabilizing the revenue on which our public schooling system operates.  Further, overemphasis on the classic, savage inequalities of American public schooling that emerge from the inequitable mess that is property taxation may distract from the reality that state school finance systems often make things worse rather than better, replacing savage inequalities with stealth inequalities.

The solution is not to get rid of property taxes, but to integrate them wisely into state school finance systems: use other state revenues to better achieve overall funding equity by aggressively targeting those revenues, count on property taxes as a portfolio stabilizer, and, to the extent possible, seek ways both to improve equity with property taxes and to improve the equity of property taxation itself.

 


[i] Donkey (2001) Shrek.  

[ii] http://businessweekly.readingeagle.com/?p=2860 (okay, this is an indirect cite to the Tax Foundation, which isn’t really the most credible source on Tax Policy. See: https://schoolfinance101.wordpress.com/2010/03/17/just-the-facts-nj-taxes-teacher-salaries-and-spending-fluff/)

Poverty Counts & School Funding in New Jersey

NJ Spotlight today posted a story on upcoming Task Force deliberations and public hearings over whether the state should continue to target funding in its school finance formula to local districts on the basis of counts of children qualifying for free or reduced-price lunch – that is, kids from families at or below 185% of the federal poverty income threshold.

The basic assumption behind targeting additional resources to higher poverty schools and districts is that high need districts can leverage the additional resources to implement strategies that help to improve various outcomes for children at risk. I have discussed this issue at length in this related post.  New Jersey has done this better than most states over time. (evidence on outcomes here)

The idea is to find the indicator or measure that seems to best capture the likelihood that children will struggle in school – that they will enter kindergarten less prepared and have access to fewer out of school resources during their time in school (including limited summer learning opportunities). A variety of socioeconomic indicators might be considered. But often, the information that happens to be most available is counts of kids who are from low income families, as identified through the National School Lunch Program income criteria.  And, as a measure of convenience, it tends to work quite well. I compare this measure below with Census poverty measures, based on children in families living in a certain area (within school district boundaries) who fall below the much lower income threshold of 100%, which has some advantages but also some major shortcomings.

Of course, in the political context, this is really all about finding ways to deliver more aid to districts whose representatives/political leaders wield the most power in the political debate. That’s just the nature of the beast – the politics of school finance. Sometimes it goes well for the kids who need it most… other times, not so much.  I watched this play out in Kansas a few years back, and have seen similar conversations occur across other states.

Typically, the whole thing plays out according to the following politically motivated steps:

  1. Manufacture some scandalous but largely irrelevant, anecdotal manifesto about how local district officials are egregiously mislabeling children as low income in order to hoard and misappropriate obscene sums of state aid.
  2. Manufacture other claims that poverty really doesn’t matter anyway and certainly these poverty measures have little or nothing to do with determining whether children are likely to do well in schools.
  3. Assign a task force composed mainly of lay people with little or no expertise in education policy, finance or specifically the measurement of poverty, to swallow whole the manufactured evidence and generate politically convenient policy recommendations.

As I mentioned, Kansas went through this process while I lived there – establishing an “At Risk Council” – and now New Jersey is headed down a similar road. In Kansas, the political strategy of using the Task Force to reduce poverty-based funding and drive more to the suburbs was thwarted by the assignment of a knowledgeable individual to head the task force – or At Risk Council – former Commissioner Andy Tompkins. In the end, Tompkins and the Kansas At Risk Council concluded:

The Council continues to believe that the best state proxy for identifying at-risk students is poverty, whether that be measured by free or free and reduced price lunches.

Darn them. Blasphemy! Amazingly, Andy Tompkins was not exiled from Kansas for his leadership in this matter, and he remains one of the kindest, most thoughtful individuals with whom I’ve ever interacted on state education policy issues!

Report here: LEG At-Risk Council Report SFFF

Of course, Kansas legislators still found additional clever ways to shift money away from higher need and toward lower need districts.

In any case, even though no one asked me… nor do I really want to be asked to participate in such a charade, here are the questions and considerations that should guide the choice of measures for determining state aid distribution.

Two Key Questions:

First, the questions… and the data on New Jersey schools and districts.

Is the Poverty Measure Correlated with Other Poverty Measures?

It is indeed desirable to find some measure on which to base funding allocations that can’t be gamed or manipulated by those who stand to receive the additional funding. But that’s not always feasible (or cost-effective). And, even if a count method does involve local district officials gathering data, it can still be checked/audited (in a more thorough and responsible way than checking a smattering of individual families’ forms for those who fall closest to the income threshold, while necessarily ignoring those who fall just the other side of the threshold but didn’t file).

One reasonable way to evaluate district collected data on children qualifying for free or reduced lunch is to evaluate the relationship between the free/reduced lunch concentrations and census poverty estimates based on resident populations. Here are three versions of that comparison:

Figure 1. Relationship between Census Poverty 2010 and District Free/Reduced Lunch 2011

In this first figure, we see that Census poverty rates tend to range from 0 to about 45%, while free/reduced rates – which count children in families under a much higher income threshold – range up to about 100%. In fact, as I’ve noticed in many analyses, the free/reduced lunch data tend to get messy above 80%, suggesting that this is the range within which local administrators may be maxing out their ability to get parents to comply & file paperwork. Here, we see that even though poverty rates keep climbing, free/reduced rates seem to level off. Arguably, if anything is going on here, it’s that very high poverty districts like Camden and Trenton – which fall “below the curve” – are under-reporting their free/reduced rates, with some possibility of marginal over-reporting in Elizabeth.

Overall, however, census poverty explains nearly 90% of the variation in free/reduced rates. In other words, free/reduced lunch makes a pretty darn good proxy.

In this second figure, I’ve tried to better tease out the districts that may be under- or over-reporting by cleaning up that non-linear relationship and expressing both measures in natural logarithm form. Here, we see that the relationship remains very strong and still slightly curved. If there were districts substantially over-reporting free/reduced lunch, they would pop above the outer/upper edge of the curve. There’s not much of that going on. On the other hand, there are a number of districts that are relatively low in poverty but report disproportionately low free/reduced lunch rates – that is, under-reporting.

Figure 2. Logged Relationship (natural log) between Census Poverty and Free/Reduced Lunch

In general, these figures show that free/reduced lunch rates are a pretty darn good proxy for district poverty rates. And at least this analysis here doesn’t indicate substantial, systematic (beyond predicted, based on resident child poverty rates) mis-classification.
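For anyone who wants to replicate this kind of proxy check against their own state’s data, the core calculation is just a simple R² between the two measures. The district rows below are hypothetical, not the actual New Jersey data:

```python
def r_squared(x, y):
    """Share of variance in y explained by a simple linear fit on x
    (the squared Pearson correlation)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy ** 2 / (sxx * syy)

# Hypothetical district rows: (census poverty rate, free/reduced lunch rate).
districts = [(0.05, 0.12), (0.10, 0.28), (0.18, 0.45),
             (0.25, 0.60), (0.35, 0.78), (0.42, 0.82)]
poverty = [d[0] for d in districts]
frl = [d[1] for d in districts]

print(round(r_squared(poverty, frl), 3))
```

An R² near 0.9, as in the actual New Jersey comparison above, is what tells you the convenient measure is tracking the more rigorous one closely; large residuals flag the districts worth auditing.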

Is the Poverty Measure Correlated with Student Outcomes?

The “big question” is which version of the measure better captures differences in student outcomes – or predicts educational disadvantage.  This is straightforward enough to check as well. The first figure here shows the relationship between free/reduced lunch rates and proficiency rates on state assessments in 2011.

Now, I know, we’ve been told that this relationship doesn’t really exist. There are lots of schools that flat out buck this trend, right? So much so that it’s not even a trend, right? In fact, we’ve even been fed a totally absurd graph which purports to validate that free/reduced lunch really doesn’t relate to performance.  Oh wait… and we’ve been fed even more ridiculous graphs to reinforce this point!

Setting aside all of that stuff, Figure 3 shows that % free/reduced lunch alone explains about 81% of the variation in proficiency rates across districts.  So, it’s a pretty reasonable proxy of educational disadvantage.

Figure 3. Free/Reduced Lunch & Proficiency in 2011

Now, I do have some concerns about the extent to which this relationship erodes at and approaching free/reduced rates above 80%. Is it really that Camden and Trenton perform that poorly compared to Union and Elizabeth, despite serving (by the free/reduced measure) even less poor populations? Or might the story be more complex than that? Figure 4, which shows the relationship between Census poverty and proficiency, sheds some additional light on this issue.

Figure 4. Census Poverty and Proficiency

Figure 4 suggests that Camden and Trenton are actually a) higher poverty than Elizabeth (and Camden higher than Union) and b) perform more or less where they are expected to [somewhat below… as opposed to well below]. This is an interesting contrast that adds some support to my speculation above that these very high poverty cities may in fact be understating their poverty rates in their free/reduced lunch data. Indeed, there may be some overstating in Union and Elizabeth, but neither “popped” substantially above the curve in the previous charts.

Census poverty rates, while capturing a unique story of difference between Camden and Trenton vs. Union and Elizabeth, do slightly less well at explaining variations in proficiency rates, making the free/reduced count preferable in this regard.

Additional Policy Considerations:

Given all of this, there are a few additional considerations when pondering which measure to actually use in state school finance policy.

More Stringent Count Methods require Larger Weights

First, if we choose to use a more stringent income threshold for poverty, like the census poverty measure, we would need to assign the appropriate weight to drive the appropriate amount of funding to high need districts. Simply changing our method of counting kids in poverty doesn’t change the needs of Camden or Trenton. It merely recasts those needs with an alternative measure. More stringent measures require larger weights, an issue that has been explored empirically.

The same applies to the choice of using free lunch (130% income threshold) as opposed to free or reduced-price lunch. Using free lunch only might permit better differentiation among high-poverty districts, but a higher weight would then be required to drive sufficient funds to those districts.
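The threshold-versus-weight tradeoff is easy to illustrate with a back-of-the-envelope calculation (the pupil counts, base amount, and funding target below are all hypothetical): hold constant the total need-based funding you want to deliver, and the required weight rises as the count of identified pupils shrinks.

```python
def required_weight(target_dollars, counted_pupils, base_per_pupil):
    """Weight on the base per-pupil amount needed to drive target_dollars
    through the pupils identified by a given poverty measure."""
    return target_dollars / (counted_pupils * base_per_pupil)

BASE = 10_000          # hypothetical base per-pupil amount
TARGET = 50_000_000    # hypothetical total need-based funding for a district

# Broader measures flag more kids; stricter ones flag fewer.
print(required_weight(TARGET, 10_000, BASE))  # free/reduced (185% threshold)
print(required_weight(TARGET, 7_000, BASE))   # free lunch only (130%)
print(required_weight(TARGET, 4_000, BASE))   # census poverty (100%)
```

Same district, same need, same dollars – but the stricter the count, the larger the weight has to be. Swapping the measure without recalibrating the weight simply cuts the money.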

Problems with Residential/Geography Based Measures in New Jersey

Census poverty measures are limited in their usefulness in the current New Jersey policy context, because they are based on location of residence and linked to geographic boundaries of school districts. New Jersey has significant numbers of non-unified, regional secondary school districts for which poverty estimates may be imprecise or inaccurate.

Further expansion of charter schools and inter-district choice programs complicates use of measures based on place of residence. Funding to schools must be sensitive to the demographics of students enrolled in those schools.  It would be entirely inappropriate, for example, to require a sending district like Newark or Camden to pay charter or other district tuition on the basis of their own average resident poverty rate if the charter school or receiving district is not taking a comparable share of children in poverty. This is certainly the case in Newark.

As a result, free or free and reduced price lunch measures remain preferable.

So, that’s my 2 cents (okay, more like a few dollars worth) of advice on this issue.

More on this later, no doubt, when the Task Force releases its final recommendations.

Related paper on poverty measurement.

http://aefpweb.org/sites/default/files/webform/VOL%20I-POVERTY%20REPORT-METHODOLOGY%202011-21-12%20CLEAN.docx