Stretching Truth, Not Dollars?

This week, Mike Petrilli (Thomas B. Fordham Institute) and Marguerite Roza (Gates Foundation) released a “policy brief” identifying 15 ways to “stretch” the school dollar. Presumably, what Petrilli and Roza mean by stretching the school dollar is finding ways to cut spending while either not harming educational outcomes or actually improving them. With that goal in mind, it’s pretty darn hard to see how any of the 15 proposals would lead to progress toward it.

The new policy brief reads like School Finance Reform in a Can. I’ve written previously about what I called Off-the-Shelf school finance reforms, which are quick and easy – generally ineffective and meaningless, or potentially damaging – revenue-neutral school finance fixes. In this new brief, Petrilli and Roza have pulled out all the stops. They’ve produced a list that could easily have been generated by a random search engine scouring “reformy” think tank websites, excluding any ideas actually supported by the research literature.

The policy brief includes some introductory ramblings about district-level practices for “stretching” the school dollar, but it focuses on state policies that can assist in that stretching at the state level and give local districts greater options to do the same. I will focus my efforts on the state policy list.

Here’s the state policy recommendation list:

1. End “last hired, first fired” practices.

2. Remove class-size mandates.

3. Eliminate mandatory salary schedules.

4. Eliminate state mandates regarding work rules and terms of employment.

5. Remove “seat time” requirements.

6. Merge categorical programs and ease onerous reporting requirements.

7. Create a rigorous teacher evaluation system.

8. Pool health-care benefits.

9. Tackle the fiscal viability of teacher pensions.

10. Move toward weighted student funding.

11. Eliminate excess spending on small schools and small districts.

12. Allocate spending for learning-disabled students as a percent of population.

13. Limit the length of time that students can be identified as English Language Learners.

14. Offer waivers of non-productive state requirements.

15. Create bankruptcy-like loan provisions.

This list can be lumped into four basic categories:

A) Regurgitation of “reformy” ideology for which there exists absolutely no evidence that the “reforms” in question lead to any improvement in schooling efficiency. That is, no evidence that these reforms either “cut costs” (reduce spending without reducing outcomes) or improve outcomes for the same spending.

  1. Creating a rigorous evaluation system
  2. Ending “last hired, first fired” practices
  3. Moving toward weighted student funding

B) Relatively common “money saving” ideas, backed by little or no actual cost-benefit analysis – the kind of stuff you’d be likely to read in a personal finance column in a magazine in a dentist’s office.

  1. Pool health-care benefits.
  2. Create bankruptcy-like loan provisions. (???)
  3. Tackle pensions
  4. Cut spending on small districts and schools (consolidate?)

C) Reducing expenditures on children with special needs by pretending they don’t exist.

  1. Allocate spending for learning-disabled students as a percent of population.
  2. Limit the length of time that students can be identified as English Language Learners.

D) Un-regulation

  1. Eliminate class-size limits
  2. Provide waivers for ineffective mandates
  3. Eliminate seat time requirements
  4. Merge categorical programs
  5. Eliminate work rules
  6. Eliminate mandatory salary schedules

So, let’s walk through a few of these in greater detail. Let’s address whether there is any evidence whatsoever that these policies a) would actually lead to reduced short run costs while not harming, or even improving outcomes, or b) are for any other reason a good idea.

Creating an Evaluation System

This likely requires significant up-front spending: heavy front-end investment to design the system and put it into place. Yes, increased, not decreased spending. And in the short term, while money is tight. AND, there is little or no evidence that what is being recommended – a Tennessee or Colorado-style teacher evaluation model (50% on value-added scores) – would actually reduce spending and/or improve outcomes. Rather, I could make a strong case that such a model will lead to exorbitant legal fees for the foreseeable future (I have a forthcoming law review article on this topic). The likelihood of achieving long-run benefits from these short-run expenses is questionable at best. In fact, the likelihood of significant harm seems equal if not greater (see my previous post on this topic: value-added teacher evaluation).

Ending “Last Hired, First Fired” layoff policies

In very crude terms, this approach might simply allow a district – or entire state – to lay off senior, higher-salary teachers. Yeah… that could reduce the payroll. Good policy? Really questionable! Of course, Petrilli and Roza also argue that we simply shouldn’t be paying teachers for experience or degrees anyway. So I guess if we did that, we wouldn’t generate savings from this recommendation. Silly me. One or the other, I guess.

Now, we could generate performance increases (at lower spending, if we keep seniority pay, or at constant spending if we don’t) if, and only if, the future actually plays out as simulated in the various performance-based layoff simulations which I and others have recently discussed. The assumptions in these simulations are bold (unrealistic), and much of the logic circular.

And then there are those short-term legal costs of defending the racially disparate firings, and random error firings.

Eliminating Class Size Limits

Yes, larger classes require less spending – on a per pupil basis. Smaller classes have greater benefit (greater “bang for the buck” shall we so boldly say) in higher poverty settings. A labor market dynamic problem realized in the late 1990s, when CA implemented statewide class size reduction, was that the policy stretched the pool of highly qualified teachers and ultimately made it even harder for high poverty schools to get high quality teachers (a dreadfully oversimplified and disputable version of the story).

Removing class size limits might be reasonable if only affluent districts agreed to increase their class sizes, putting more “high quality” teachers into the available labor pool… who might then be recruited into high poverty districts (another dreadfully oversimplified, if not absurd scenario).  But who really thinks it will play out this way? We already know that affluent school districts a) have strong preferences for very small class sizes and b) have the resources to retain those small class sizes or reduce them further. See Money and the Market for High Quality Schooling.

Eliminating mandatory salary schedules

It seems that in this recommendation, Petrilli and Roza are arguing against state policies that mandate the adoption by local public school districts of specific step and lane salary schedules. They really only provide one brief paragraph with little or no explanation regarding what the heck they are talking about.

I’ve personally never been much of a fan of state rigidity regarding locally negotiated agreements – at least in terms of steps and lanes. Many problems can occur when states enact policies as rigid as those of Washington State, where teachers statewide are on a single salary schedule.

The best work on this topic (and I’ve worked on the same topic with Washington data) is by Lori Taylor of Texas A&M, who shows that the Washington single salary schedule leads to non-competitive wages for teachers in metro areas, and to non-competitive wages for teachers in math and science relative to other career opportunities in those areas. The statewide salary schedule in Washington is arguably too rigid. Here’s a link to Taylor’s study:

Taylor, L. (2008) Washington Wages: An Analysis of Educator and Comparable Non-educator Wages in the State of Washington. Washington State Institute for Public Policy.

But this does not mean, by any stretch of the imagination, that removing this requirement would save money, or “stretch” the education dollar. It might allow bargaining units in metro areas in Washington to scale up salaries over time as the economy improves. And it might lead to some creative differentiation across negotiated agreements, with districts trying to leverage different competitive advantages over one another for teacher recruitment.

But, these competitive behaviors among districts may also lead to ratcheting of teacher salaries across neighboring bargaining units, and may lead to increased salary expense with small marginal returns (as clusters of districts compete to pay more for an unchanging labor pool). For an analysis of this effect, see Mike Slagle’s work on spatial relationships in teacher salaries in Missouri. In short, Slagle finds that changes to neighboring district salary schedules are among the strongest predictors of an individual district’s salary schedule. Ratcheting upward of salaries in neighboring districts is likely to lead to adjustment by each neighboring district (to the extent resources are available). Ratcheting downward does not tend to occur (not reported in this article).

Slagle, M. (2010) A Comparison of Spatial Statistical Methods in a School Finance Policy Context. Journal of Education Finance 35 (3)

[note: this article is a shortened version of Mike’s dissertation. The article addresses only the ratcheting of per pupil spending, but the full dissertation also addresses teacher salaries]

In any case, we certainly have no evidence that removing state level requirements for mandatory salary schedules would save money while holding outcomes harmless – hence improving efficiency. Like I said, I’m not a big fan of such restrictions either, but I have no delusion that removing them will save any district a ton of money – or any for that matter.

This recommendation seems to also be tied up in the notion that we shouldn’t be paying teachers for experience or degree levels anyway. Therefore, mandating as much would clearly be foolish. I’ve addressed this idea previously in The Research Question that Wasn’t Asked.

In addition, this recommendation seems to adopt the absurd assumption that we could immediately just pay every teacher in the current system the bachelor’s degree base salary (okay, the salary of a teacher with 3 years and a bachelor’s degree, where marginal test-score returns to experience fade) – that we could immediately recapture all of the salary money dumped into differentiation by experience or degree, and that we could reap massive savings with absolutely no harm to the quality of schooling, or to the quality of the teacher labor force, in the short run or the long term. Again, that’s the research question that was never asked. Previous estimates of all of the money wasted on the master’s degree salary “bump” are actually this crude.

For similarly absurd analysis by Marguerite Roza regarding teacher pay, see my previous post on “inventing research findings.”

Move toward Weighted Student Funding

Petrilli and Roza also advocate moving to Weighted Student Funding. They seem to argue that the “big” savings here will come from the ability of states and school districts to immediately take back funding as student enrollments decline. That is, a district in a state, or school in a district gets a certain amount per kid. If they lose the kid, they lose the money. This keeps us from wasting a whole lot of money on kids who aren’t there anymore.

Okay… Now… most state aid is allocated on a per pupil basis to begin with. And, in general, as enrollments fluctuate, state aid fluctuates. Lose a kid. Lose the state aid that is driven by that kid. Some states have recognized that the costs of providing education don’t actually decline linearly (or increase linearly) with changes in enrollment and have included safety valves to slow the rate of aid loss as enrollments decline. Such policies are reasonable.

Petrilli and Roza seem to be belligerently and ignorantly declaring that there is simply never a legitimate reason for a funding formula to include small school district or declining enrollment provisions. I have testified in court as an expert against such provisions when those provisions are completely “out of whack”, but would never say they are entirely unwarranted. That’s just foolish, and ignorant.

Local revenues in many states (and in many districts within states) still make up a large share of public school funding, and local revenues are typically derived from property taxes applied to the total taxable property wealth of the school district. As kids come and go, local revenues do not come and go. If a tax levy of X% on the district’s assessed property values raises $8,000 per pupil, and enrollment then declines while the total assessed value stays constant, the same tax raises more per pupil – perhaps $8,100. The district would lose state funding because it has fewer pupils (and perhaps also because it can generate a larger local share per pupil). But that’s really nothing new.
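To make that local-revenue arithmetic concrete, here’s a minimal sketch. The tax base, levy rate, and enrollments below are invented for illustration, not drawn from any actual district:

```python
# A fixed levy on a fixed tax base yields MORE per pupil as enrollment falls.
# All numbers here are hypothetical.
total_assessed_value = 800_000_000  # district tax base, held constant
levy_rate = 0.01                    # 1% levy
local_revenue = total_assessed_value * levy_rate  # $8,000,000 either way

for enrollment in (1000, 988):      # before and after losing a few pupils
    per_pupil = local_revenue / enrollment
    print(f"{enrollment} pupils -> ${per_pupil:,.0f} per pupil locally")
    # 1000 pupils -> $8,000 per pupil locally
    #  988 pupils -> $8,097 per pupil locally
```

The levy dollars don’t shrink when the kids leave; only the denominator changes.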

There’s really no new “huge” savings to be had here.

UNLESS:

a) we are talking about kids moving to charter schools from the traditional public schools, and for each kid who moves to a charter school, we either require the district to pass along the local property tax share of funding associated with that child (Many states), or reduce state aid by the equivalent amount (Missouri).

b) there exists a property tax revenue limit tied specifically to the number of pupils served in the district (as in Wisconsin and other states) which then means that the district would have to reduce its local property taxes to generate only the per pupil revenue allowed. That’s not savings. It’s a state enforced local tax cut.

So then, why do Petrilli and Roza care about Weighted Student Funding as an option? The above two “Unless” scenarios are possible suspects. Blind reformy punditry regardless of logic is equally possible (WSF is cool… reformy… who cares what it does?).

It’s not really about “saving” money at all. Rather, it’s about creating mechanisms to enable local property tax revenues to be diverted in support of charter schools (even if the local taxpayers did not approve the charter), or to have local budgets forcibly reduced/capped when students opt-in to voucher programs (Milwaukee).

And this isn’t really a “weighted student funding” issue at all. In many states, it already works this way (WSF or not). Big savings? Perhaps an opportunity to reduce the state subsidy to charter schools by requiring greater local pass through – in those states where this doesn’t already occur. But these provisions face significant legal battles in some states. If a state is not already doing this, this policy change would also likely lead to significant up front legal expenses.

In fact, I can’t imagine a circumstance where adopting weighted student funding can be expected to either save money or improve outcomes for the same money. There’s simply no proof to this effect. Sadly, while it would seem, at the very least, that adopting weighted funding might improve the transparency and equity of funding across schools or districts, that’s not necessarily the case either.

My own research finds that districts adopting weighted funding formulas have not necessarily done any better than districts using other budgeting methods when it comes to targeting financial resources on the basis of student needs. See: http://epaa.asu.edu/ojs/index.php/epaa/article/view/5

Petrilli and Roza’s Weighted Funding recommendation for “stretching” the dollar is strange at best. As a recommendation to state policymakers, adoption of weighted funding provides few options for “stretching” the dollar, but may provide a mechanism for diverting districts’ local revenues to support choice programs (potentially reducing state support for those programs).

As a recommendation to local school district officials, adoption of weighted funding really provides no options for “stretching” the dollar, and may, in fact, increase centralized bureaucracy required to develop and manage the complex system of decentralized budgeting that accompanies WSF (see: http://epx.sagepub.com/content/23/1/66.short)

So,

No savings?

No improvements to equity?

No evidence of improved efficiency?

What then, does WSF have to do with “stretching” the school dollar?

Baker, B.D., Elmer, D.R. (2009) The Politics of Off‐the‐Shelf School Finance Reform. Educational Policy 23 (1) 66‐105

Baker, B.D. (2009) Evaluating Marginal Costs with School Level Data: Implications for the Design of Weighted Student Allocation Formulas. Education Policy Analysis Archives 17 (3)

Savings from Small Districts and Schools

I am one who believes in creating savings through consolidation of unnecessarily small schools and school districts. And, at the school or district level, some sizeable savings can be achieved by reorganizing schools into more optimal size configurations (elementary schools of 300 to 500 students and high schools of 600 to 900, for example; see Andrews, Duncombe and Yinger).

For other research on the extent to which consolidation can help cut costs, see Does School District Consolidation Cut Costs, also by Bill Duncombe and John Yinger (the leading experts on this stuff).

Now, Petrilli and Roza, however, seem to imply that the savings from these consolidations or simply from starving the small schools and districts can perhaps help states to sustain the big districts – STRETCHING that small school dollar. Note that Petrilli and Roza ignore entirely the possibility that some of these small schools and districts (in states like Wyoming, western Kansas, Nebraska) might actually have no legitimate consolidation options. Kill them all! Get rid of those useless small schools and districts, I say!

Here’s the thing about de-funding small schools and districts to save big ones. The total amount of money often is not much… BECAUSE THEY ARE SMALL SCHOOLS!!!!! I learned this while working in Kansas, a state which arguably substantially oversubsidizes small rural school districts, creating significant inequities between those districts and some of the state’s large towns and cities with high concentrations of needy students. While the inequity can (and should) be reduced, the savings don’t go very far.

So, let’s say we have 6 school districts serving 100 kids each, and spending $16,000 per pupil to do so. Let’s say we can lump them all together and make them produce equal outcomes for only $10,000 per pupil. A bold, bold assumption. We just saved $6,000 per pupil (really unlikely), across 600 pupils. That’s not chump change… it’s $3,600,000 (okay… in most state budgets that is chump change).

So, now let’s take this savings, and give it to the rest of the kids in the state – oh – about 400,000. Well, we just got ourselves about $9 per pupil. Even if we try to save the mid-sized city district of 50,000 students down the road, it’s about $72 per pupil. That is something. And if we can achieve that, then fine. But slashing small districts and schools to save big, or even average ones, usually doesn’t get us very far. BECAUSE THEY ARE SMALL! GET IT! SMALL DISTRICTS WITH SMALL BUDGETS!
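The back-of-envelope arithmetic above can be written out as a quick sketch (all figures are the hypotheticals from the example, not real district data):

```python
# Consolidation savings, using the post's hypothetical numbers.
small_districts = 6
pupils_each = 100
spend_before = 16_000   # per pupil, in the small districts
spend_after = 10_000    # per pupil, under the (bold) consolidation assumption

total_savings = small_districts * pupils_each * (spend_before - spend_after)
print(total_savings)                    # 3600000 -- $3.6 million total

state_pupils = 400_000
city_pupils = 50_000
print(total_savings / state_pupils)     # 9.0  -- about $9 per pupil statewide
print(total_savings / city_pupils)      # 72.0 -- about $72 per pupil for one mid-sized city
```

Even under a generous savings assumption, spreading a small district’s budget across a big denominator yields pocket change per pupil.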

Similar issues apply to the elimination of very small schools in large urban districts. It’s an appropriate strategy – balancing and optimizing enrollment (reorganizing those too-small high schools created as a previous Gates-funded reform?). It should be done. But unless a district is a complete mess of tiny, poorly organized schools, the savings aren’t likely to go that far.

Let’s also remember that major reconfiguration of school level enrollments will require significant up front capital expense! Yep, here we are again with a significant increased expense in the short-term. Duncombe and Yinger discuss this in their work. Strangely, this slips right past Petrilli and Roza.

Use Census Based Funding for Special Education

So, what Petrilli and Roza are arguing here is that states could somehow save money by allocating their special education funding to school districts on an assumption that every school district has a constant share of its enrollment that qualifies for special education programs. Those districts that presently have more? Well, they’ve just been classifying every kid they can find so they can get that special education money. This flat-funding policy will bring them into line… and somehow “stretch” that dollar.

Let’s say we assume that every district has 16% (Pennsylvania) or 14.69% (New Jersey) of children qualifying for special education – some number, like these, that is about the current average special education population. Our goal is really to reduce the money flowing to those districts that have higher-than-average rates. Of course, if we pick the average, we’ll be reducing money to the districts with higher rates and increasing money to the districts with lower rates, and you know what – WE’LL SPEND ABOUT THE SAME IN SPECIAL EDUCATION AID. “Stretching” how?

And will we have accomplished anything close to logical? Let’s see, we will have slammed those districts that have been supposedly over-identifying kids for decades just to get more special ed aid. That, of course, must be good.

BUT, we will also be providing aid for 14.69% of kids to districts that have only 7% or 8% children with disabilities. Funding on a census basis or flat basis requires that we provide excess special education aid to many districts – unless we fund all districts as if they have the same proportion of special education kids as the district with the fewest special education kids. That is, simply cut special education aid to all districts except the one that currently receives the least.
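A toy example makes the point: fund every district at the average rate and total spending doesn’t move; only funding everyone at the minimum rate actually cuts it. The districts, enrollments, and identification rates below are invented for illustration:

```python
# Hypothetical districts; actual special education identification
# rates vary widely across districts.
enrollments = {"A": 10_000, "B": 10_000, "C": 10_000}
sped_rates  = {"A": 0.08,   "B": 0.15,   "C": 0.22}   # identified shares
aid_per_sped_pupil = 10_000

def total_aid(rate_for):
    """Sum aid across districts given a per-district funding rate."""
    return sum(enrollments[d] * rate_for(d) * aid_per_sped_pupil
               for d in enrollments)

actual = total_aid(lambda d: sped_rates[d])            # fund actual counts
avg_rate = sum(sped_rates.values()) / len(sped_rates)  # census at the average
census_at_avg = total_aid(lambda d: avg_rate)
census_at_min = total_aid(lambda d: min(sped_rates.values()))

# Funding at the average rate spends roughly the same total as funding
# actual counts; only funding everyone at the minimum rate cuts spending,
# while district C absorbs the shortfall and district A is overfunded.
print(actual, census_at_avg, census_at_min)
```

The “savings” only appear by underfunding the districts with the most identified students.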

How is that smart “stretching?”

The only way to “save” money with this recommendation is simply to “cut funding” and “cut services.” And, unless cut to the bare minimum, the “flat allocation” strategy requires choosing to “overfund” some districts while “underfunding” others. One might try to argue that this policy change would at least reduce further growth in special ed populations. But the article below suggests that this is not likely the case either. The resulting inequities significantly offset any potential benefits.

There exist a multitude of problems with flat, or census-based special education funding, which have led to declining numbers of states moving in this direction in recent years, New Jersey being an exception. I discuss this with co-authors Matt Ramsey and Preston Green in our forthcoming chapter on special education finance in the Handbook on Special Education Policy Research.

Of course, there also exists the demographic reality that children with disabilities are simply not distributed evenly across cities, towns and rural areas within states, leading to significant inequities when using Census Based funding. CB Funding is, in fact, the antithesis of Weighted Student Funding. How does one reconcile that?

For a recent article on the problems with the underlying assumptions of Census Based special education funding, see:

Baker, B.D., Ramsey, M.J. (2010) What we don’t know can’t hurt us? Evaluating the equity consequences of the assumption of uniform distribution of needs in Census Based special education funding. Journal of Education Finance 35 (3) 245‐275

Here’s a draft copy of our forthcoming book chapter on special education finance: SEF.Baker.Green.Ramsey.Final

Limit Time for ELL/LEP

This one is both absurd and obnoxious. Essentially, Petrilli and Roza argue that kids should be given a time limit to become English proficient and should not be provided supplemental programs or services – or at least the money for them – beyond that time frame. For example, a child might be funded for supplemental services for 2 years, and 2 years only. Some states have done this. Again, there is no clear basis for such cutoffs, nor is it clear how one would even establish the “right” time limit, or whether that time limit would somehow vary based on the level of language proficiency at the starting time.

Yes, this approach, like cutting special education funding, can be used to cut spending and reduce the quality of services. But that’s all it is. It’s not “stretching” any dollar.

Other Stuff

Now, the brief does list other state policy options as well as other district practices. Some of these are rather mundane, typical ideas for “cost saving.” But, of course, no evidence or citation of actual cost-effectiveness, cost-benefit, or cost-utility analysis is presented. Petrilli and Roza toss around ideas like a) pooling health care costs, b) redesigning sick leave policies or c) shifting health care costs to employees. These are the kind of things that are often on the table anyway.

I fail to see how this new policy brief provides any useful insights in this regard. Some actual cost-benefit analysis would be the way to go. As a guide for such analyses, I recommend Henry Levin and Patrick McEwan’s book on Cost Effectiveness Analysis in Education.

There are a handful of articles available on the topic of incentives associated with varied sick leave policies, including THIS ONE, School District Leave Policies, Teacher Absenteeism, and Student Achievement, by Ron Ehrenberg of Cornell (back in 1991).

One category I might have included above is that at least two of the recommendations embedded in the report argue for stretching the school dollar, so to speak, by effectively taxing school employees. That is, setting up a pension system that requires greater contributions from teacher salaries, and doing the same for health care costs. This is a tax – revenue generating (or at least a give-back). This is not stretching an existing dollar. This is requiring the public employees, rather than the broader pool of taxpayers (state and/or local), to pay the additional share. One could also classify it as a salary cut. But Petrilli and Roza have already proposed salary cuts in half of the other recommendations. Just say it. Hey… why not just take the “master’s bump” money and use that to pay for pensions and health care? No one will notice it’s even gone. We all know it was wasted and un-noticed to begin with.

I was particularly intrigued by the entirely reasonable point that school districts should NOT make harmful cuts by narrowing their curriculum. I was intrigued by this point because this is precisely what Marguerite Roza has been arguing that poor districts MUST do in order to achieve minimum standards within their existing budgets. I wrote about this issue previously HERE. It is an interesting, but welcome, about-face to see Roza no longer arguing that poor, resource-constrained school districts should dump all but the basics (while other districts, with more advantaged student populations and more adequate resources, need not do the same).

Utter lack of sources/evidence for any/all of this junk

Finally, I encourage you to explore the utter lack of support (or analysis) that the policy brief provides for any/all of its recommendations. It won’t take much time or effort. Read the footnotes. They are downright embarrassing, and in some cases infuriating. At the very least, they border on THINK TANKY MALPRACTICE.

There is a reference to the paper by Dan Goldhaber simulating seniority based layoffs, but that paper provides no analysis of cost/benefit, the central premise of the dollar stretching brief. The Petrilli/Roza (not Goldhaber) assumption is simply that the results will be good, and because we are firing more expensive teachers, it will cost less to get those good results.

The policy brief makes a reference to “typical teacher contracts” (FN2) regarding sick leave, with no citation… no supporting evidence, and phrased rather offensively (18 weeks a year off? For all teachers? Everywhere! OMG???)

FN2: Typical U.S. teacher contracts are for 36.5 weeks per year and include 2.5 weeks sick and personal days for a total work year of 34 weeks, or 18 weeks time off.

The brief refers to work by NCTQ (not the strongest “research” organization) for how to restructure teacher pay.

The report self-cites The Promise of Cafeteria Style Pay (by Roza, non-peer reviewed… schlock), and makes a bizarre generalized attack in footnote 5 that school districts uniformly defend the use of non-teaching staff as substitutes (no evidence/source provided).

FN5: Districts requiring non-teaching staff to serve as substitutes argue that it is good practice to have all staff in classrooms at least a few days a year.

The brief cites policy reports (and punditry) on pension gaps (including the Pew Center report), and those reports refer to alternative plans for closing gaps over time. These are important issues, but the question of how this “stretches” the school dollar is noticeably absent.

And that’s it. That’s the entire extent of “research” and “evidence” used to support this policy brief.

The Curious Duplicity of NCTQ

NCTQ fashions itself as a leading think tank on promoting teacher quality in K-12 education. NCTQ adopts a relatively extreme position that teacher quality is the one and only thing that matters! Teacher quality is THE determining factor of school quality.

I, too, believe that teacher quality is very important. And I agree with NCTQ on the point that content knowledge, especially at the middle and secondary levels, is particularly important and that simply being listed as “qualified” to teach specific content is no guarantee.

As part of their effort to improve teacher quality, NCTQ has been going around doing “studies” and applying ratings to the quality of teacher preparation institutions. Now, I noted in my previous post that NCTQ and others may actually be missing the boat on who is actually preparing teachers. But let’s set that aside for a moment. One would think that if NCTQ is so interested in teacher quality as the primary determinant of school quality and student success, and teacher expertise as an important part of that equation at higher grade levels, then any analysis of the quality of undergraduate or graduate programs to train teachers would have to place significant emphasis on faculty quality and expertise, right? It would make little sense to simply review which textbooks are used, or what the course descriptions say, or what the curricular sequence happens to be. Right?

Out of a multitude of indicators on teacher preparation institutions, NCTQ includes only 1 – yes 1 – regarding faculty quality, which is described as follows:

In our evaluation of programs, we examined teaching responsibilities for all faculty members, as indicated by course assignments in course schedules, excluding all clinical coursework. We looked for two specific examples of inappropriate assignments: 1) an instructor teaching across the areas of foundations of education, methods and educational psychology; and/or 2) an instructor who teaches both reading and mathematics methods courses. Other inappropriate assignments may well be made but were not included in our review.

http://www.nctq.org/edschoolreports/illinois/standards/26Methodology.jsp

Yep, that’s it. All that they address is whether a faculty member appears to teach across two areas that no faculty member, in their view, could be sufficiently prepared to teach. The rest is based largely on textbooks chosen, syllabi and course descriptions, regardless of faculty expertise. Clearly this was a matter of data convenience. It’s hard to figure out whether individual faculty members truly possess expertise in their fields, short of evaluating their individual academic backgrounds, research and writing on the topic.

But it is absurd for an organization that believes teacher quality in K-12 education is paramount, and content expertise critical, to ignore faculty expertise outright in its evaluations of teacher preparation institutions.

Here’s their FAQ on the long-term project of evaluating teacher preparation programs: http://www.nctq.org/p/response/evaluation_faq.jsp

Related reading (actual research):

Wolf-Wendel, L., Baker, B.D., Twombly, S., Tollefson, N., & Mahlios, M. (2006). Who’s Teaching the Teachers? Evidence from the National Survey of Postsecondary Faculty and Survey of Earned Doctorates. American Journal of Education, 112(2), 273-300.

Biddle me this? (or Flunkout Nation)

While I suspect few people have read or seen this post by RiShawn Biddle of Dropout Nation, I felt that it was worth mentioning because it presents such egregiously flawed logic coupled with flat-out factually incorrect and unsubstantiated claims. Sadly, this is what we have come to all too often in the current education reform debate. And this isn’t really about RiShawn Biddle as an individual or his blog and tweets, but rather about the propensity to argue important and complex issues in such crude terms and with so little knowledge or understanding of context and history.

In a recent post, Biddle argues that the NAACP is and has been heading up a misguided public policy agenda on behalf of black America (my characterization – perhaps not right on target). Biddle argues that NAACP President Benjamin Jealous should a) admit that arguments for more funding and more equal funding are wrong and have proven to fail, and b) instead embrace charter schools as a solution.

Now, this is a strange dichotomy to begin with – either fair and adequate funding or charter schools. Even from a charter advocacy perspective, one would presumably also want fair and adequate funding, including substantial funding targeted to high-need areas. That is, both, not either/or.

That logical point aside, Biddle then goes on to make his bold points to Jealous with the most absurd claims I’ve read, in well, about a week.

Here’s how Biddle explains the failures of legal challenges over school funding:

The NAACP has taken the wrong approach on school reform for far too long. The continuing dropout factory status of Newark, Kansas City, Mo., and other cities that have benefited from funding equity suits is clear evidence that this approach doesn’t spur any kind of reform.

I’ll set aside the equally absurd claims about integration that follow in the next sentence. But, let’s take a quick look at this claim.

First of all, Kansas City, Mo. never really benefited from a case over school funding. Rather, Kansas City, Mo. – this supposed poster child for failed school funding reform – saw a short term boost in funding while under court ordered desegregation. Funding litigation concurrent (1993) with the desegregation litigation had negligible effect on KCMSD funding. Later funding challenges in Missouri were found in favor of the state, producing no benefit to Kansas City, Missouri. In fact, from about 1995 to present, KCMSD funding has generally slid backwards.

Preston Green and I document the disconnect between desegregation litigation in Kansas City and claims of school funding failures in this article: Urban Legends, Desegregation and School Finance: Did Kansas City Really Prove That Money Doesn’t Matter? (which appeared in the Michigan Journal of Race and Law). Among other things, we note:

Critics cite the statistics the KCMSD spent more than $11,000 per pupil and that $2 billion were spent on the desegregation plan as evidence of exorbitant spending. When taken out of context, these numbers appear huge. However, our analysis reveals that the KCMSD was a very high spending district for no more than five years, or the time in which one cohort of children is able to progress through five grade levels in the district. Further, when adjusted for student needs, the KCMSD’s funding dropped below the metropolitan area average by 1998. This is hardly enough time to erase the generational poverty of the KCMSD or alter the residential structure and demographics of a school district that had been designed to be racially segregated until the 1960s.

Biddle also points to Newark, NJ as providing evidence of the failures of school finance reforms. Yet, New Jersey is among those states we discuss here as having some (albeit limited due to data quality) evidentiary basis for the positive effects of state school finance reforms, including court ordered reforms. In School Finance and Courts: Does Reform Matter, and How Can We Tell?, Kevin Welner and I discuss the many flawed claims about the dreadful failures of attempts to improve equity and adequacy of school funding. We also point out how the Kansas City case does not even fit into this category. Regarding the general, popular claims of the failures of funding reforms, Kevin Welner and I review the basis for those claims and conclude:

We conclude that there is arbitrariness in how research in this area appears to have shaped the perceptions and discourse of policymakers and the public. Methodological complexities and design problems plague finance impact studies. Advocacy research that has received considerable attention in the press and elsewhere has taken shortcuts toward desired conclusions, and this is troubling.

We also review more rigorous peer-reviewed studies and find, on balance, that those studies show positive effects of school finance reforms, both in terms of improving equity in student outcomes and in terms of improving the overall level of student outcomes.

Higher quality research, in contrast, shows that states that implemented significant reforms to the level and/or distribution of funding tend to have significant gains in student outcomes. Moreover, we stress the importance of the specific nature of any given reform: positive outcomes are likely to arise only if the reform is both significant and sustained.

And now for the truly ironic part of Biddle’s claim. Biddle’s argument to Jealous is that Jealous should drop all this funding equity and integration crap from NAACP’s past and focus instead on expanding access to charters. After all, if Kansas City, Missouri had not wasted all that time arguing in court over money and chasing more equitable funding, and had instead pursued an aggressive strategy of increasing the number of charter schools, kids in Kansas City – especially poor, minority kids – would have much better educational opportunities!

Biddle argues (regarding NAACP):

It must embrace the charter school movement: After all, charters have been the leading source of improving access to high quality education for urban black and Latino communities, who would otherwise be forced to attend the dropout factories in their neighborhoods. The success of charter school operators such as KIPP and Uncommon Schools — all of which educate mostly-minority students — can be replicated throughout the nation.

I wouldn’t have even written this response (to a post not really worthy of response) had it not been for the fact that I addressed this very topic the other day. As funding for KCMSD was receding as the district moved toward unitary status, what did happen in KCMSD? The massive expansion of charter schools! Here’s what I wrote the other day, upon release of a very interesting report from Kauffman Foundation regarding educational opportunity in Kansas City and the role of charter schools:

Kansas City is #4 on charter market share, according to the National Alliance report, and rose to that position much earlier in the charter proliferation era than other cities. As a result, by reformy logic, Kansas City should be a hotbed for educational opportunity for school-aged children – after years of previously throwing money down the drain in the Kansas City Missouri Public School District (many of these claims actually being Urban Legend).

In Kansas City, the reality of charter expansion has clashed substantially with the reformy ideology. Arthur Benson in a recent Kansas City Star Op Ed, noted:

Charters have subtle means for selecting or de-selecting students to fit their school’s model. The Kansas City School District keeps its doors open to non-English speakers and all those kids sent back from the charter schools. In spite of those hurdles, Kansas City district schools across the board out-perform charter schools. That is not saying much. We have until recently failed 80 percent of our kids, but most charters fail more.

I was initially curious about Benson’s (a district board member and attorney) claims that charters have done so poorly in Kansas City. Could it really be that the massive expansion of charter schools in Kansas City has done little to improve and may have aided in the erosion of high quality educational opportunities for Kansas City children?

The recent Kauffman Foundation report draws some similar conclusions, and Kauffman Foundation has generally been an advocate for charter schools. The report classifies district and charter schools into groups by performance, with level 4 being the lowest, and level 1 being the only acceptable group.

  • Level I – A school that met or exceeded the state standard on the MAP Communication Arts and Mathematics exams in 2008-2009.
  • Level II – A school that scored between 75 and 99 percent of the state standard on the MAP Communication Arts and Mathematics exams in 2008-2009.
  • Level III – A school that scored between 50 and 74 percent of the state standard on the MAP Communication Arts and Mathematics exams in 2008-2009.
  • Level IV – A school that scored below 50 percent of the state standard on the MAP Communication Arts and Mathematics exams in 2008-2009.
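
Mechanically, the Kauffman classification is just a threshold rule on a school’s score expressed as a percent of the state standard. A minimal sketch in Python (the function name and the handling of exact boundary values are my assumptions, not the report’s):

```python
def kauffman_level(pct_of_standard: float) -> int:
    """Classify a school by its score as a percent of the state standard,
    following the report's four tiers (boundary handling assumed)."""
    if pct_of_standard >= 100:
        return 1  # met or exceeded the state standard
    elif pct_of_standard >= 75:
        return 2  # 75-99 percent of standard
    elif pct_of_standard >= 50:
        return 3  # 50-74 percent of standard
    else:
        return 4  # below 50 percent of standard

# Example: the report counts most KC charter students in Level IV schools
print(kauffman_level(110), kauffman_level(80), kauffman_level(60), kauffman_level(45))
```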

Among other things, the report found that charter operators had avoided opening schools in the neediest neighborhoods. Rather, they set up shop in lower need neighborhoods, potentially exacerbating disparities in opportunities across the city’s zip codes. The report recommended:

A strategy for charter school growth should be developed by Kansas City education leaders. Charter schools should only be approved by DESE if they can demonstrate how they intend to fill a geographic need or a specific void in the communities they intend to serve.

Regarding charter performance more generally, the report noted:

In many communities charter schools are a model that increases students’ access to better public schools, but the majority of charter school students (5,490 or 64.7 percent) are in a Level IV school. Many of Kansas City’s charters have existed for 10 years and are still not able to reach even half of state standard.

Now, I’m not sure I accept their premise that in many communities this model actually works – and that it just went awry for some strange reason in Kansas City. That said, the reality in Kansas City, by the authors’ own acknowledgment, is in sharp contrast with the reality the authors believe exists in other cities.

One implication (not tested directly) of this report is that the massive charter school expansion that occurred in Kansas City may have done little or nothing to improve the overall availability or distribution of educational opportunities for children in that city and may have actually made things worse.

So, Mr. Biddle, I urge you to do a little reading. Check a few facts and consider your arguments carefully. Your current arguments FLUNK at even the most basic level.

More importantly, others who come across bombastic claims like those argued by Biddle should scrutinize them carefully. Heck, I hope you scrutinize the stuff in my own posts carefully too. I try to shoot for a reasonably high level of rigor and factual accuracy in these posts and do what I can to cite my claims to respectable sources. Biddle’s blog post to Benjamin Jealous is, to me, an example of the worst form of ill-conceived, factually incorrect, contorted reform logic out there. Sadly, there’s way too much of it.

READINGS

Baker, B.D., & Welner, K. (2011). School Finance and Courts: Does Reform Matter, and How Can We Tell? Teachers College Record, 113(11).

Green, P.C., & Baker, B.D. (2006). Urban Legends, Desegregation and School Finance: Did Kansas City Really Prove That Money Doesn’t Matter? Michigan Journal of Race and Law, 12(1), 57-105.

Intellectual Pathologies of the Reformy World (Kevin vs. Kevin)

Yesterday, a colleague and coauthor on two recent articles – Kevin Welner (U. of Colorado) – wrote a scathing critique of the manifesto on fixing urban schools that was released last week by several large city superintendents.

Kevin Welner’s commentary can be found here: http://voices.washingtonpost.com/answer-sheet/guest-bloggers/manifesto-should-be-resignatio.html

The manifesto can be found here: http://www.washingtonpost.com/wp-dyn/content/article/2010/10/07/AR2010100705078.html

Kevin Carey notes in his critique of Kevin Welner:

I highlight this because it’s crucial to understanding the worst intellectual pathologies of the education establishment. People like Welner don’t just think that Joel Klein, Michele Rhee, Andres Alonso, and Arlene Ackerman are making bad decisions in the course of helping poor children learn. Welner believes that by asserting that poor children can learn, the superintendents are hurting the cause of making poor children less poor. While many people believe this, most choose not to say it so clearly.

http://www.quickanded.com/2010/10/the-supposed-trouble-with-helping-poor-students-learn.html

I urge you to take a look at what Kevin Welner actually said in his commentary. The centerpiece of Kevin Welner’s argument was that the superintendents and others behind the manifesto were making a strong sales pitch for fast-tracking education reform strategies for which the research base is mixed at best. Kevin Welner asks:

Are these adults acting responsibly when they advocate for even more test-based accountability and school choice? Over the past two decades, haven’t these two policies dominated the reform landscape – and what do we have to show for it? Wouldn’t true reform move away from what has not been working, rather than further intensifying those ineffective policies? Are they acting responsibly when they promote unproven gimmicks as solutions?

Are they acting responsibly when they do not acknowledge their own role in failing to secure the opportunities and resources needed by students in their own districts, opting instead to place the blame on those struggling in classrooms to help students learn?

And Kevin Welner summarizes the manifesto as follows:

Move money from neighborhood schools to charter schools!
Make children take more tests!
Move money from classrooms to online learning!
Blame teachers and their unions – make them easier to fire!
Tie teacher jobs and salaries to student test scores!

Explaining:

None – literally NONE – of these gimmicks is evidence-based.

I tend to agree that the findings on expansion of charters are mixed at best, and that tying teacher ratings to test scores is deeply problematic. Perhaps what irked Kevin Carey most here is that he has convinced himself, through exceedingly flimsy logic, that he, Kevin Carey, is right, and that the other Kevin, Kevin Welner, is simply wrong on these points. Allow me to bring you back to a series of recent comments by Kevin Carey that display his completely distorted understanding of research on charters (and its implications for policy) and of the usefulness of value-added modeling to rate teachers.

Kevin Carey on Charters

Here’s a recent quote from Kevin Carey, attacking the civil rights framework on whether the evidence supports expansion of charter schools.

Here’s the problem: the contention that charters have “little or no evidentiary support” rests on studies finding that the average performance of all charters is generally indistinguishable from the average regular public school. At the same time, reasonable people acknowledge that the best charter schools–let’s call them “high-quality” charter schools–are really good, and there’s plenty of research to support this.

http://www.quickanded.com/2010/08/evidence-and-the-civil-rights-group-framework.html

I have noted previously, here, that I find this to be one of the most patently stupid arguments I think I’ve seen in a long time.

To put it in really simple terms:

THE UPPER HALF OF ALL SCHOOLS OUTPERFORM THE AVERAGE OF ALL SCHOOLS!!!!!

or … Good schools outperform average ones. Really?

Why should that be any different for charter schools (assuming a similar distribution), given that their average performance is similar to that of all schools?

This is absurd logic for promoting charter schools as some sort of unified reform strategy – saying, in effect, that we want to replicate the best charter schools (not that other half that don’t do so well).

Yes, one can point to specific analyses of specific charter models adopted in specific locations and identify them as particularly successful. And, we might learn something from these models which might be used in new charter schools or might even be used in traditional public schools.

But the idea that “successful charters” (the upper half) are evidence that charters are “successful” is just plain silly.

Kevin Carey on Value-Added Teacher Ratings

In the New York Times Room for Debate series on value-added measurement of teachers, Carey argued that value-added measures would protect teachers from favoritism: principals would no longer be able to go after certain teachers based on their own personal biases, and teachers would be able to back up their “real” performance with hard data. Here’s a quote:

“Value-added analysis can protect teachers from favoritism by using hard numbers and allow those with unorthodox methods to prove their worth.” (Kevin Carey, here)

The reality is that value-added measures simply create new opportunities to manipulate teacher evaluations through favoritism. In fact, it might even be easier to get a teacher fired by making sure that teacher has a weak value-added scorecard. Because value-added estimates are sensitive to non-random assignment of students, principals can easily manipulate the distribution of disruptive students, students with special needs, students with weak prior growth and other factors which, if not fully accounted for by the VA model, will bias teacher ratings. More here!

Kevin Carey also claims, as a matter of accepted fact, that VA measures “level the playing field for teachers who are assigned students of different ability.” This statement, as a general conclusion, is wrong.

  1. VA measures do account for the initial performance level of individual students – or they would not be VA measures. But even this becomes problematic when measures are annual rather than fall/spring, so that summer learning loss is included in the year-to-year gain. An even more thorough approach for reducing model bias is to use multiple years of lagged scores on each child in order to estimate the extent to which a teacher can change a child’s trajectory (growth curve). That makes it more difficult to evaluate 3rd or 4th grade teachers, for whom many lagged scores aren’t yet available. The LA Times model may have had multiple years of data on each teacher, but it didn’t have multiple lagged scores on each child. All that the LA Times approach does is generate a more stable measure for a teacher – even if it is merely a stable measure of the bias of whichever students that teacher is typically assigned.
  2. VA measures might crudely account for socioeconomic status, disability status or language proficiency status, which may also affect learning gains. But typical VA models, like the LA Times model by Buddin, tend to use relatively crude, dichotomous proxies for these characteristics. They don’t effectively capture the range of differences among kids. They don’t capture numerous potentially important, unmeasured differences. Nor do they typically capture classroom composition – the peer group effect – which has been shown to be significant in many studies, whether measured by the racial/ethnic/socioeconomic composition of the peer group or by the average performance of the peer group.
  3. For students who have more than one teacher across subjects (and/or teaching aides/assistants), each teacher’s VA measure may be influenced by the other teachers serving the same students.
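
The sensitivity to non-random assignment is easy to demonstrate with a toy simulation: give two hypothetical teachers identical true effects, but hand one of them students with systematically weaker unmeasured growth trajectories, and a naive gain-score “value-added” measure separates them anyway. Every number below is invented for illustration:

```python
from statistics import mean

TRUE_TEACHER_EFFECT = 5.0  # identical for both teachers

def year_end_score(prior, trajectory):
    """Toy model: gain = teacher effect + student's unmeasured trajectory."""
    return prior + TRUE_TEACHER_EFFECT + trajectory

# Teacher A is assigned students with weaker unmeasured growth trajectories
# (e.g., more summer learning loss); Teacher B gets stronger ones.
students_a = [(600, -4.0), (610, -3.0), (590, -5.0)]  # (prior score, trajectory)
students_b = [(600, +4.0), (610, +3.0), (590, +5.0)]

naive_va_a = mean(year_end_score(p, t) - p for p, t in students_a)
naive_va_b = mean(year_end_score(p, t) - p for p, t in students_b)

print(naive_va_a, naive_va_b)  # 1.0 vs 9.0, despite identical true effects
```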

I could go on, but recommend revisiting my previous posts on the topic where I have already addressed most of these concerns.

Intellectual pathologies?  Pot… kettle?


New from the Center on Inventing Research Findings

The other day, the Center on Reinventing Public Education (CRPE) at the University of Washington released a bold new study claiming that Washington school districts underpay math and science teachers relative to other teachers – clearly an abomination in a state that is home to high-tech industries like Boeing and Microsoft.

The study consisted of looking at the average salaries of math and science teachers and other teachers in several large Washington State school districts and showing that in most, the average for math and science teachers is lower than for other teachers. As it turns out, the average experience of math and science teachers is lower and far more of them are in their first five years. So, it’s mainly about the experience differential. The authors infer from this that turnover of math and science teachers must be higher, but never actually test this assumption. They next infer that this turnover must be a function of having less competitive salaries – relative to what they could earn outside of teaching.

The study never calculates relative turnover of math and science versus other teachers. Rather, the study implies that lower average experience levels must be indicative of higher turnover. The only follow-up analysis on this point is to show that math and science teachers, in addition to being less experienced, are also younger. Wow! That doesn’t validate the turnover claim though, which may be true… but no validation here.

This is a silly study to begin with, but check out the not-so-subtle difference between the press release and the study itself.

The Press Release
http://www.crpe.org/cs/crpe/view/news/111

The analysis finds that in twenty-five of the thirty largest districts, math and science teachers had fewer years of teaching experience due to higher turnover—an indication that labor market forces do indeed vary with subject matter expertise. The subject-neutral salary schedule works to ignore these differences.

The Study
http://www.crpe.org/cs/crpe/download/csr_files/rr_crpe_STEM_Aug10.pdf

That said, the lower teacher experience levels are indicative of greater turnover among the math and science teaching ranks, lending support to the hypothesis that math and science teachers may have access to more compelling non-teaching opportunities than do their peers. (p. 5)

Both are a stretch given the thin analysis, but the press release declares outright that turnover is the issue, while the study merely infers it without ever testing or validating.

The study goes on to indict paying teachers more for years of experience (because we all know that experience doesn’t matter?) and argues that differential pay by teaching field is the answer. This is an absurd false dichotomy. Even if it is reasonable to differentiate pay by teaching field, that does not mean it is unreasonable to differentiate by experience, or that taking dollars away from experience-based pay is the only way to differentiate by field.

I happen to agree that there exist significant problems with Washington’s statewide teacher salary schedule, and that, among other things, math and science teachers in Washington State are disadvantaged in the broader labor market. But the CRPE study does nothing to advance this argument.

Previous work by Lori Taylor, of Texas A&M does:

Report on Taylor Study:

http://www.wsipp.wa.gov/rptfiles/08-12-2201.pdf

Taylor Study:

http://www.leg.wa.gov/JointCommittees/BEF/Documents/Mtg11-10_11-08/WAWagesDraftRpt.pdf

The CRPE study goes further to say that the findings indicate that school districts haven’t taken seriously a state policy initiative to increase investment in math and science teaching. So let’s say that the bill to which the CRPE press release refers – House Bill 2621 – really did stimulate districts to step up their efforts to hire more math and science teachers. What would likely happen to math and science teacher average salaries? Well, many new math and science teachers would enter the system. That would alter the experience distribution of math and science teachers – they would likely become less experienced on average – and hence their average salaries would decline and be lower than average salaries in other fields not stimulated by similar initiatives.
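
That last chain of reasoning is pure arithmetic, and a tiny sketch makes it concrete. Assume (hypothetically) a single salary schedule applied to every field alike; a hiring push in math/science alone drags down that field’s average experience and, with it, the field’s average salary, even though no one is paid off a different schedule:

```python
from statistics import mean

def schedule_salary(years_experience):
    """One hypothetical salary schedule applied to every field alike."""
    return 40_000 + 1_500 * years_experience

math_sci_exp = [10, 12, 8, 15, 5]  # existing math/science staff (years)
other_exp    = [10, 12, 8, 15, 5]  # other fields, identical profile

# A hiring push adds five brand-new math/science teachers (0 years)
math_sci_exp += [0, 0, 0, 0, 0]

avg_math_sci = mean(schedule_salary(y) for y in math_sci_exp)
avg_other    = mean(schedule_salary(y) for y in other_exp)

print(avg_math_sci, avg_other)  # the math/science average is now lower
```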

When I get a chance, I’ll try to play around with my Washington teacher data set and post some follow-up analyses.

Kevin Welner and I point to similar misrepresentations of findings from several reports from this same center in this article on within and between-district financial disparities:

Baker, B. D., & Welner, K. G. (2010). “Premature celebrations: The persistence of interdistrict funding disparities” Educational Policy Analysis Archives, 18(9). Retrieved [date] from http://epaa.asu.edu/ojs/article/view/718

And now, for some fun follow-up figures:

These figures use individual teacher level data from the State of Washington. I include all teachers holding “secondary” assignments and identify teachers certified to teach biology, chemistry, physics, general science and math (and all subcategories) using the certification record files on the same teachers. Note that some teachers in the data set hold multiple assignments, so the total numbers of cases in these graphs are not an exact match for the total number of individual teachers. I haven’t asked for Washington teacher data for a few years, so these only go up to 2006-07. Unlike the CRPE report, which cherry-picks 30 districts, I use the whole state. If I get a chance, I’ll play with some other cuts at the data. These data don’t coincide at all with the CRPE “findings.”

Here are the experience differences:

Here are the salary differences, on average, which coincide with the experience differences:

Now, here are the total numbers of teachers, and apparent decline in share that are math/science certified over this time period. Math/science teachers were relatively flat, while others grew.

Finally, here’s a portion of the regression model of certified base salaries, where I control for degree level, experience, year, hours per day and days per year, all of which influence salaries. Interestingly, this regression shows that math and science teachers, holding all that other stuff constant, made about $380 more than non-math/science teachers, even under the fixed salary schedule.
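
“Holding all that other stuff constant” amounts to comparing math/science and non-math/science teachers at the same levels of the controls. Here’s a stripped-down sketch of the idea using stratified group means rather than a full regression; the salary function and the $380 differential are synthetic, built to mirror the finding, and the real model also controls for degree level, year, hours per day and days per year:

```python
from statistics import mean

# Synthetic salaries built so that, at any given experience level,
# math/science teachers earn exactly $380 more (invented for illustration).
def salary(exp, math_sci):
    return 40_000 + 1_500 * exp + (380 if math_sci else 0)

teachers = [(exp, ms) for exp in (2, 5, 10, 20) for ms in (True, False)]

# Within each experience stratum, compare group means, then average the gaps.
gaps = []
for e in (2, 5, 10, 20):
    ms_mean = mean(salary(x, m) for x, m in teachers if x == e and m)
    other_mean = mean(salary(x, m) for x, m in teachers if x == e and not m)
    gaps.append(ms_mean - other_mean)

print(mean(gaps))  # the experience-adjusted differential: 380
```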


Newsflash: The upper half is better than average!

I’ve seen many versions of this argument in the past year, but this one comes from Kevin Carey in response to the Civil Rights Framework which criticized the current administration’s overemphasis on Charter Schools as lacking evidentiary support. Carey responds that the Civil Rights Framework selectively interprets the research on Charter schools, noting:

Here’s the problem: the contention that charters have “little or no evidentiary support” rests on studies finding that the average performance of all charters is generally indistinguishable from the average regular public school. At the same time, reasonable people acknowledge that the best charter schools–let’s call them “high-quality” charter schools–are really good, and there’s plenty of research to support this.

http://www.quickanded.com/2010/08/evidence-and-the-civil-rights-group-framework.html

I recall a similar comment in the media a few months back, by a researcher, regarding a national charter schools study – something to the effect of – Charter schools on average performed similarly to traditional public schools, but if we look at the upper half of the charter schools in the sample, they substantially  outperformed the average public school serving similar students.

These statements have been driving me crazy for months now. Here’s why –

To put it in really simple terms:

THE UPPER HALF OF ALL SCHOOLS OUTPERFORM THE AVERAGE OF ALL SCHOOLS!!!!!

or … Good schools outperform average ones. Really?

Why should that be any different for charter schools (assuming a similar distribution), given that their average performance is similar to that of all schools?

This is absurd logic for promoting charter schools as some sort of unified reform strategy – saying, in effect, that we want to replicate the best charter schools (not that other half that don’t do so well).

Yes, one can point to specific analyses of specific charter models adopted in specific locations and identify them as particularly successful. And, we might learn something from these models which might be used in new charter schools or might even be used in traditional public schools.

But the idea that “successful charters” (the upper half) are evidence that charters are “successful” is just plain silly.
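
The tautology can be demonstrated in a few lines: construct any two samples with identical means, and the upper half of either one will “outperform” the overall average, necessarily. The scores below are made up:

```python
from statistics import mean

# Two made-up score distributions with identical averages -
# think "charters" and "traditional publics".
charters    = [30, 45, 50, 55, 70, 50]
traditional = [35, 40, 50, 60, 65, 50]

overall_mean = mean(charters + traditional)

# The "high-quality" charters: just the upper half of the distribution.
upper_half = sorted(charters)[len(charters) // 2:]

print(overall_mean, mean(upper_half))
# The upper half beats the overall average - necessarily, not remarkably.
```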

=======

Let’s throw a few visuals and numbers on my whining session above. Below are some snapshots of New York City charter schools. First, let’s take a quick look at the mismatched demographics of New York City charters compared to same grade level traditional public schools. Here are the Free Lunch rates. I’ve tended to focus on Free Lunch rates rather than Free and Reduced, because Free Lunch falls under a lower poverty threshold, and, as my previous analyses have shown, while charters often serve similar numbers of combined free and reduced lunch children, they tend to serve the less poor among the poor (larger reduced shares, smaller free shares). This graph confirms my previous findings, and is based on data corroborated from both the NCES Common Core Public School Universe Data from 2007-08 and the New York State Education Department School Report Cards. Note also that the biggest differences are at the elementary level, which covers most of the charter schools.

Second, let’s look at the rates of children who are limited in their English Language proficiency. Here, the differences at the elementary level are huge! Charters in NYC simply don’t serve limited English proficient children!

Now for a few oversimplified scatterplots comparing charter school performance outcomes to traditional public schools – all “Regular Schools” by the school type classification in the NCES Common Core – and compared against those in the same borough. I’ve focused on Brooklyn and the Bronx here because of the wide variations in student population composition across Manhattan schools.

First note that none of the charters in the Bronx which had 8th grade 2009 test scores available had a free lunch rate over 80%, while several traditional public schools in the Bronx did. This chart shows the relationship between % scoring level 4 (top level) and % qualifying for free lunch. Charters are named and shown in red. Traditional publics are hollow circles. Both groups scatter! In fact, there are a few traditional publics at the top (which may be classified as “regular schools” but may be far from regular). Among Charters, Bronx Prep, KIPP Academy and Icahn 1 do rather well. Hyde Leadership (higher poverty than the other charters) and Harriet Tubman – not so well. But there are plenty of traditional public schools in the Bronx that appear to do well, and others not so well.

Here are the Brooklyn charters and traditional public schools on the same outcome measure – percent scoring level 4 or higher on 8th grade math. Here, all but Brooklyn Excelsior Charter have much lower poverty rates and simply aren’t comparable to most Brooklyn traditional public schools. And don’t forget, there are also likely very large differences in rates of children with other needs – like limited English proficiency. Williamsburg Collegiate and Brooklyn Excelsior appear to be doing quite well. But then again, Williamsburg Collegiate starts at 5th grade, so its success is likely at least partly a function of feeder schools. There are plenty of “high flying” traditional public schools in this picture as well, and likely a few unique explanations as to why they fly so high. There are also plenty of low-flying charters.

Here are the Bronx charters in 2009, on 5th grade math. Again, the charters generally have much lower free lunch rates than the traditional public schools. In this figure, most of the traditional public schools have free lunch rates over 80 percent, while none of the charters do. And again, charter performance, like traditional public school performance, is scattered – some low, some high.

And finally, those Brooklyn charters on 5th grade math performance. Low poverty and scattered (except Brooklyn Excelsior, which is higher poverty and seemingly doin’ pretty well).

A few new ones: here are the Bronx and Brooklyn charter 5th grade performance levels based on a regression model controlling for stability rates, free lunch, ELL concentrations, and year of data (using 2008 and 2009), comparing each school specifically against other schools in the same borough. The performance levels are represented by the residuals of the regression model. Above “0” on the vertical axis means “better than predicted” – better than average at the school’s given characteristics – and below “0” means worse than predicted at given characteristics.

In these graphs, most of the highest high flyers are non-charters. Charters are split above and below the “0” line, as one might expect.
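For readers curious about the mechanics, the residual approach described above can be sketched roughly as follows. All variable names and the synthetic data here are my own illustration, not the actual model specification or the NYC data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # synthetic schools

# Hypothetical school characteristics (stand-ins for the real covariates)
free_lunch = rng.uniform(0.2, 1.0, n)   # share qualifying for free lunch
ell = rng.uniform(0.0, 0.4, n)          # share limited English proficient
stability = rng.uniform(0.7, 1.0, n)    # stability rate
year_2009 = rng.integers(0, 2, n)       # indicator for 2009 vs. 2008 data

# Synthetic outcome: mean scale score depressed by poverty/ELL, plus noise
score = 680 - 40 * free_lunch - 25 * ell + 10 * stability + rng.normal(0, 5, n)

# OLS with an intercept; residuals are actual minus predicted performance
X = np.column_stack([np.ones(n), free_lunch, ell, stability, year_2009])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
residuals = score - X @ beta

# A school above 0 beat its prediction given its characteristics
above = residuals > 0
```

With an intercept in the model, the residuals average to zero by construction, so any set of schools drawn from the sample tends to split above and below the “0” line – which is why a roughly even split among charters is exactly what one might expect.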

Anyway, based on this cursory walk-through of the relative demographics and relative position of charters in the performance mix, it continues to elude me why we should be considering “charters” a specific reform strategy – one that can raise urban school districts from their dreadful depths of failure. Had I not indicated which schools were charters in these graphs, I wonder how many “reformy” types could have picked out the dots that were charters. I suspect, given a blind sample, they would select the dots that fall furthest out of line in the upper right-hand corner of each graph – the highest performing high-poverty schools. In three of the above four graphs, they’d have picked non-charters first, and would have done so on the misguided perceptions that a) charters are the high flyers in any mix of schools and b) charters serve very high poverty populations. The reality is that charters are as scattered as traditional schools, and in general in NYC, they are serving lower need populations.

=======

A little more fun here. Here are schools in the area around the Harlem Children’s Zone. First, here are the maps of free lunch shares and LEP shares for charter and traditional public schools.  Green dots have lower rates of LEP or free lunch. Stars indicate charters. Names are adjacent to schools. Note that most of the charters are lower poverty and much lower LEP than surrounding schools.

And here are the residuals of the same regression model used above, applied in this case to Grade 5 Math Mean Scale Scores. Red dots are schools that perform less well than expected and green dots are those that perform much better than expected. Note that charters are a mixed bag, and the HCZ charter performs particularly poorly – which caught me off guard.

Another “You Cannot be Serious!” The demise of private sector preschool in New Jersey?

There is little I find more enjoyable than boldly stated claims that are entirely unsubstantiated… but where the data to test those claims are relatively accessible.

This week, the Governor’s Task Force on Privatization in New Jersey released their final report on the virtues of privatization for specific services. I took particular interest in the claims made about preschool in New Jersey. Preschool programs were expanded significantly with public support for both public and private programs for 3 and 4 year olds following the 1998 NJ Supreme Court ruling in Abbott v. Burke. For more information on the rulings and Abbott pre-school programs, see: http://www.edlawcenter.org/ELCPublic/AbbottPreschool/AbbottPreschoolProgram.htm

Here are the claims made in the privatization report:

•At the program’s inception, nearly 100 percent of students were served by providers in the private sector, many of which are women‐and minority‐owned businesses. Now, approximately 60 percent are served by private providers, as traditional districts have built preschools at great public expense and unfairly regulated their private‐sector competitors out of business.

•There are currently two sets of state regulations governing pre‐k. The majority of private pre‐k providers are subject to Dept. of Children and Families (DCF) regulations, but private pre‐k providers working in the former Abbott districts and serving low‐income children in some other districts are subject to the regulation of the DOE and the respective districts themselves, effectively crowding out the private sector and driving up costs to the taxpayer without any documented benefit to the children they serve.

To summarize, the over-subsidized public option of Abbott preschool has decimated the private preschool market in New Jersey, adding numerous women and minority business owners to the unemployment rolls since the program was implemented (okay… a bit extreme… but I suspect you’ll hear it spun this way… since the above language isn’t far off from this).
The last time I read something this silly was in a research report from The Reason Foundation regarding “weighted student funding.” Not surprisingly, the Reason Foundation is among the only sources cited for… anything… in this report on the virtues of privatization (see page 4).

In this post, I’ll address two issues:

First, I address whether the claim that private preschool enrollment has dropped is true. Has private preschool in New Jersey actually been decimated since the 1998 Abbott decision? Are there that many fewer slots in private versus public preschools than before that time? Have public programs continued to grow while private programs have been eliminated? Has private preschool enrollment declined at any greater rate than private school enrollment generally, if at all?

Second, I revisit some of my previous findings about private versus public school markets, cost and quality. The recommendation that follows from the above claims is that the state, instead of continuing to subsidize expensive Abbott preschool programs, should allow any private provider to participate without Abbott regulation. This, it is assumed, would dramatically reduce costs. Rather, this might reduce expenditures… and the quality of service along with it. Lower spending (not cost) private providers simply don’t and can’t offer what higher spending providers do. Cost assumes specific quality, and lower “cost” assumes that less can be spent for the same quality. In this case, quality is being ignored entirely (or assumed entirely unimportant). That is, the proposed plan of allowing any private provider to house “preschool” students would likely be the equivalent of subsidized “daycare” (minimally compliant with Dept. of Children and Families (DCF) regulations) and not actual “pre-school.”

Issue 1

For these first four figures, I use data from the U.S. Census Bureau’s Integrated Public Use Microdata System (IPUMS) – one of my favorites. Specifically, I evaluate the school enrollment patterns of 3 and 4 year olds in New Jersey from 1990 to 2008, by school type. Note that Census IPUMS data are actually not great for evaluating parent responses to the “school” enrollment question for 3 and 4 year olds, because in many cases a parent will identify their child as being in “school” even if the child is merely in daycare – home based, non-instructional, or any other type of daycare. This is not hugely problematic here, because the report on privatization assumes that home based daycare, or anything registered with DCF to supervise children during the day, qualifies as a pre-school. If anything, there may be under-reporting of private enrollment in these data by parents who actually don’t consider their private daycare to be “school.”

For 3 year olds, from 1990 to 2000, both public and private enrollment increase, while non-enrollment decreases. Public and private enrollment then stay relatively steady, except for an apparent increase in private enrollment in 2008 (I’m not confident in this bump, having seen other odd jumps between 2007 and 2008 IPUMS data). In any case, it would not appear that public enrollment has continued to severely squeeze out the private marketplace, unless we were to assume that the private market would have absorbed the entirety of the reduction in non-enrollment. The lack of substantive shift from 2000 to 2008, with privates, if anything, increasing their share, suggests that public subsidies have not led to the collapse of the private preschool market.

The next two figures show the enrollment patterns for 4 year olds. In general, 4 year olds are more likely to be enrolled in school, public or private, and less likely to be non-enrolled. As with 3 year olds, there really aren’t any substantive changes to the relative enrollment of 4 year olds in public and private settings between 2000 and 2008. No collapse of the private market here.
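The share calculations behind these figures amount to weighted tabulations of the person-level microdata. A minimal sketch of the logic, using made-up records and weights rather than actual IPUMS values:

```python
from collections import defaultdict

# Hypothetical person-level records: (enrollment status, person weight).
# The statuses mirror the three categories in the figures.
records = [
    ("public", 120.0), ("public", 95.0), ("private", 110.0),
    ("private", 80.0), ("not_enrolled", 150.0), ("not_enrolled", 45.0),
]

# Sum the survey weights within each enrollment category
totals = defaultdict(float)
for status, weight in records:
    totals[status] += weight

# Each category's share of the weighted population total
grand_total = sum(totals.values())
shares = {status: w / grand_total for status, w in totals.items()}
```

The shares sum to 1 by construction; the real analysis simply repeats this tabulation separately by survey year and by age (3 vs. 4) to trace the trend lines in the figures.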


As an alternative, I explore the enrollment of private schools which provide pre-kindergarten programs statewide, using the National Center for Education Statistics Private School Universe Survey. Using this data set, we can determine whether the number of enrollment slots at the preschool level among private providers has declined, and whether the decline in private preschool enrollment has been greater than the decline in private school enrollment more generally.  Note that much has been made of the “collapse” of private schooling in New Jersey in the context of the New Jersey Opportunity Scholarship Act.

This figure shows that private school enrollment generally has declined more than private preschool enrollment since 2000. Private preschool enrollment has remained relatively stagnant statewide from 2002 to 2008. No real collapse of private preschools evident here.

Issue 2

As I noted above, preschool might be defined in many different ways. On the one hand, we might wish to consider preschool to be any place that meets minimum health and safety guidelines for caring for children between the ages of 3 and 4. To me, that sounds more like daycare. Alternatively, preschool might actually involve specific curriculum and activities as well as training for personnel, etc. Obviously, these differences in definition can and likely do significantly influence the cost per child of offering the service. If I can hire high school graduates, rely heavily on parent volunteers, and use only minimally compliant physical space to supervise children at play – mix in story time – I can likely do things relatively cheaply. On the other hand, if I actually have to hire teachers who hold college degrees, provide a specific curriculum, and have appropriate physical spaces in which to do those things, it’s likely going to get more expensive – publicly or privately provided. It’s not so much about whether it’s publicly or privately provided, but whether there are minimum expectations for what defines “preschool.”

The elementary and secondary private school market is highly stratified by price and quality, as I have discussed on many previous occasions. YOU GET WHAT YOU PAY FOR. Yeah… I know that clashes with the appealing logic that private providers always do more with less…. thwarting the “you get what you pay for” assumption… or even reversing it… ‘cuz private providers do so much more with so much less. But let’s look again at one of my favorite summaries – with a new presentation – of the private school market. Here’s the earlier version.

This figure lines up the national average (regionally cost adjusted for each regional cluster) a) per pupil spending, b) pupil to teacher ratios and c) percentage of teachers who attended competitive undergraduate colleges, for private schools by private school type. Public school expenditures sit right near the middle. The small group of Catholic schools in the national sample sit right alongside public schools (the system of Catholic schools has evolved to look much like their public school counterparts over time). Independent schools spend nearly twice what public schools spend, have much smaller class sizes and have very high percentages of teachers who attended competitive undergraduate colleges. Hebrew and Jewish day schools lie about halfway between the elite privates and the public and Catholic schools. At the other end of the private school market are conservative Christian schools, which spend much less per pupil than public or Catholic schools. They do have somewhat smaller class sizes, but have very poorly paid teachers, and few if any teachers who attended competitive colleges. For more on these comparisons, see: https://schoolfinance101.wordpress.com/2010/02/20/stossel-coulson-misinformation-on-private-vs-public-school-costs/. In short, this figure shows that even in the k-12 marketplace, private providers are very diverse, some offering small class sizes and highly qualified teachers for a much higher price than public schools, and others offering much less.

We can certainly expect at least as much variation in the private preschool marketplace, if not one-heck-of-a-lot more, since many private daycare facilities require little or no formal training and no college degree for their employees.

As an aside, I was driving down Route 202 the other day west of Somerville Circle and noticed that they are putting in a Creme-de-la-Creme “daycare/preschool.” We had one around the corner from our house in Leawood, KS. I suspect that few of the Abbott preschool facilities built at such great expense compare favorably to a “Creme” facility – with waterpark (we’re talking slides, fountains), mini tennis court, indoor fish pond, tv studio, etc. (at least that’s what the one in Leawood had; I expect nothing less here). I expect that many parents, having toured many other “less desirable” daycares and preschools, will decide that their child deserves the “Creme” lifestyle (I suspect that there are actually other options with better curriculum and perhaps better teachers in the area, but I have not had the occasion to research it). It’s just an extreme example of the diversity of the private preschool marketplace. I suspect the cost per pupil will far exceed that of the Abbott preschools (heck… it already exceeded $12k per year in Kansas several years ago).

To summarize, the Task Force report on privatization makes bold claims about Abbott preschool programs crowding out and decimating private preschool programs, many run by women and minority business owners. But the Task Force report does not bother to substantiate a) that private preschools have actually suffered, or b) that any that had suffered were actually owned and operated by women or minorities. The only “evidence” the report has to offer is the undocumented claim that 100% of kids were in private programs and now only 60% are. Where does that come from? What the heck is that? 100% of whom? 60% of what?

Further, the Task Force report is willing to assume that warehousing 3 and 4 year olds under the supervision of high school graduates in physical spaces and with supervision ratios compliant with DCF regulations is sufficient for low-income and minority children… or rather… that it is the lower cost option with equivalent quality to Abbott pre-school programs (public or publicly regulated private). It is critically important that we acknowledge the difference in the quality or even type of service received at different price points. Like the private K-12 market, the private preschool market varies widely, and spending much less generally means getting much less.

=====

See also, the Abbott 5th year report: http://edlawcenter.org/ELCPublic/Publications/PDF/PreschoolFifthYearReport.pdf

Manual for Child Care Centers from DCF in NJ: http://www.nj.gov/dcf/divisions/licensing/CCCmanual.pdf

Can’t forget this:

The Gist Twist(s) & Rhode Island School Finance

So, I’ve tried not to… but I’ve been following the relatively uninformed debate over Rhode Island’s nifty new Foundation Aid formula on the National Journal “Experts” Blog.

http://education.nationaljournal.com/2010/06/a-funding-formula-for-success.php#comments

Yep, Rhode Island has invented the… wheel… or perhaps bread… one or the other. Pretty much a run-of-the-mill foundation aid formula here. And that’s not necessarily a bad thing. But there are a number of “wait and see” issues here… like how well the crafty state-local matching aid formula will work and to what extent the single relatively small and completely arbitrary poverty weight will actually drive additional funding to higher poverty districts.

One thing really caught my eye in Deborah Gist’s response to David Sciarra. Mr. Sciarra criticized the inclusion of New Hampshire in the calculation of the foundation aid level for the 2010-11 – adoption-year – incarnation of the nifty new bread/wheel. Here’s how Gist responds:

1. Our core instructional amount was based on national research, using data from the NCES, is sufficient to fund the requirements of the Rhode Island Basic Education Program, and it in no way focused on states with low per-pupil expenditures. In fact, we looked particularly carefully at our neighboring states, which have some of the highest per-pupil expenditures in the nation, and we included only those states that have an organizational structure and staffing patterns similar to ours.

First, I must say that it is a strange use of the term “national research” to refer to simply taking averages of state spending data collected through a national survey, conducted jointly by the National Center for Education Statistics and the Census Bureau. It’s an annual survey – a collection of data. Not national research. It could be used for research. Heck, I love those data and know them all too well. Which brings me to the Gist Twist here. And it’s a three-part twist.

You see, the goal is to identify an underlying “foundation” level of funding for school districts in Rhode Island.

Twist Part I: The first part of the twist, which I will not dig through here in great detail, is the pruning back of core instructional expenditures, a definition in the NCES data intended to be reported uniformly across states, albeit imperfect. The choice of core versus all current operating expense clearly drops the foundation value, and quite significantly. What remains unknown is the extent to which other aid beyond the foundation formula will actually address those other cost areas. In 2007-08, Rhode Island instructional spending per pupil was about $8,500 and current operating expenditures per pupil over $14,000. That’s a big difference to cover with other aid. Let’s hope they do.

Twist Part II: I was also quite intrigued by Gist’s explanation of how national data were used, and her defense to the accusation that they picked low spending states and took the average of the low spending states. Gist responds by saying they took “neighbors” of Rhode Island, which are, of course high spending states.

Here’s how the actual legislation describes the process:

(1) The core instruction amount shall be an amount equal to a statewide per pupil core instruction amount as established by the department of elementary and secondary education, derived from the average of northeast regional expenditure data for the states of Rhode Island, Massachusetts, Connecticut, and New Hampshire from the National Center for Education Statistics (NCES) that will adequately fund the student instructional needs as described in the basic education program and multiplied by the district average daily membership as defined in section 16-7-22.

http://www.ride.ri.gov/Finance/Funding/FundingFormula/Docs/H8094Aaa_FINAL_6_10_10.pdf

Even though I love maps, I won’t post one here. Maybe it’s because I used to teach in New Hampshire, and once lived in eastern Connecticut, that I realize that one of these two is actually a neighbor of Rhode Island and one is not. Okay… for those of you pulling out your maps to figure out how all of those tiny New England states line up… yeah… New Hampshire does not neighbor Rhode Island. So then, why include New Hampshire in the calculation of the average instructional expenditures used to set the Rhode Island foundation? Okay… let’s set aside the fact that this whole approach is actually not a reasonable way to identify the costs of meeting Rhode Island’s education standards, in Rhode Island districts and charter schools. But if you’re going to go down this road, the decisions should be somewhat justifiable.

Here’s the average core instructional spending per pupil for the states used:

Hmmmm… which one of these is not like the others? Yeah… New Hampshire’s per pupil spending is somewhat lower. But, it is a smaller state than the other two, and thus has a lesser effect on the averages. Oh… by the way… “similar organizational structure,” as noted by Gist above, was her/their way of cutting Vermont out of the averages – because Vermont has too many non-unified districts – or actually – because Vermont is the highest spending of these states.

Here’s the effect on the averages. Including New Hampshire brings the average down by just under $200 per pupil. While this doesn’t seem like a lot, it’s about 1/3 of the difference between Rhode Island’s current spending per pupil and the target spending. That is, including New Hampshire cuts the aggregate increases in funding (difference between RI current and Target) required by about 1/3 … but that’s before we get to Part III of the twist.

Twist Part III: As far as I can tell, the proposed foundation level for fy2010-11 (or even fy2011-12?) is to be set at $8,295. Please correct me if this is not true. That’s the amount cited here on slide #8:

http://www.ride.ri.gov/Finance/Funding/FundingFormula/Docs/Formula_PPT.pdf

And in any other documentation in which a foundation number is cited. These documents are generally from this past winter/spring leading up to passage of the legislation. So what’s wrong with that?  Well, the average spending of CT, MA and NH which comes out to about $8,295 (actually, mine comes out to $8,259) is from data from fiscal year 2006-07. Are they really basing the 2010-11 or 2011-12 foundation level on 2006-07 data?  Take a look at my second graph above. The 2007-08 data came out the other day. And, as it turns out, the 2007-08 Rhode Island average core instructional spending per pupil was over $8,500. That’s actually more than the new foundation level.

That’s not to say that it can’t be reasonable to have a foundation level that’s less than current average spending. After all, the average spending is the average of all districts, including their varied needs. It is conceivable that the current average is more than sufficient… to achieve current average performance in districts with less than average needs. But that’s not how this is being spun at all. Rather, it’s being spun as a breakthrough based on thorough and thoughtful empirical analysis.  That’s hardly the case.

Quite honestly, Ms. Gist and the RI legislature may have been better off simply saying that the foundation level will be set at $8,295 because that’s how much they are willing to pay – rather than offering this silly back-of-the-napkin justification for that amount. That in mind, this foundation formula and its arbitrary weights – excuse me – weight – actually brings us backwards, not forwards, in the school finance debate, making a mockery of “research” and its potential use for informing state school finance policy.

Sorry… got a little edgy at the end there.

And here’s a little extra credit reading which actually covers national research on estimating the cost of achieving state standards. It’s from the National Research Council of all places: http://www7.nationalacademies.org/CFE/Taylor%20Paper.pdf

Follow up note:

As the statute reads, RI itself would also be included in the average calculation, lowering the value further. It makes little sense to include current average (or even 3 year old average) spending of the state you are trying to “fix” in the average spending to inform the foundation level if the assumption is that the state has, for lack of any real formula, fallen behind in regional competitiveness. Of course, it hasn’t fallen behind New Hampshire. So… my above averages do not include Rhode Island itself and are intended only to be illustrative of the arbitrary (well… not really arbitrary… intentional) choice of including New Hampshire in the calculation.

By the way… I wonder if Deborah Gist can see New Hampshire from her window, or does Massachusetts actually get in the way?

Cartel Recap

This is an old post which I have moved forward in time on my blog because of the national release of this absurd film.

For series of posts on this topic, see: https://schoolfinance101.wordpress.com/category/the-cartel-movie-schlockumentary/

Okay… so a few people meandering through my posts over time have sought some synthesis of my gripes about Bob Bowdon’s Cartel Movie. First of all, here’s a link to a pretty good review of the film which I just found yesterday: http://www.nj.com/entertainment/tv/index.ssf/2009/10/the_cartel_movie_review_docume.html

  • The divisive, emotional and complicated debate now raging over powerful public teachers unions and “school choice” — a catchphrase that encompasses support for vouchers, charter schools and a variety of other reforms — could use a comprehensive sorting-out by a diligent observer. Bob Bowdon’s smarmy diatribe isn’t it.
  • In taking to task the sorry state of our public schools, former New Jersey TV personality Bob Bowdon employs the three R’s of bad filmmaking: righteousness, revilement and redundancy.

And these glowing reviews accept as a given Bowdon’s “statistical” argument validating the crisis of schooling in New Jersey.

Here’s my own synopsis of the arguments behind the film – the Crisis that necessitates the Solution.

The Crisis (Bowdon’s Crisis)

There’s a crisis in education in America and more specifically in New Jersey. Quite simply, every country in the world is handing us a beating as a nation and as a state, despite the massive amount of money we are throwing down the rat-hole of our public education system.

Bowdon’s evidence of a crisis:

Bowdon complains of our lagging national performance by making comparisons of international assessments such as PISA to other countries (critique of the relevance of PISA here). Here, Bowdon twists the argument to specifically blame states like New Jersey, which are not only a part of this substandard American education system, but are emblematic of it, by spending obscene amounts of money for these failures. Okay… so here’s the basic logic:

  • Our national average test scores are bad compared to other countries
  • New Jersey spends a lot on schools, and is part of this terrible national system
  • Therefore, spending is bad, our schools are terrible nationally, and New Jersey is even worse

But, as I discuss here: https://schoolfinance101.wordpress.com/2009/06/17/vacuous_bowdon/

New Jersey actually performs very well even on international comparisons, in a legitimate, rigorous statistical analysis by the American Institutes for Research (http://www.air.org/files/International_Benchmarks1.pdf) And, our national average is only as low as it is because of our many very low spending states that have chosen to throw their public education systems under the bus. Can’t blame New Jersey’s high spending for Louisiana and Mississippi’s low performance Bob. (some useful comparisons on this more recent post: https://schoolfinance101.wordpress.com/2010/02/23/common-standards-and-the-capacity-to-achieve-them/)

In an effort to further the argument that New Jersey schools in particular are an abomination, Bowdon points out how New Jersey is by a long shot (okay, I’m exaggerating his point here), first in the nation (if not the world) in spending on schools. Yet, if you correct NJ graduation rates to count only those kids who pass the NJ state tests, we’re only 24th on graduation rate. Yep, mediocre at best for all that money. Down the rat-hole. Clearly, the kids who graduate high school in all those other states, like Tennessee for example, must be able to pass the NJ state tests. Oh wait, they don’t take the NJ tests, do they? Another really dumb comparison Bob (a comparison originally generated by E3, but in the context of a broader critique of graduation rates).

This new report (as well as an older version) shows that the NJ tests aren’t really the least rigorous tests out there: http://nces.ed.gov/nationsreportcard/pubs/studies/2010456.asp. Not great. But not the worst either. Yes, if we’re going to have tests, we should expect kids to pass them. No excuses there. But the graduation rate comparison is still completely bogus. I address this topic in greater detail here: https://schoolfinance101.wordpress.com/2009/10/16/more-cartel-garbage-bowdon-still-an-idiot/. Oh, and by the way, as I point out in that same post, NJ is in good company on per pupil spending, rarely actually topping the list.

The icing on the cake is the analysis Bowdon originally presented as part of his “Facts and Figures” to support his “crisis” case . This still stands as the absolute dumbest analysis I have seen or read pretty much anywhere in my years working in education policy research (okay, this one comes close: https://schoolfinance101.wordpress.com/2009/05/13/should-think-tanks-be-licenced-to-think-and-when-should-a-license-be-revoked/) . Here, Bob Bowdon explains his brilliant revelation that states which spend more on their schools have lower SAT scores – so spending more lowers SAT scores… or at least those states that do spend more simply waste it so badly that SAT scores go down… for some reason. I tackle this outright stupidity in my first post on the topic: https://schoolfinance101.wordpress.com/2009/05/30/idiot-of-week-award-the-cartel-check-this-out/ (While Bowdon has removed much of this completely ridiculous content from the movie site, the logic of his current site content remains the same, and these absurd comments/arguments represent the level of Bowdon’s thinking at the time the movie was initially released. I saved copies of the original SAT graphs. They make great teaching examples of deeply flawed reasoning!)

The solution to the crisis that may not exist:

Okay, so if Bowdon can’t concoct his crisis, there’s really no need for a solution to it. You know, it might not actually be that hard to do a reasonable run through some real numbers to point out some serious problems, inequities and inadequacies in our education system as a nation and in New Jersey schools. They are certainly far from perfect. But, Bowdon can’t seem to string together even one set of legitimate, well argued facts to make such a case. So, I could stop here. By Bowdon’s absurd evidence, no crisis actually exists, therefore, no need for solution. But of course, Bob has one:

The only two possible solutions – Charter schools and Vouchers to private schools – with emphasis on the former. Everyone knows that money doesn’t solve education problems, Charters and Vouchers do (only if they’re well funded, though). Now, let me qualify here that I am a fan of charter schools, having been a founding member of the special interest group on charter school research of the American Education Research Association and having written research articles which find favorable results for charter schools regarding academic quality of teachers (http://epx.sagepub.com/cgi/content/abstract/20/5/752). I’m also a fan of private schools, having taught in one of NYC’s most elite independent day schools and having written on private school finances (http://www.epicpolicy.org/files/PB-Baker-PvtFinance.pdf). But sadly, my actual knowledge of Charters and Private schools makes it harder, not easier, to accept Bowdon’s poorly conceived arguments.

On Charters: Bowdon points to a few specific charter schools that are doing very well compared to other schools. Great. Some schools do better than others. I’m good with that. But, Bowdon seems to argue that because these few schools are good, all charters are good – certainly better than any traditional public school. Therefore, it is an outrage that the state of NJ won’t simply throw the doors wide open to more charters to accommodate the tear-filled rooms of parents awaiting their chance at the opportunity to send their kids to one of the many outstanding charter schools. Here’s the glitch in this logic. I explain here (link below) that the average performance of Charter schools is statistically no different from the average performance of poor traditional public schools in NJ. Yes, some are better, and many, many are much worse. The chances that a student in a charter is not in a low performing school are only marginally (very marginally) better than for students in the poorest (comparably poor) traditional public schools. While some charter school research shows strong positive results, the balance of that research shows a break-even, on average (see my post: https://schoolfinance101.wordpress.com/2009/10/20/a-few-quick-nj-charter-school-facts-figures/), and NJ charters are no different.

For updated and more extensive analysis of NJ charter schools, see: https://schoolfinance101.wordpress.com/category/new-jersey-charter-schools/

Convincing inner-city families that charter schools will save their children simply because they are charter schools, and therefore must be better than traditional public schools, is disingenuous at best. I have no problem whatsoever arguing that parents should have the option to choose a “better” school and should be provided reasonable information to aid them in choosing a legitimately better option for their children. Information is the ultimate equalizer here. Contributing to or concocting misinformation when the stakes are this high – creating a “market for lemons” by distorting information – merely to advance a political agenda and build a reputation as a supposed “documentary” film producer is morally repugnant.

Finally, on the private school voucher side of the argument: Like I said, I’m a big fan of private schools, and I’ve seen what money can buy in the best of private schools. By the way, I report here on the actual per pupil spending of private schools by the affiliation of those schools (https://schoolfinance101.wordpress.com/2009/08/18/private-school-spending/). When it comes to private schools, like charter schools or traditional public schools, you get what you pay for, and the average per pupil spending (not tuition, but actual spending) in private independent day schools in New Jersey ranged from around $25,000 to over $30,000 in 2007. Urban Catholic school per pupil spending is on par with charter spending, and only conservative religious schools spend much less. Note that Catholic schools, like charter schools, are struggling these days to operate at such low expense (around $12k per pupil). Providing vouchers at levels similar to charter funding would ensure that the only choices available to parents would be financially struggling Catholic schools or conservative religious schools. There would be no religious neutrality in the options available. Private independent schools would remain well out of reach. Double the voucher level and you might get somewhere, but demand for slots would likely far outpace supply (for a fun paper on price elasticity and private school attendance, see: http://www.nber.org/~dynarski/w15461.pdf). Under-subsidized vouchers are a cruel hoax, like distorting information on the true variance in charter school quality.
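The arithmetic here is simple enough to sketch. The snippet below is purely illustrative (it is not part of the original analysis): it uses the rough 2007 per-pupil spending figures cited above, plus an assumed placeholder figure for conservative religious schools (the post says only that they spend “much less”), to show which sectors a voucher at a given level could actually cover.

```python
# Illustrative sketch only, using approximate per-pupil SPENDING levels
# cited in the post for NJ private school sectors (2007, rounded).
spending = {
    "independent day school": 27500,        # midpoint of the ~$25k-$30k range cited
    "urban Catholic school": 12000,         # "around $12k per pupil" per the post
    "conservative religious school": 8000,  # assumed figure; post says only "much less"
}

def affordable_options(voucher: float) -> list[str]:
    """Return the sectors whose per-pupil spending the voucher would fully cover."""
    return [sector for sector, cost in spending.items() if voucher >= cost]

# A voucher pegged to charter-level funding (~$12k) reaches only the
# cash-strapped religious sectors; even doubling it falls short of what
# independent day schools actually spend per pupil.
print(affordable_options(12000))
print(affordable_options(24000))
```

Under these (assumed) numbers, a $12k voucher buys access only to the struggling religious sectors, and even a $24k voucher still does not reach independent day school spending – which is the point about under-subsidized vouchers.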

There are other potential forms of choice here, which are noticeably absent in Bowdon’s arguments (unless I’m missing something). Hey, look at my graph of school performance by DFG in my charter school post: https://schoolfinance101.com/wp-content/uploads/2009/11/updated-charter-rel-performance.jpg Wouldn’t it be nice to provide open enrollment choice options for kids from the urban core to attend the high-performing affluent suburban schools? Why should we only let them choose from the relatively average, under-resourced charter or religious private schools? Seems a little unfair, don’t you think? Seems a little disingenuous to argue that choice will solve our problems as long as we only let the poor minority children in the urban core attend start-up charter schools in church basements and other makeshift rental properties (since the slots in the elite, high-performing charters are taken) and low-tuition, low-spending, exclusively religious private schools. You wouldn’t want to include all of those higher-performing traditional public schools a few NJ Transit stops away.

Summary

So, here it is in a nutshell:

  1. Even if there is a crisis, Bowdon provides no legitimate evidence of one, and in fact provides laughable claims that make it hard to take him seriously at all;
  2. Since there is no validated crisis, there is no need for a solution, but Bowdon offers one anyway:
  • Instead of attending NJ’s dreadful traditional public schools, students should flock to NJ’s outstanding charter schools, whose average performance, it turns out, is no different from that of the poorest NJ traditional public schools, or
  • NJ children should be provided vouchers at levels that will allow them only to select from cash-strapped urban religious private schools.

Seems reasonable enough. Ill-conceived? Intellectually vacuous? Schlockumentary? I must stop myself.

As a professor of school finance who lives every day immersed in national and state databases on school funding and student outcomes, and who has advised many national organizations on the development of indicator systems for comparing schools, districts, and states, I find Bowdon’s presentation of “shocking statistics” to be, quite honestly, the most offensive, absurd, and amateurish presentation I have ever witnessed – regardless of political angle.

Cheers.

Leaders and Laggards Lags!

A quick note on the Center for American Progress’s Leaders and Laggards report.

On pages 23 & 24, this report attempts to grade state school funding systems and their level of “innovation.” But the report pays no attention to a) whether these states actually perform well on any measures of outcomes, b) whether these states actually fund their schools well overall, or c) whether these states actually target any of that funding to where it’s needed most.

Quite simply, this report is complete garbage – at least the finance section! One cannot possibly rate “innovation” of a state school funding system without any regard for whether that system is sufficiently and equitably funded. You can’t stimulate innovation without an investment in Research and Development or the product itself! It really is that simple.

The best Finance grades in the report are given to such education funding laggards as:

Yet high-performing states that actually fund their systems well and target resources where needed most (Massachusetts and New Jersey) get lousy grades. This stuff is just plain silly!

====

To lighten the mood a bit, here’s Willy Wonka summarizing the Arizona school finance formula: http://www.youtube.com/watch?v=M5QGkOGZubQ