What do our One Newark Reports tell us?

My doctoral student Mark Weber and I have just completed our second report evaluating the impact of the proposed One Newark plan on Newark schools, teachers and the students they serve. In this post, I will try to provide a condensed summary of our findings and the connections between the two reports.

In our first report, we evaluated the statistical bases for the placement of Newark district schools into various categories of consequences. Those categories are as follows:

  1. No Major Change: Neither the staff nor students will experience a major restructuring. While some schools may be resited, there will otherwise be little impact on the school. We have classified many of the schools slated for redesign in this category, as there appears to be no substantial change in the student body, the staff, or the mission of the school in NPS documents; however, we recognize that this may change as One Newark is implemented, and that some of these schools may eventually belong in different categories.
  2. Renew: As staff will have to reapply for their positions, students may see a large change in personnel. The governance of the school may change in other ways.
  3. Charter Takeover: While students are given “priority” if they choose to apply to the charter, there appears to be no guarantee they will be accepted.
  4. Close: We consider a school “closed” when it ceases to function in its current form, its building is being divested or repurposed, and it is not being taken over by a charter operator.
  5. Unknown: The “One Newark” documents published by NPS are ambiguous about the fate of the school.

We evaluated the extent to which schools, by these classifications, differed in terms of a) performance measures, b) student population characteristics and c) facilities indicators. We also tested whether these factors might be used to predict to which group schools were assigned (a sketch of this kind of model appears after the findings list below). What we found was:

  • Measures of academic performance are not significant predictors of the classifications assigned to NPS schools by the district, when controlling for student population characteristics.
  • Schools assigned the consequential classifications have substantively and statistically significantly greater shares of low income and black students.
  • Further, facilities utilization is also not a predictor of assigned classifications, though utilization rates are somewhat lower for those schools slated for charter takeover.
  • Proposed charter takeovers cannot be justified on the assumption that charters will yield better outcomes with those same children. This is because the charters in question do not currently serve similar children. Rather they serve less needy children and when adjusting school aggregate performance measures for the children they serve, they achieve no better current outcomes on average than the schools they are slated to take over.
  • Schools slated for charter takeover or closure specifically serve higher shares of black children than do schools facing no consequential classification. Schools classified under “renew” status serve higher shares of low-income children.
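
For readers interested in the mechanics, below is a minimal sketch (not our actual code) of the kind of model behind the first finding: a logistic regression asking whether a school's performance predicts assignment to a consequential classification once student demographics are controlled. The file and column names are hypothetical.

```python
# Minimal sketch, not the report's actual code; file and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

schools = pd.read_csv("nps_schools.csv")

# Outcome: 1 = consequential classification (renew, charter takeover, close),
#          0 = no major change.
model = smf.logit(
    "consequential ~ proficiency_rate + pct_free_lunch + pct_black + pct_lep",
    data=schools,
).fit()

# If proficiency_rate is not a significant predictor while pct_free_lunch and
# pct_black are, performance is not what drives the classifications.
print(model.summary())
```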

Figure 1

Figure 2

Figure 3

Figure 4

These last two figures are particularly important because they show that, after we adjust for student population characteristics, schools slated for takeover by charters in some cases actually outperform the charters assigned to take them over. While North Star Academy is a relative high performer, we are unable here to account for the fact that North Star loses about half of its students between grades 5 and 12.

These findings raise serious concerns at two levels. First, they call into question the district's own purported methodology for classifying schools. Our analyses suggest the district's classifications are arbitrary and capricious, yielding racially and economically disparate effects. Second, the choice, based on arbitrary and capricious classification, to subject disproportionate shares of low-income and minority children to substantial disruption to their schooling, shifting many to schools under private governance, may substantially alter the rights of these children, their parents and local taxpayers.

One Newark is a program that appears to place sanctions on schools – including closure, charter takeover, and “renewal” – on the basis of student test outcomes, without regard for student background. The schools under sanction may have lower proficiency rates, but they also serve more challenging student populations: students in economic disadvantage, students with special educational needs, and students who are Limited English Proficient.

There is a statistically significant difference in the student populations of schools that face One Newark sanctions and those that do not. “Renew” schools serve more free lunch-eligible students, which undoubtedly affects their proficiency rates. Schools slated for charter takeover and closure serve larger proportions of students who are black; those students and their families may have their rights abrogated if they choose to stay at a school that will now be run by a private entity.

There is a clear correlation between student characteristics and proficiency rates on state tests. When we control for student characteristics, we find that many of the schools slated for sanction under One Newark actually have higher proficiency rates than we would predict. Further, the Newark charter schools that may take over those NPS schools perform worse than predicted.
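
A minimal sketch of this kind of demographic adjustment follows: regress proficiency on student characteristics, then compare schools on their residuals (actual minus predicted proficiency). Again, this is illustrative rather than our actual code, and the file and column names are hypothetical.

```python
# Illustrative demographic adjustment; file and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

schools = pd.read_csv("newark_schools.csv")  # district and charter schools together

fit = smf.ols(
    "proficiency_rate ~ pct_free_lunch + pct_lep + pct_special_ed",
    data=schools,
).fit()

# Positive residual: the school beats the rate predicted for its population.
schools["adjusted_performance"] = fit.resid

# Compare, e.g., district schools slated for takeover against the charters
# assigned to take them over.
print(schools.groupby("one_newark_status")["adjusted_performance"].mean())
```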

There is, therefore, no empirical justification for assuming that charter takeovers will work, when after adjusting for student populations, schools to be taken over actually outperform the charters assigned to take them over. Further, these charters have no track record actually serving populations like those attending the schools identified for takeover.

Our analysis calls into question NPS’s methodology for classifying schools under One Newark. Without statistical justification that takes into account student characteristics, the school classifications appear to be arbitrary and capricious.

Further, our analyses herein find that the assumption that charter takeover can solve the ills of certain district schools is specious at best. The charters in question, including TEAM Academy, have never served populations like those in the schools slated for takeover, and have produced only comparable current outcome levels relative to the populations they actually serve.

Finally, as with other similar proposals sweeping the nation arguing to shift larger and larger shares of low income and minority children into schools under private and quasi-private governance, we have significant concerns regarding the protections of the rights of these children and taxpayers in these communities.

In our second report, we evaluated the distribution of staffing consequences from the One Newark proposal.  For us, the One Newark plan raised immediate concerns of possible racial disparity both for students and their teachers. As such, we decided to evaluate the racially disparate impact of the plan on teachers, in relation to the students they serve and also to explore the parallels between the One Newark proposal and past practices which disadvantaged minority teachers. In our second report, we found:

  • There is a historical context of racial discrimination against black teachers in the United States, and “choice” systems of education have previously been found to disproportionately affect the employment of these teachers. One Newark appears to continue this tradition.
  • There are significant differences in race, gender, and experience in the characteristics of NPS staff and the staff of Newark’s charter schools.
  • NPS’s black teachers are far more likely to teach black students; consequently, these black teachers are more likely to face an employment consequence as black students are more likely to attend schools sanctioned under One Newark.
  • Black and Hispanic teachers are more likely to teach at schools targeted by NJDOE for interventions – the “tougher” school assignments.
  • The schools NPS’s black and Hispanic teachers are assigned to lag behind white teachers’ schools in proficiency measures on average; however, these schools show more comparable results in “growth,” the state’s preferred measure for school and teacher accountability.
  • Because the demographics of teachers in Newark’s charter sector differ from NPS teacher demographics, turning over schools to charter management operators may result in an overall Newark teacher corps that is more white and less experienced.

Figure 5

This figure is the real kicker here. It is based on two separate logistic regression models characterizing a) the likelihood that an NPS teacher is in a school which faces consequences and b) the likelihood that a teacher is a charter school teacher. That is, we estimate the odds, by race, experience and other factors, that a teacher is in a school where they are likely to face job consequences, and the odds that a teacher works in the favored subset of charter schools (a sketch of these models appears after the list below). We find that:

  • NPS teachers who face employment consequences as a function of One Newark are 2.11 times as likely to be black as to be white, and 1.766 times as likely to be Hispanic as white.
  • By contrast, charter school teachers in Newark who are not only protected by the plan, but given the opportunity in some cases to take over the schools and thus the jobs of those NPS teachers, are only 74% as likely to be black as to be white, 47% as likely to be Hispanic as white, and 3.6 times more likely to be Asian than white.
  • Both charter teachers and NPS teachers facing employment consequences tend to be female.
  • NPS teachers who face employment consequences as a function of One Newark are about 50% more likely to have 10 to 14 years of experience compared to their peers with 0 to 4 years, and 37% more likely to have 15 to 19 years of experience compared to their peers with 0 to 4 years.
  • Charter teachers, who again may be given the opportunity to take over the schools of these NPS teachers, are highly unlikely to have more than 0 to 4 years of experience. Charter teachers are more than 3 times as likely to have 0 to 4 years of experience as 6 to 9 years, 10 times as likely as 10 to 14 years, 20 times as likely as 15 to 19 years, and nearly 100 times as likely as more than 19 years of experience.
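
As promised above, here is a minimal sketch of the two logit models. It is illustrative only, with hypothetical file and column names, but exponentiating the fitted coefficients is exactly how odds ratios like those in the list are obtained.

```python
# Illustrative teacher-level models; file and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

teachers = pd.read_csv("newark_teachers.csv")

# Model (a): odds that an NPS teacher is in a school facing consequences.
m_a = smf.logit(
    "faces_consequence ~ C(race, Treatment('white'))"
    " + C(exp_band, Treatment('0-4')) + female",
    data=teachers[teachers.sector == "NPS"],
).fit()

# Model (b): odds that a Newark teacher is a charter school teacher.
m_b = smf.logit(
    "is_charter ~ C(race, Treatment('white'))"
    " + C(exp_band, Treatment('0-4')) + female",
    data=teachers,
).fit()

# Exponentiated coefficients are odds ratios, e.g. ~2.11 for black (vs. white)
# teachers in model (a).
print(np.exp(m_a.params))
print(np.exp(m_b.params))
```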

The overall effect of One Newark on the total Newark teaching corps will likely be to make it more white and less experienced than it is currently.

We find patterns of racial bias in the consequences to staff similar to those we found in the consequences to students, largely because the racial profiles of students and staff within the NPS schools are correlated. In other words: Newark’s black teachers tend to teach the district’s black students; therefore, because One Newark disproportionately affects those black students, black teachers are more likely to face an employment consequence.

NPS’s black teachers are also more likely to have positions in the schools that are designated by the state as needing interventions – the more challenging school assignments. The schools of NPS black teachers consequently lag in proficiency rates, but not in student growth. We do not know the dynamics that lead to more black teachers being assigned to these schools; qualitative research on this question is likely needed to understand this phenomenon.

One Newark will turn management of more NPS schools over to charter management organizations. In our previous brief, we questioned the logic of this strategy, as these CMOs currently run schools that do not teach students with similar characteristics to NPS’s neighborhood schools. Evidence suggests these charters would not achieve any better outcomes with this different student population.

This brief adds a new consideration to the shift from traditional public schools to charters: if the CMOs maintain their current teaching corps’ profile in an expansion, Newark’s teachers are likely to become more white and less experienced overall. Given the importance of teacher experience, particularly in the first few years of work, Newark’s students would likely face a decline in teacher quality as more students enroll in charters.

The potential change in the racial composition of the Newark teaching corps under One Newark – to a staff that has a smaller proportion of teachers of color – would occur within a historical context of established patterns of discrimination against black teachers. “Choice” plans in education have previously been found to disproportionately impact the employment of black teachers; One Newark continues in this tradition. NPS may be vulnerable to a disparate impact legal challenge on the grounds that black teachers will disproportionately face employment consequences under a plan that arbitrarily targets their schools.

And now to editorialize in no uncertain terms…

One Newark is an ill-conceived plan. It is simply wrong, statistically, conceptually and quite possibly legally. That it can be so wrong on so many levels displays an astounding combination of ignorance and arrogance among its designers, promoters and supporters.

First, the justifications for school closures are, well, unjustified. The data said to support the plan simply don’t. Even if closing schools based on poor performance could be justified, the data do not indicate a valid, performance-based reason for the selections. This is either a sign of gross statistical incompetence on the part of district (and by extension, state) officials or evidence that they have made their decisions on some other basis entirely.

Second, the fact that in many cases lower performing charters are slated to take over higher performing district schools (when accounting for students served) is utterly ridiculous. Again, this is either evidence of gross statistical malfeasance and complete ignorance on the part of district officials, or a sign that their choices are based on something else entirely. Certainly it is a clever strategy for making charters look good to assign them to take over schools that already outperform them. But I suspect I’m giving district officials too much credit if I assume this to be their rationale.

Third and finally, if I heard someone suggest 10 years ago [in a time when data free ideological punditry was at least somewhat more moderated and history marginally more understood and respected], that we should start reforming Newark or any racially mixed urban district by closing the black schools, firing the black teachers, selling their buildings and turning over their management to private companies (that may ignore many student and employee rights), I’d have thought they were either kidding or members of some crazy extremist organization [Note that this plan is substantively different in many ways from the Philadelphia privatization plan that was adopted over ten years ago, where private companies held contracts with the district, and thus remained under district governance].

It would be one thing if there were valid facilities utilization, safety or health concerns and other legitimate reorganization considerations that just so happened to affect a larger share of black than other students and teachers. It is difficult if not impossible to protect against any and all racially disparate effects, even when making well-reasoned, empirically justifiable policy decisions.

But this proposed plan, as shown in our analyses, is based on nothing, nor has there been any real, thoughtful or statistically reasonable attempt to justify that it is actually based on something. No legitimate data analyses have been provided to support the plan (much like the flimsy parallel proposal in Kansas City).

It is truly a sad commentary on the state of the education reform conversation that we would even entertain the One Newark proposal, and even more so that we would entertain such a proposal with no valid justification and an increasing body of evidence to the contrary.



The False Markets of Market Based Reforms

I’ve not spent a great deal of time talking about “corporate reform,” “privatization” or “market based reform,” mainly because I find these labels unproductive and oversimplified. Most of what occurs in our public and private sectors (itself a simplification), and most provision of public and private goods and services, is far more nuanced – well beyond simple classification. Further, as I have noted on a few occasions, the rhetoric which emanates from one side or the other of these ideological debates is often entirely inconsistent. See, for example, my explanation that strategies being promoted for public schools as derivative of successful private sector industry are anything but. That which is pitched as “corporate reform” is little more than failed private sector management.

A similar hypocrisy has been nagging at me lately, and I’ve touched on it, in part, on a few previous posts (here’s one). That is, there are sweeping claims that many of the policies being advanced these days are about capitalizing on the virtues of free markets – that we are trying to use the rationality that emerges from less regulated private sector markets to achieve more efficient production of high quality schools.

Enter charter schools and alternative pathways to the teaching profession.

Two interconnected strategies that are often discussed under this umbrella are a) the expansion of charter schooling and b) reduction of barriers to entry to the teaching profession through programs like Teach for America.

The argument is that by providing a fair public subsidy to charter schools (since their competition is subsidized and already established, giving it an unfair head start), we can induce competition, improving both charter schools (via attrition of the weak) and the public schools with whom they compete for students. Indeed, there is some empirical evidence to support this claim.

On the teacher labor market issue, the argument for reducing barriers to entry is that the market can decide whether it prefers the traditionally prepared teachers, or the alternative, creating competitive pressures for traditional preparation programs to improve and for alternative pathways to produce qualified candidates… those who can compete on a level, but less regulated playing field.

Casual, anecdotal evidence that both of these strategies are “working” to introduce positive market based effects includes the long waiting lists – that is, high consumer demand – for charter schools in major urban centers. Others point to the fact that candidates from programs like Teach for America tend to get hired, and quickly, suggesting high demand for their skills on the open market for teachers.

Ten to fifteen years ago, either of these arguments might have been supportable – back when charter schools were still mainly upstarts with a few emerging networks, and back when alternative temporary teacher pipelines, while seeking exclusive arrangements with districts and some charters, still supplied a balance of the two. But that was then and this is now.

Charter waiting lists in the current era are as likely to be policy-induced – by deprivation and mass closure of true public options, where those closures are often based on bogus metrics used for declaring district schools “failure factories.” Worse, these declarations of failure and disruptive intervention, and the bogus metrics upon which they are based, are now codified in state policies promulgated in response to federal pressures (close the worst 5% you must, and tweak your accountability measures you may). Today’s charter waiting lists are at least as much a function of induced under-supply of public options as of local community demand for more charter options, if not far more so.

Surely, if the government forcibly shut down Amtrak, choosing some bogus measure to declare it an abject failure, there would be increased “demand” for other means of transportation along the Northeast Corridor; and if the government forcibly shut the U.S. Postal Service, other package delivery services would see a spike in their business. But I find it doubtful anyone would suggest that this spike resulted from a true market-driven preference for their products or services. Charter waiting lists in the wake of forced shutdowns of district schools based on low average test scores, or even biased growth metrics, are no different.

Which brings me to my second point. I’ve not written much on this blog about TFA, nor have I had cause to. But the recent back and forth speculation on the potential role of TFA in Newark under the proposed reorganization and layoff plan (connected, or not?) led to some back and forth between local writer Bob Braun and TFA leadership, which brought out one blogger’s attempt to find a middle ground in the debate. This blogger, responding to the Twitter trending of #ResistTFA, brought up an argument on behalf of TFA that I’ve not heard in a while: that education schools should learn from the market based successes of TFA, specifically “why do principals and schools still line up to hire TFA corps members when they have the chance?” I must admit, I’ve been complicit in making similar arguments in my own research in the past (here & here).

The implication in this blog post is that TFA has, by virtue of producing high quality candidates, outpaced the competition (traditional preparation programs) on the free market – the open labor market for teachers. This would be all well and good if the speedy placement of TFA candidates had anything to do with an open competitive labor market, but it doesn’t.  And I’m not entirely sure I fault them for that. I only fault their advocates for not acknowledging that.  And, I would suggest that many traditional ed schools also operate in relatively close relationships with local public school districts.

My problem with the current false market scenario regarding TFA is its intersection with the false market for charter schools. Just as charter school expansion – demand – has become heavily dependent on manipulation of markets by policy makers, TFA expansion – demand – has become dependent on those major charter network operators who are dependent on charter market manipulation (forced closure of district schools).[1]

Put simply, this is not market based reform, nor should anyone pretend that market mechanisms (rather than policy preferences and market manipulation) are driving any of this.

Yes, it is reasonable that we might experiment with public subsidies to private providers, be it through direct private management under district contract, or via upstarts like charters (by their original intent). And yes, it is reasonable to test out alternative pathways to teaching.  But when we start forcibly shuttering the public system, under the facade of federally promulgated state policies, and replacing the only true public option with private providers who then establish exclusive arrangements for alternatively prepared short-term staff, we’ve gone too far.

When we start claiming that these shifts are happening due to free market forces and public demand, well… then we’re just full of crap.

It’s time to put a stop to this and rethink where we’re headed before even more damage is done!


[1] It may be that the long term financial viability of major charter networks depends on both incredibly high employee churn and placement of alums in future positions of political power (to continue rigging markets in favor of the institutions that hire them). Churn is required because, as I’ve explained in previous posts, many well-established charter operators actually pay pretty good salaries over the first several years, outpacing local district schools by 20 to 30%, while also offering small class sizes. It is hard to conceive of how these schools would balance their operating budgets were they to retain these teachers much longer than the usual 2 to 5 years. Further, charter advocates in NYC like to point to the exorbitant retirement costs of the city district as one reason why charter spending is actually lower (even though it’s not) than district schools. I’ve noted several times that it is rather absurd to compare a fully matured district with thousands of retirees to an institution less than 10 years old that likely has no retirees as of yet. Presumably they would, eventually, unless of course they can churn, churn, churn.  TFA helps them accomplish this goal.

Clarifications on Charter Rent & Equitable Portfolio Management

Diane Ravitch today titled her post on my recent Think Tank Review to suggest that I agree with Mayor Bill de Blasio that some charter schools can afford to pay rent.

This statement, while true as worded, misses the bigger point – how to create a system of equitable opportunities for New York City school children, regardless of whether that system includes charter schools [bearing in mind the important legal inequities that result from including charters in this mix].

While it is true that some charters can certainly afford to pay rent, this is all part of the bigger problem of the uneven sorting of children and resources that has occurred with charter expansion in NYC. It’s all part of a bigger systemwide equity issue which I write about in a forthcoming article in Ed Finance and Policy. I will post the conclusions of that article on my blog.

My Think Tank Review piece specifically challenges the flimsy claims by the Manhattan Institute that the only way to provide good schools in NY is to provide unfairly disproportionate subsidies to charter operators. The MI brief falsely and disturbingly assumes that deprivation of the public system to the benefit of the charter system can only yield good results. Charters good. District Schools Bad. Ugh…

Where I believe de Blasio has it correct is in his assertion that it is more important to consider the fiscal health of the broader system – that which serves the majority of the kids… and does so under a governance umbrella that affords those children the full array of statutory and constitutional rights of children in truly public schools.

As I explain in the Think Tank Review:

the central problem with the report is that the author assumes that there exists no possible downside when resources are transferred from city schools to charter schools. The assumption is that providing these subsidies benefits charters and harms no one and that not providing these subsidies harms charters and benefits no one. The policy brief entirely ignores the broader and more complex policy questions of what it takes to manage a balanced and equitable portfolio of schooling options.

In my forthcoming article coauthored with Ken Libby and Katy Wiley of U. of Colorado, we illustrate how charter expansion in New York City has played a counterbalancing role against the city’s efforts to improve equity across schools. As I explain in the Think Tank Review, NYC in the 2000s also adopted a fair student funding formula, the goal of which was to ensure that schools serving needier children were able to access more adequate resources. But expansion of charters has led to the following:

In the case of New York City, charter schools, particularly charter schools serving fewer high-need students, receive resources that are provided by foundations and philanthropies. This negatively impacts resource equity. The equity promise of weighted student funding formulas – which is to direct more resources to high-need students – cannot be fulfilled if outside funds are not adequately accounted for.

Perhaps the most troubling pattern in New York City is that charter schools serve fewer high-need students compared to district schools. This, in effect, concentrates higher need students within traditional public schools. While charter schools in New York City only serve 4% of students, there has been a steady push to expand the charter sector. If charter schools continue to serve fewer high-need students this expansion could lead to a more inequitable distribution of both children and resources in the city. Charters will need to educate more high-need students with existing budgets, or raise additional funds to serve more high-need students.
A notable factor in the inequitable situation in New York City is the additional support provided to many charters through the use of school district facilities. For charter schools that are already advantaged in terms of both access to philanthropy and serving less-needy students, access to facilities space in the high-rent marketplace of New York City provides further financial advantage, both over competing district schools and over less well-off charter schools. While the New York City Independent Budget Office has highlighted these concerns in recent reports, their findings seem to have fallen on deaf ears in terms of influencing policy moving forward. Most recently, the IBO has indicated that the predictability of school site budgets for district schools has begun to improve, such that by 2012, need factors in the city’s weighted student funding formula were significant predictors of school site budget variation (IBO, April 2013). However, this study tests whether school site budgets are more predictably a function of formula weights rather than focusing on exogenous cost factors. Further, the study ignores the position of charter schools in the mix.

(Baker, Libby and Wiley, in press)

We spent much time grappling with the policy implications of this finding. These inequities are problematic on many levels and should be addressed, but it remains difficult to determine just how. We explain:

The use of weighted student funding formulas to enhance equity, coupled with charter school expansion, presents a number of significant challenges to achieving more equitable resource distribution within school districts. This is particularly the case if charter schools are funded outside of district weighted student funding formulas, and further exacerbated if charters do not serve a comparable number of low-income students, English Language Learners, and special education students. Additional outside support from private donors and foundations further contribute to within-district inequities.

One might make the case that disparities in resources above and beyond a minimally adequate floor are non-offensive or cause no harm. For example, applying Baker and Green’s (2008) pure adequacy conception to the district and charter portfolio as a whole, it may be the case that the net increase to overall resources, leading to the provision of quality teachers, smaller classes and longer school years resulting in improved outcomes for a subset of the student population increases the aggregate welfare of children.

However, it is also important to consider that the provision of more adequate resources to some children may actually diminish the value of resources received by others, even if it does not lead to a change in dollars available to others. Baker and Green (2008) as well as Koski and Reich (2006) explain that to a large extent education operates as a positional good, whereby the advantages obtained by some necessarily translate to disadvantages for others. For example, Baker and Green (2008) explain that “In a system where children are guaranteed only minimally adequate K–12 education, but where many receive far superior opportunities, those with only minimally adequate education will have limited opportunities in higher education or the workplace.” (p. 210) This concern is particularly pronounced in a city like New York where children and families are constantly jockeying for position to gain access to selective admissions public middle and secondary schools, and where the majority of charter schools serve elementary and middle grades. The competitive position of children in otherwise similar district or charter schools with fewer resources is compromised by the presence of better resourced district or charter schools. Though surely, all would be less well off if all were substantially though equally deprived.

To add further complexity, about 16% of students in grades 1 to 12 in New York City are enrolled in private schools, a much larger share than presently attend charter schools.[1] It is possible that the presence of more elite and better resourced charter or district schools may encourage more families to stay within the publicly subsidized system and that those students may in turn compete more favorably with the many children attending the city’s private schools. Baker (2009b) found that private independent day schools in the city spent in 2007, on average, nearly twice the level of district schools, which is also much more than the highest spending charter networks. Private independent schools make up about 28% of the city’s private school enrollment, or only slightly less than the share served by charter schools during the period under investigation herein.[2]

Finally, the ability of well-funded charter school operators to leverage their additional resources to provide more competitive wages (at given experience and degree level) and more desirable working conditions may put both district schools and less well funded charter operators at a disadvantage regarding recruitment and retention of teachers on the local labor market.[3] That is, unless high resource charter operators are primarily drawing new talent to the city schools that might otherwise choose either to work in another location or other profession altogether, in which case the presence of these schools yields a net gain in teaching quality. The most likely case is some balance of talent influx and internal sorting.

Rather than tackle resource equity, the combination of charter school expansion and weighted student funding formulas encourages a conception of equal opportunity largely based on expanded school choice. This conception of equal opportunity focuses primarily on offering to children and families a portfolio of school options, including charter schools, and attempts to adjust for differences in individual student needs by attaching more dollars to higher-need students. In this context, weighted student funding formulas are used in part to provide an economic incentive for schools to serve high-need students as well as an acknowledgement that some students require additional resources to meet the same level of achievement. Equity, in this scenario, is based primarily on equalizing the freedom to choose from various schools. If schools of choice are equitably resourced to accommodate student needs, equity can be achieved through such a model. But that does not appear to be the case in either Houston or New York City.

In large part, the portfolio approach, without sufficient consideration of resource equity, substitutes preferences for individual liberty (or choice) in place of preferences for equity. This approach is problematic in that it conflates liberty with equity, assuming the former necessarily leads to the latter, regardless of resource distribution. This is simply untrue. This conception fails to acknowledge these two core values often operate in tension with one another, with individual choices collectively leading to substantial inequities. Access to high-resource charter schools serving low-need populations is unevenly distributed across children and families citywide, with most if not all high resource charter schools significantly oversubscribed.

As discussed previously, according to Independent Budget Office reports, the City of New York has, on average, provided relatively equitable public subsidy rates for charter schools. But equity of subsidy rate alone does not ensure equality of educational opportunity, especially where private contributions make up such a large share of total revenue. When evaluating equity achieved by state school aid formulas, it is well understood and broadly accepted that equal state revenue per pupil for local public school districts does not necessarily result in equal educational opportunities, or even equal per pupil expenditures distributed across children (Baker and Green, 2008, Berne  and Stiefel, 1984). But unlike state aid formula adjustments for local fiscal capacity, cities have limited ability to exert direct control over, or account for access to outside resources. The same is true of private giving to local public school districts (Brent, 2002). Solutions to these resource inequities are not readily apparent. While state school aid formulas are means tested (equalized) for local revenue raising capacity, it is difficult to conceive how city school systems could appropriately measure and account for differential access to philanthropy without creating either disincentives to raise private dollars or incentives to conceal private fundraising and assets.  But the fact that these emergent inequities are difficult if not implausible to manage directly does not negate their existence. The reality is that a select subset of children, through luck of the lottery, are provided access to schools with far more resources than others. A better alternative may be to establish mechanisms and create incentives to encourage more equitable distribution of giving across both charter and district schools.

Herein lies the most marked challenge to equity: charter schools that serve lower-need students while also accessing funding support above and beyond what is available to traditional district schools. While segregating high-need students in traditional district schools, this process ensures that opportunities for a well-funded education are contingent upon access, generally through lotteries, to philanthropic giving. KIPP Houston, for instance, received $65 million in pledges from a variety of foundations (Mathews, 2007). Charter schools in NYC also receive substantial contributions for operations and expansion (Hass, 2009). Nation-wide, between 1999 and 2010, charter schools operated by charter management organizations received approximately half a billion dollars in additional resources (Lake et. al, 2010).

The presence of charter schools that are able to secure additional funding while serving fewer high-need students also undermines common claims that charters “do more with less.” It is important to challenge this misconception not because there is something inherently wrong with “doing more with less,” but rather that many charter schools are not actually doing more with less. Recognizing the benefits to a student’s education brought about by extra funding resources may actually strengthen arguments for improving school funding more broadly. While it may be a disappointment to those eager to prove that schools can be pushed to “do more with less,” recognizing that they in fact are not doing more with less, or at the very least serve lower-need student populations than district schools, could bring clarity to this highly politicized issue.

(Baker, Libby and Wiley, in press)

Is the answer as simple as charging rent to charters identified as fiscally advantaged? I don’t think so, but that’s not off the table. The greatest difficulty with this approach is determining the right metric to decide who should pay rent and how much, and doing so in a way that doesn’t simply encourage deceptive accounting practices.
In my view, the answer lies in developing a better overall system for determining the operating subsidy for charter schools, consistent with and integrated with the citywide public school funding model, given the students they serve, and working with authorizers and the state to establish appropriate methods for capital financing.


[1] 2012 American Community Survey 1-Year Estimates (C14002), Universe: Population 3 years and over. School Enrollment by Level of School by Type of School for the Population 3 Years and Over.

[2] National Center for Education Statistics, Private School Universe Survey 2009-10. As defined by membership in the National Association of Independent Schools, other regional Independent School Associations, or National Independent Private School Association. NAIS Schools alone served nearly 12,500 (among those reporting) in 2009-2010.

[3] See appendix D for salary and class size comparisons for select NYC charter networks with district schools.

Rightsize this! When simple, ignorant solutions & simulations just don’t cut it

Recently, the Thomas B. Fordham Institute released a report by AIR researcher Michael Hansen on “rightsizing” the classroom. Hansen based his analysis on data from the state of North Carolina, using distributions of teacher value-added scores and class sizes to derive conclusions about how “great” teachers could be given larger classes, thus reducing the number of students exposed to “bad” teachers and leading to overall benefits in terms of student outcomes. This dreadfully oversimplified, a-contextual (even taken out of the constraints of its actual context) extrapolation has since made the rounds across reformy outlets.

The solution to all of our woes is simple and elegant. Just follow these steps.

  • Step 1: Identify “really great” teachers (using your best VAM or SGP) who happen to be currently teaching inefficiently small classes of 14 to 17 students.
  • Step 2: Re-assign to those “really great” teachers another 12 or so students, because whatever losses might occur in relation to increased class size, the benefits of the “really great” teacher will far outweigh those losses.
  • Step 3: Enter underpants Gnomes.
  • Step 4. Test Score Awesomeness!

The research assumption based on the North Carolina data is that the negative effects of increased class size are small, especially for 8th graders.

For most students above the third grade, the evidence points to at most a small class-size effect, if any at all. (Using the North Carolina data, I likewise estimate small class-size effects in fifth and eighth grades.) Thus in effect, it would take an increase of at least ten to twenty additional students in a good teacher’s class to dilute his productivity to that of an average teacher. Put another way, assigning a few extra students to the class of an effective teacher can translate to big gains for these students, while making only very small reductions in that teacher’s performance for everyone else in the class. (page 9)

Further, that the impact of “great teachers” is far more important. Thus, as the report puts it:

Intensively reallocating eighth-grade students—so that the most effective teachers have up to twelve more pupils than the average classroom—may produce gains equivalent to adding roughly two-and-a-half extra weeks of school (see figure ES-1).
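
To make the report’s logic concrete, here is a toy version of the trade-off being computed – emphatically not Hansen’s actual model; the effect sizes below are invented for illustration.

```python
# Toy version of the "rightsizing" trade-off; effect sizes are invented.
TEACHER_EFFECT_GAP = 0.20   # assumed gain (in SD of achievement) from moving a
                            # student from an average teacher to a "great" one
CLASS_SIZE_PENALTY = 0.005  # assumed loss (in SD) per additional student,
                            # suffered by everyone in the receiving class

def net_effect(students_moved: int, base_class_size: int) -> float:
    """Net achievement change (SD units, summed across students) from moving
    students_moved pupils into a great teacher's class."""
    gain = students_moved * TEACHER_EFFECT_GAP
    loss = (base_class_size + students_moved) * students_moved * CLASS_SIZE_PENALTY
    return gain - loss

print(net_effect(12, 16))  # small class: the trade looks great (+0.72)
print(net_effect(12, 32))  # typical NYC class: the trade turns negative (-0.24)
```

Everything rides on the two assumed constants – which is precisely the problem.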

While this is a fun/playful thought exercise… simulation, etc… much like the Chetty study extrapolation of the great teacher increasing a classroom full of 3rd graders’ lifetime incomes by $250k, this simulation ignores so many layers of reality that it’s just mind boggling.

While it certainly makes sense that we’d want to be able to assign more students to our “best” teachers (heck, why would we want to have anything but good teachers on our staff?), the practical constraints to implementing this elegant and oh-so-obvious solution are many:

  1. Successful implementation requires that, within our school, we can actually identify with some consistency those teachers who are measurably more effective (and that we have some of each…?).
  2. It requires that those “great” teachers have small enough class sizes for us to add those students without significant consequence, and within room size/space constraints.
  3. It requires that their effectiveness is not particularly sensitive to the size of the classes they’ve been teaching (which likely varies across teachers).
  4. And it requires that adding 12 students to each of 5 or 6 sections of a teacher’s daily workload will not have some cumulative negative effect (on grading, on the quality of feedback they provide, on retention of “great” teachers). That’s 60 to 72 more students. Individual classes are not the only relevant unit of analysis here! Total workload matters. At even 10 minutes of grading per week for each student, we’ve added 10+ hours of weekly work.

What’s been fun to follow about this report is the assertion that it is somehow broadly applicable to any/all policy settings.

Let’s consider above constraints in the context of New York City.

First, can we figure out who those “great” teachers in 7th/8th grade are – even in math, where value-added scores tend to be more stable, and even in the school with the largest number of such teachers in the value-added data released a few years back?

Here are the 8th grade math teachers and 7th grade math teachers with their year-over-year value-added percentiles. For example, we see that in 2008-09 Dorothy is below average (left of the vertical line) but in 2009-10, Dorothy is above average. The same is true for Natalie. Donna is above average both years, and two (overlapping) teachers are below average both years. Looking back an additional year, we have only one carry-over teacher, Dorothy, who is above average again.

Donna does show up for 7th grade (below), and is above average there as well, but only average back in 2005-06. Otherwise, a) we don’t have that many teachers who even persist in the school from year to year, and b) those who do have percentile ranks that jump all over the place.

So, it’s not really so easy to find those persistently excellent teachers.

And then what of that class size issue? Do we really think that Donna is going to have an inefficiently small class into which we can shove 12 more students?

The likelihood of that occurring in New York City is not great. Here are school average class sizes in 2010, 2011 and 2012.

Here’s a statewide look at the percent of classes already over certain thresholds.


In higher poverty settings, most 8th grade class sizes already exceed 23 students and most in New York City far exceed that. It would be utterly foolish to extrapolate the assertion of minimal downside to increasing an NYC 8th grade math class from, say, 32 up to 44 students (if the room could even hold them).

One might assert that affluent suburban Westchester and Long Island districts with much smaller average class sizes should give more serious consideration to this proposal – that is, if they are a) willing to accept the assertion that they have both “bad” and “good” teachers, and b) confident that parents in their districts are really willing to permit such experimentation with their children. I remain unconvinced.

As for leading private independent schools which continue to use small class size as a major selling point (& differentiator from public districts), I’m currently pondering the construction of the double-decker Harkness table, to accommodate 12 students sitting on the backs of 12 others.  This will be a disruptive innovation like no other!


Friday Finance 101: NY State’s Formula for Failure

Below is an excerpt from a recent series of policy briefs on NY State school funding

Statewide Policy Brief with NYC Supplement: BBaker.NYPolicyBrief_NYC

50 Biggest Funding Gaps Supplement: 50 Biggest Aid Gaps 2013-14_15_FINAL

Note: The above briefs received financial support from the New York State Association for Small City School Districts. All opinions are my own.

The 2007 New York State Foundation Aid formula was adopted specifically to achieve compliance with the high court’s 2006 order in the Campaign for Fiscal Equity case. The State argued that this new formula was built on sound empirical analysis of the spending behavior of efficient districts that achieved adequate outcomes on State assessments. The State argued that the Foundation Aid formula applied this evidence, coupled with additional evidence-based adjustments to address student needs and regional cost variation, in order to identify a specific target level of per pupil spending for each district statewide which would provide comparable opportunities to achieve adequate educational outcomes. The State determined the share of that target spending to be raised through local tax revenues and estimated the amount to be paid by the State toward achieving each district’s sound basic spending target.

Then, the State simply failed to fund the formula.

When enacted, the State committed to phasing in the Foundation Aid formula from 2007 to 2010-11. The data behind the base spending calculation had been drawn from 2003-2005, and included general education instructional spending of school districts that a) achieved 80% proficiency rates on state assessments, and b) were in the lower-spending half of the districts that achieved desired outcomes. The formula for transitioning these figures into spending targets involves a combination of an inflation adjustment and a phase-in percentage, to bring the dated estimates up to date and to project the annual increases for hitting the adequate spending target in future years – four years out in the case of the original proposed remedy.

The current Foundation Aid formula may be described as follows.

District Foundation Aid per Pupil = [Foundation Amount X Pupil Need Index X Regional Cost Index] – Expected Minimum Local Contribution

Under this formula, the State determines the need- and cost-adjusted target spending for each district by taking the foundation funding level and multiplying it by the pupil need index (PNI) and then by the regional cost index (RCI). This approach is reasonable only to the extent that the target level of funding generated for each district by the formula represents what the State determines is necessary for districts to provide a meaningful high school education, the constitutional standard established in the CFE rulings.

In 2012-13, the inflation-adjusted foundation level of funding [for aid calculation purposes] was set to $6,580[1], a value which on its face is far lower than existing spending levels in nearly every New York State public school district or charter school. The pupil need index combines measures of poverty (U.S. Census poverty and free or reduced-price lunch rates), shares of children with limited English language proficiency, and district population sparsity. Finally, the regional cost index is intended to recognize “regional variations in purchasing power around the State, based on wages of non-school professionals.”

Once a district’s target level of funding is calculated, the State then determines the share of that target that will be paid for by the local district and the share that will be picked up by the State through Foundation Aid. The State share of aid, or total Foundation Aid is determined as follows:

Total Foundation Aid = Selected Foundation Aid X Selected Total Aidable Foundation Pupil Units (TAFPU). Selected Foundation Aid is the district’s Foundation Aid per pupil, but no less than $500. [2]

It is important to note that, under this formula, the State provides every district a minimum of at least $500 per pupil in Foundation Aid, without regard to whether the district has the ability to raise local revenue to meet or exceed its spending target on its own, without State aid. In this calculation, Total Aidable Foundation Pupil Units (TAFPU) include additional weighted adjustments for children with disabilities (not addressed in the PNI), pupils in summer school, and half- versus full-day kindergarten.
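
The whole calculation can be summarized in a few lines of code. This is a sketch of the formula exactly as laid out above; the index values in the example are illustrative, while the $6,580 base and the $500 floor come from the formula itself.

```python
# Sketch of the Foundation Aid calculation as described above.
def foundation_aid_per_pupil(base: float, pni: float, rci: float,
                             expected_local_contribution: float) -> float:
    """[Foundation Amount x PNI x RCI] - expected minimum local contribution,
    with the statutory floor of $500 per pupil."""
    target = base * pni * rci
    return max(target - expected_local_contribution, 500.0)

def total_foundation_aid(selected_aid_per_pupil: float, tafpu: float) -> float:
    """Selected Foundation Aid x Selected TAFPU."""
    return selected_aid_per_pupil * tafpu

# Illustrative high-need district: PNI 1.8, RCI 1.2, modest local contribution.
per_pupil = foundation_aid_per_pupil(6580, 1.8, 1.2, 2500)
print(per_pupil)                               # 11712.8
print(total_foundation_aid(per_pupil, 10000))  # 117128000.0
```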

The following table lists those districts with the largest per pupil gaps in State Foundation Aid in 2013-14. In other words, these are the districts with the largest differences between the Foundation Aid the districts should have received had the State actually funded the Foundation Aid formula and the actual Foundation Aid the districts receive after the State’s aid freeze and cuts are applied. Detailed documentation of the calculations in the table is presented in the appendix.

Top 50 2013-14 Foundation Aid Shortfalls


[1] Shortfall per DCAADM = (Foundation Aid before Phase In – Foundation After GEA) /  DCAADM
[2] Shortfall Percent = (Foundation Aid before Phase In – Foundation After GEA) /  Foundation Aid before Phase In
[3] NYSED FARU District Fiscal Profiles (http://www.oms.nysed.gov/faru/Profiles/profiles_cover.html) 2010-11
[4] File DBSAD1 W(FA0001) 00 FOUNDATION AID BEFORE PHASE-IN   03/26/13
[5] (Foundation Aid [DBSAA1, 03/26/13, E(FA0197) 00 2013-14 FOUNDATION AID] + GEA [AA(FA0186) 00 2012-13 GAP ELIMINATION ADJUSTMENT (SA1213)] + GEA Partial Restoration [AB(FA0187) 00 2013-14 GEA RESTORATION])
[6] File DBSAD1 M(OP0088) 00 SELECTED TAFPU 03/26/13
[7] File DBSAD1 P(OP0002) 02 ADJUSTED FOUNDATION AMT/PUPIL  03/26/13

 

Governor’s 2014-15 Budget Shortfalls

On January 17, 2014, district-by-district data became available for Governor Cuomo’s budget proposal for the 2014-15 school year. As discussed in the appendix, the adequacy target funding per pupil – the “Adjusted Foundation Aid per TAFPU” [Total Aidable Foundation Pupil Units] – is arrived at by taking a base funding figure times the pupil need index (PNI) times the regional cost index (RCI). That base funding figure is intended to be based on average spending of the lower half of local public school districts meeting prescribed outcome standards, as discussed in the policy brief released concurrent with these analyses. Inexplicably, the state has chosen over the past few years to lower that base funding amount, despite increasing outcome standards.

Using the state’s own spreadsheets for aid allotments, one can back these figures out of the state aid worksheets by taking each district’s “Adjusted Foundation per Pupil” divided by its PNI and RCI (a sketch of this back-calculation follows below). For 2012-13, that figure rounds to $6,580 for each district. The 2013-14 aid worksheets yield a foundation level of only $6,515 – a cut to the foundation level of $65. Backing this figure out of the 2014-15 budget proposal yields $6,458, another cut to the base funding level. This means that the gaps in funding for the past few years are further understated in these tables. Yet these gaps are still huge. The table below summarizes those gaps for the 50 districts with the largest per pupil funding gaps.
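
The back-calculation itself is simple division; the sketch below uses an illustrative per-pupil figure and index values consistent with the rounded $6,580 result.

```python
# Backing the base funding level out of the aid worksheets, as described above.
def implied_base(adjusted_foundation_per_pupil: float, pni: float, rci: float) -> float:
    return adjusted_foundation_per_pupil / (pni * rci)

# Illustrative inputs; for 2012-13 this rounds to $6,580 for every district,
# versus $6,515 in the 2013-14 worksheets and $6,458 in the 2014-15 proposal.
print(round(implied_base(14213.0, 1.8, 1.2)))  # 6580
```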

Comparing the Governor’s budget for 2014-15 to prior year gaps provides an appearance that the Governor’s budget helps in closing the gaps substantially for some districts like Utica. This is a false impression however, created by lowering the adequacy target to offset the increase in pupil needs.

Top 50 2014-15 Budgeted Shortfalls

[data run as of 1/17/14]


What about New York City?

The following table shows the current year and the Governor’s budgeted shortfalls for New York City:

These shortfalls remain over $2,500 per pupil, for a total approaching $3 billion.

Cutting Basic Funding while Increasing Standards

Put simply, higher student outcome standards cost more to achieve, not less. As explained above, the New York State school finance formula is built on an underlying basic cost estimate of what it would take for a low need (no additional student needs) district to achieve adequate educational outcomes as measured on state assessments. The current formula is built on average spending estimates dating back several years now and based on prior outcome standards, tied to a goal of achieving 80% proficient or higher. More than once in the past several years, the state has substantively increased the measured outcome standards.

For 2010, the Regents adjusted the assessment cut scores to address the inflation issue, and as one might expect, proficiency rates adjusted accordingly. The following figure shows the rates of children scoring at level 3 or 4 in 2009 and again in 2010. I have selected a few key, rounded points for comparison. Districts where 95% of children were proficient or higher in 2009 had approximately 80% in 2010. Districts that had 80% in 2009 had approximately 50% in 2010. This means that the operational standard of adequacy using 2009 data was equivalent to 50% of children scoring level 3 or 4 in 2010. This also means that if we accept as reasonable a standard of 80% at level 3 or 4 in 2010, that was equivalent to 95% – not 80% – in 2009.


This next figure shows the resulting shift of the change in assessments from 2012 to 2013, also for 8th grade math. Again, I’ve applied ballpark cutpoint comparisons.  Here, a school where 60% were proficient in 2012 was likely to have 20% proficient in 2013. A school where 90% were proficient in 2012 was likely to have 50% proficient in 2013.   If, as state policymakers argue, the 2013 assessments do more accurately represent the standard for college readiness, and thus the constitutional standard of meaningful high school education, it is quite likely that the cost of achieving that constitutional standard is much higher than previously estimated. Notably, only a handful of schools surpass the 80% threshold on math proficiency for the 2013 assessments.

Slide3

While it appears that the state has been chipping away at funding gaps for districts including New York City, it has done so not by substantively increasing funding, but by decreasing the adequacy funding target. This figure shows that the underlying basic cost figure for the foundation aid formula climbed gradually, as planned, through 2012-13. Note that this climb was based on the assumed 80% success rate on the 2007-08 outcome standard, not considering the 2009-10 adjustment to that standard. But inexplicably, the state has chosen to reduce the basic funding figure each year since, despite raising the outcome standards dramatically.

Slide1

Even worse, as explained above, the state continues to underfund the foundation aid formula by about one-third. That is, even after lowering its target funding level, the state continues to fall over 30% short of that target. The primary reason the extent of underfunding has declined is that the state has lowered the target.

Raising outcome standards while cutting funding is a formula for failure.

Appendix

The current “adequacy” target (according to the foundation aid formula) is the fully phased in adequacy target per (selected) aidable pupil unit, or, as laid out above:

PNI x RCI x Base = State Prescribed Adequacy Target[3]

This formula adequacy target represents what the state itself adopted as the quantification of its own constitutional obligation to provide for a sound basic education. Later in this brief, I challenge the validity of this target, but for purposes of this section, it is appropriate to consider this figure as the state’s own definition of its constitutional obligation.

The state aid per pupil (TAFPU) to reach that state prescribed adequacy target is then:

Adj. Foundation per Pupil – Local Contribution per Pupil = State Share per Pupil

And the total state aid to be received, if the formula was both fully phased in and fully funded is:

State Share per Pupil x TAFPU = Foundation Aid [before phase in]

Where “phase in” refers to the fact that the foundation formula is intended to scale toward full adequacy funding over a three-year period (originally four years, reaching the target in 2011). Phase in, as referred to in this case, is a reduction to the target funding, representing the progress toward fully phased in funding to be made in the coming year. In the following analyses, and as represented above, I compare current funding against foundation aid before this reduction (phase in) is applied.

Thus, the extent of underfunding is:

State Aid to Reach Adequacy Target – Actual Foundation Formula Funding (after all adjustments) = Underfunding

The underfunding of the foundation formula results from two specific calculations. First, instead of actually basing foundation aid on the above calculations – that is, on the actual formula – aid is simply frozen[4] (or marginally, proportionately increased) relative to the prior year’s total (not per pupil) aid. Then, in a two-step calculation, aid is reduced using the Gap Elimination Adjustment and partially restored for most districts.[5]

For example, for the city of Utica:

$12,046 (Foundation Aid per TAFPU) x 11,832 (TAFPU) = $133,950,644 (Foundation Aid, before phase in)

But, as shown in the following table, estimated actual (frozen) foundation aid is:

Estimate for 2013-14 = $72,413,005

So the preliminary foundation aid funding gap for Utica is:

$133,950,644 (Foundation Aid, before phase in) – $72,413,005 (Aid Based on Prior Year) = $61,537,659 (Preliminary Aid Gap)

But this is the gap before applying the Gap Elimination Adjustment. The deceptively named Gap Elimination Adjustment (or GEA) is really just a cut to state aid which, on average, falls more heavily on districts more dependent on state aid – that is, on higher need districts.

The real gap for Utica is, therefore, as follows:

$72,413,005 (Aid Based on Prior Year) – $2,843,829 (GEA) = $69,569,176 (Actual Aid)

So:

$133,950,644 (Foundation Aid, before phase in) – $69,569,176 (Actual Aid) = $64,381,488 (Actual Gap)
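The whole chain of calculations is short enough to sketch in a few lines. The inputs are the Utica figures quoted in this appendix; small differences from the published gap figures reflect rounding in the underlying per pupil worksheet amounts.

```python
# Sketch of the Utica arithmetic above, using the worksheet figures quoted
# in this appendix. Small differences from the published gap figures
# reflect rounding in the underlying per pupil amounts.
full_foundation_aid = 133_950_644  # foundation aid before phase in
frozen_aid = 72_413_005            # estimated actual (frozen) aid, 2013-14
gea = 2_843_829                    # Gap Elimination Adjustment (net of restoration)

preliminary_gap = full_foundation_aid - frozen_aid
actual_aid = frozen_aid - gea
actual_gap = full_foundation_aid - actual_aid

print(f"Preliminary aid gap: ${preliminary_gap:,}")
print(f"Actual aid:          ${actual_aid:,}")
print(f"Actual gap:          ${actual_gap:,}")
print(f"Share of formula aid received: {actual_aid / full_foundation_aid:.0%}")
```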

In the table, we see that Utica actually receives only about half of the total state aid it should receive if the formula were funded. Other small city districts face similar shortfalls.

The table also provides a per pupil calculation of the degree of state aid underfunding across the Small City districts and New York City. I calculate the foundation aid gap per Duplicated Combined Adjusted Average Daily Membership, or DCAADM,[6] which is the district enrollment figure commonly used in the state fiscal profile files for calculating per pupil amounts.


[1] See: http://www.oms.nysed.gov/faru/PDFDocuments/Primer12-13A.pdf.

“The Foundation Amount is the cost of providing general education services. It is measured by determining instructional costs of districts that are performing well. It is adjusted annually to reflect the percentage increase in the consumer price index. For 2007-08 aid, it is $5,258. It is further adjusted by the phase-in foundation percent. For 2009-10, the adjusted amount is: $5,410 x 1.038 (CPI) x 1.025 (phase-in), or $5,756. For 2010-11, the adjusted amount is: $5,708 x 0.996 x 1.078, or $6,122. For 2011-12, the adjusted amount is: $5,685 x 1.016 x 1.1314, or $6,535. For 2012-13, the adjusted amount is: $5,776 x 1.032 x 1.1038, or $6,580.”

In this case, the matching 2012-13 figure is arrived at by taking P(OP0002) 02 ADJUSTED FOUNDATION AMT/PUPIL for each district and dividing by PNI [O(PC0409) 05 PNI = 1 + EN%, MIN 1; MAX 2] and then RCI [N(MI0123) 03 REGIONAL COST INDEX (RCI)], from File DBSAD1, 3-29-12. Prior years also match. Interestingly, however, the 2013-14 aid worksheets yield a foundation level of only $6,515, a cut to the foundation level of $65.

[3] DBSAD1, 3-29-12, P(OP0002) 02 ADJUSTED FOUNDATION AMT/PUPIL

[4] DBSAA1, 3-29-12, E(FA0197) 00 2012-13 FOUNDATION AID

[5] DBSAA1, 3-29-12, GEA [AA(FL0026) 00 2012-13 GAP ELIM ADJUST ON BT1213] + GEA Partial Restoration [AB(FL0027) 00 2012-13 GAP ELIMINATION ADJMT RESTORATION]

[6] Duplicated CAADM. This item (Duplicated Combined Adjusted Average Daily Membership or DCAADM) is the pupil count used to calculate per pupil amounts for the revenue items and expenditure categories. The pupil count is based on data from State aid worksheets and Basic Educational Data System forms. This pupil count is the best count of the number of students receiving their educational program at district expense. DCAADM includes the average daily membership (ADM) of students enrolled in district programs (including half-day kindergarten pupils weighted at 0.5); plus equivalent secondary attendance of students under 21 years of age who are not on a regular day school register plus pupils with disabilities attending Boards of Cooperative Educational Services (BOCES) full time plus pupils with disabilities in approved private school programs including State schools at Rome and Batavia plus resident students for whom the district pays tuition to another school district plus incarcerated youth. Beginning with the 1999-2000 school year, pupils resident to the district but attending a charter school are included. Beginning with the 2007-08 school year, students attending full-day Pre-K are weighted at 1.0, 1/2 day Pre-K weighted at 0.5. Since residents attending other districts were also included in the CAADM count of the receiving district, this pupil count is a duplicated count. The State total consists of the sum of the rounded pupil counts of each school district. Data Source: State Aid Suspense File. See: http://www.oms.nysed.gov/faru/Profiles/18th/revisedAppendix.html

The Average of Noise is not Signal, It’s Junk! More on NJ SGPs

I explained in my previous post that New Jersey’s school aggregate growth percentile measures are as correlated with things they shouldn’t be (average performance level and low income concentrations) as they are with themselves over time.  That is, while they seem relatively stable – correlation around .60 – it would appear that much of that correlation simply reflects the average composition and prior scores of the students in the schools.

In other words, New Jersey’s SGPs are stably biased – or consistently wrong!

But even the consistency of these measures is giving some school officials reason to pause and ask just how useful these measures are for evaluating their students’ progress or their school as a whole.

There are, for example, a good number of schools that would appear to jump a significant number of percentile points from year 1 to year 2. Here is a scatterplot of the schools moving from over the 60th percentile to under the 40th percentile, and from under the 40th to over the 60th.

Slide1

That’s right, West Cape May elementary… you rock… this year at least. Last year, well, you were less than mediocre. You are the new turnaround experts. Good thing we didn’t use last year’s SGP to shut you down! Either that, or these data have some real issues – in addition to the fact that much of the correlation that does exist is simply a reflection of persistent conditions in these schools.

So how is it, then, that even with such persistent bias caused by external factors, we can see schools move so far in the distribution?

I don’t have the raw data to test this particular assumption (nor will I likely ever see it), but I suspect these shifts result from a little discussed, but massive persistent problem in all such SGP and VAM models.

I call it spinning variance where there’s little or none… and more specifically… creating a ruse of “meaningful variance” from a narrow band of noise.

What the heck do I mean by that?

Well, these SGP estimates, which range from the 0th to the 100th percentile, start with classrooms full of kids taking 50-item tests.

The raw numbers correct on those tests are then stretched into scale scores with a mean of 200, using an S-shaped conversion. At the higher and lower ends of the distribution, one or two questions can shift scale scores by 20 or more points. Stretch 1!

While individual kids’ scores might spread out quite widely, differences in classroom averages or schoolwide averages vary much less.

Differences in “growth” (really not growth, but rather estimated differences in year over year test scores) vary even less – often trivially, and quite noisily. But these relatively trivial differences must still be spread out into 0 to 100 percentile ranks! Stretch 2!

I suspect that the differences in actual additional items answered correctly by the median student in the 60th percentile school are trivial when compared with additional items answered correctly by the median student in the 40th percentile school.

But alas, we must rank, as reformy logic and punitive statistical illiteracy dictate, and fractions of individual multiple choice test items will dictate that rank under these methods.

Much of this narrow band of variance is simply noise (after sorting out the bias), and thus the rankings based on spreading out that noise are completely freakin’ meaningless. These problems are equally bad, if not worse [due to smaller sample sizes], when the measures are used for rating teachers.
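To see just how little it takes to produce such leaps, consider a toy simulation – emphatically not NJDOE’s actual SGP model – in which every school has an identical true effect and the only thing being ranked is noise.

```python
import numpy as np

# Toy simulation: 500 schools with identical true effects, whose "growth"
# measures differ only by a narrow band of noise, force-ranked into 0-100
# percentiles each year ("Stretch 2!").
rng = np.random.default_rng(0)
n_schools = 500
year1 = rng.normal(0, 0.05, n_schools)  # pure noise around a common mean
year2 = rng.normal(0, 0.05, n_schools)

def to_percentiles(x):
    # Rank the values, then stretch the ranks across 0-100.
    return 100.0 * x.argsort().argsort() / (len(x) - 1)

p1, p2 = to_percentiles(year1), to_percentiles(year2)

jumpers = np.sum((p1 > 60) & (p2 < 40)) + np.sum((p1 < 40) & (p2 > 60))
print(f"Schools leaping across the 40th/60th lines: {jumpers}")
print(f"Year over year correlation of ranks: {np.corrcoef(p1, p2)[0, 1]:.2f}")
```

Even though every simulated school is identical, scores of them vault across the 40th/60th percentile lines, and the year over year correlation of the ranks sits near zero. Real SGPs layer bias on top of this noise, which is why averaging them recovers the bias, not school quality.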

Now for a few more figures. Above, I show how even when we retain the bias in the SGPs, we’ve got schools that jump 20 percentile points in one direction or the other over a single year! Do not cry, Belleville PS10. Alas, it is not your fault.

Here’s where these same schools lie, in year 1, with respect to poverty.

Slide2

And here’s where they lie in year 2 with respect to poverty.

Slide3

Of course, they have switched positions, merely by the way I’ve defined them. West Cape May is now awesome… and still low poverty, while the previous year they were low poverty but stunk! We’ve got some big movers at the other end too. Newark Educators Charter also made the big leap to awesomeness, while U. Heights falls from grace.

Now, the reformy, statistically illiterate response to this instability is to take the mean of year 1 and year 2 and call it stable… and more representative… because by doing so we can pull this group to the middle.

Let me put this really bluntly… the average of noise is not signal.

Slide4

The position of these schools at the edges of the patterned scatter is the noise.

Now averaging the patterns does give us stronger signal – signal that the growth percentiles are painfully, offensively biased with respect to poverty (the “persistent effect” is one of “persistent poverty”). But averaging the outer bands of this distribution to pull them to the center does not by any stretch make the ratings for these schools more meaningful.

As a fun additional exercise, I’ve used schoolwide proficiency rates and low income concentrations to generate predicted values of year 1 and year 2 growth percentiles, and then taken the differences from those predicted values (in standard deviations) and used them as “adjusted” growth percentiles. That is, how much higher or lower is a school’s growth percentile than predicted, given only these two external factors?

In this graph, I identify those schools that jumped from over one full standard deviation above to one full standard deviation below their expected level, and vice versa. I’ve used schools that had both 4th and 7th grade proficiency data and free lunch data, reducing my sample. I’ve also used a pretty wide range for identifying performance changes, so I actually have fewer “outlier” schools.

Slide5

The fun part here is that these aren’t even the same schools that were identified as the big jumpers before correcting for average performance level and % free lunch. No overlap at all.
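For anyone who wants to replicate the adjustment exercise, here is a minimal sketch. The data below are fabricated placeholders just to make the snippet run; the actual exercise used NJ school-level SGPs, proficiency rates and free lunch shares.

```python
import numpy as np

# Sketch of the adjustment: regress each year's school growth percentile
# on proficiency and % free lunch, then treat standardized residuals as
# "adjusted" growth percentiles. Placeholder data only.
def adjusted_sgp(sgp, proficiency, pct_free_lunch):
    X = np.column_stack([np.ones_like(sgp), proficiency, pct_free_lunch])
    beta, *_ = np.linalg.lstsq(X, sgp, rcond=None)
    residuals = sgp - X @ beta
    return residuals / residuals.std()  # distance from expectation, in SDs

rng = np.random.default_rng(1)
prof = rng.uniform(20, 95, 300)    # placeholder proficiency rates
flunch = rng.uniform(0, 90, 300)   # placeholder % free lunch
sgp_y1 = 35 + 0.2 * prof - 0.1 * flunch + rng.normal(0, 8, 300)

adj_y1 = adjusted_sgp(sgp_y1, prof, flunch)
print("Schools more than 1 SD above expectation:", int(np.sum(adj_y1 > 1)))
```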

So, just how useful are the growth percentile data for even making reasonable judgments about schools’ influence on their students outcomes?

Well, as noted in my prior post, the persistent bias is so overwhelming as to call into serious question whether they have any valid use at all.

And the noise surrounding that bias appears to effectively undermine any remaining usefulness one might try to pry from these measures.

One more time on the video explanation of this stuff!

An Update on New Jersey’s SGPs: Year 2 – Still not valid!

I have spent much time criticizing New Jersey’s Student Growth Percentile measures over the past few years, both conceptually and statistically. So why stop now?

We have been told over and over again by the Commissioner and his minions that New Jersey’s SGPs take student backgrounds fully into account, by conditioning on each student’s initial score and comparing students against others with similar starting points. I have explained over and over again that just because individual students’ growth percentiles are estimated relative to others with similar starting points, it by no means follows that classroom or school median growth percentiles are a non-biased measure of teacher or school quality.

The assumption is conceptually wrong and it is statistically false! New Jersey’s growth percentile measures are NOT a valid indicator of school or teacher quality [or even school or teacher effect on student test score change from time 1 to time 2], plain and simple. Adding a second year of data to the mix reinforces my previous conclusions.

Now that we have a second year of publicly available school aggregate growth percentile measures, we can ask a few very simple questions. Specifically, we can ask how stable, or how well correlated, those school-level SGPs are from one year to the next, across the same schools.

I’ve explained previously, however, that stability of these measures over time may actually reflect more bad than good. It may simply be that the SGPs stay relatively stable from one year to the next because they are picking up factors such as the persistent influence of child poverty, the effects of being clustered with higher or lower performing classmates/schoolmates, or underlying test scales that simply allow either higher or lower performing students to achieve greater gains.

That is, SGPs might be stable merely because of stable bias! If that is indeed the case, it would be particularly foolish to base significant policy determinations on these measures.

Let’s clarify this using the research terms “reliability” and “validity.”

  • Validity means that a measure captures what it is intended to capture – in this case, the influence of schools and teachers on changes in student test scores over time – and not simply something else. Validity is presumed good, but only to the extent that those choosing what to measure are making good choices. One might, for example, choose to measure, and fully succeed in measuring, something totally useless (one can debate the value of measuring differences over time in reading and math scores as representative, more broadly, of teacher or school quality).
  • Reliability means that a measure is consistent over time, presumed to mean that it is consistently capturing something over time. Too many casual readers of research and users of these terms assume reliability is inherently good – that a reliable measure is always a good measure. That is not the case if the measure is reliable simply because it is consistently measuring the wrong thing. A measure can quite easily be reliably invalid.

So, let’s ask ourselves a few really simple empirical questions using last year’s and this year’s SGP data, and a few other easily accessible measures like average proficiency rates and school rates of children qualified for free lunch (low income).

  • How stable are NJ’s school level SGPs from year 1 to year 2?
  • If they are stable, or reasonably correlated, might it be because they are correlated to other stuff?
    • Average prior performance levels?
    • School level student population characteristics?

If we were seeking a non-biased and stable measure of school or teacher effectiveness, we would expect to find a high correlation from one year to the next on the SGPs, coupled with low correlations between those SGPs and other measures like prior average performance or low income concentrations.

By contrast, if we find relatively high year over year correlation for our SGPs, but also find that the SGPs, averaged over the years, are correlated with other stuff (average performance levels and low income concentrations), then it becomes far more likely that the stability we are seeing is “bad” stability (false signal, or bias) rather than “good” stability (true signal of teacher or school quality).

That is, we are consistently mis-classifying schools (and by extension their teachers) as good or bad, simply because of the children they serve!
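Mechanically, the check is nothing more than a correlation matrix. Here is a hedged sketch with fabricated placeholder data, simulated so that the only “stability” in the SGPs comes from school poverty.

```python
import numpy as np

# Stability diagnostic sketch. Placeholder data: the simulated SGPs are
# "stably biased" -- their year-to-year stability is driven entirely by
# poverty, not by any real school effect.
rng = np.random.default_rng(2)
n = 300
poverty = rng.uniform(0, 0.9, n)                      # % free lunch
prior_prof = 90 - 50 * poverty + rng.normal(0, 8, n)  # prior proficiency
sgp_y1 = 60 - 25 * poverty + rng.normal(0, 7, n)
sgp_y2 = 60 - 25 * poverty + rng.normal(0, 7, n)

m = np.corrcoef([sgp_y1, sgp_y2, prior_prof, poverty])
print(f"SGP yr1 vs SGP yr2:      {m[0, 1]: .2f}")
print(f"SGP yr1 vs proficiency:  {m[0, 2]: .2f}")
print(f"SGP yr1 vs % free lunch: {m[0, 3]: .2f}")
# When the last two rival the first, the "stability" is stable bias.
```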

Well then, here’s the correlation matrix (scatterplots below):

Slide1

The bottom line is that New Jersey’s language arts SGPs are:

  • Nearly as strongly (when averaged over two years) correlated with concentrations of low income children as they are with themselves over time!
  • As strongly (when averaged over two years) correlated with prior average performance as they are with themselves over time!

Patterns are similar for math. Year over year correlations for math (.61) are somewhat stronger than correlations between math SGPs and performance levels (.45 to .53) or low income concentration (-.38). But correlations with performance levels and low income concentrations remain unacceptably high – signaling substantial bias.

The alternative explanation is to buy into the party line that what we are really seeing here is the distribution of teaching talent across New Jersey schools. Lower poverty schools simply have the better teachers. And thus, those teachers must have been produced by the better colleges/universities.

Therefore, we should build all future policies around these ever-so-logical, unquestionably valid findings: the teachers in high poverty schools, whose children had initially lower performance and thus systematically lower SGPs, must be fired and a new batch brought in to replace them. Heck, if the new batch of teachers is even average (like teachers in schools of average poverty and average prior scores), then they can lift the SGPs and average scores of high poverty, below average schools toward the average.

At the same time, we must track down the colleges of education responsible for producing those teachers in high poverty schools who failed their students so miserably and we must impose strict sanctions on those colleges.

That’ll work, right? No perverse incentives here? Especially since we are so confident in the validity of these measures?

Nothing can go wrong with this plan, right?

A vote of no confidence is long overdue here!

Slide2

Slide3

Slide4

Slide5

Slide6

Slide7

Slide8

Slide9

On Inefficiencies and Value Added In Private Schools: A follow up on the Chubb research summit

Private Schooling & the Public Interest

Bruce D. Baker, Rutgers

A few weeks back I posted a rather harsh critique of a summit convened by NAIS President John Chubb, which he described as a gathering of leading researchers intended to generate ideas on the future of private independent schooling. Among other things, I critiqued the chosen researchers’ balance of ideology, their knowledge of private independent schools and, in some cases, the generally lacking substance of their body of work on educational productivity.

John Chubb, as he has been known to do, graciously responded to my critique, pointing out that he would soon blog about the conversations that emerged among these researchers.

Below are two examples from Chubb’s recent blog posting, which I view as entirely consistent with my original concerns. Mainly, that the researchers gathered have a weak understanding of private independent schooling, of how private independent school leaders view their market, and of the broader perception of how private independent…


Come with me… if you wanna go to Kansas City? Thoughts on BBQ, Baseball and Reformy BS

Urban school districts are easy targets – often the whipping boy – exemplars of the failures of big government bureaucracy. Kansas City, Missouri is a frequent target when it comes to education policy. But as I’ve discussed in more than one peer reviewed article (one, another), and in other reports, tales of Kansas City’s failures are largely urban legend.

This past week, the good citizens of Kansas City and the Missouri Department of Elementary and Secondary Education were graced with one of the most vacuous manifestos on education reform I’ve read in a really long time. Yes, on my blog, I’ve pontificated about numerous other vacuous manifestos, which often take the form of blog posts and op-eds and which I suspect have little substantive influence over actual policies.

But this one is a little different. This report by an organization calling itself CEE, or Cities for Education Entrepreneurship Trust, in collaboration with Public Impact, is a bit more serious. No more credible, but more serious, in that it is assumed that state policymakers in Missouri might actually act on the report’s recommendations.

I’ve had the displeasure of reviewing several reports by Public Impact in the past. Their standard fare is to establish a bold conclusion, and then cite (including self-citation) materials that support – with no real validation – their foregone conclusion, cite other stuff that’s totally unrelated, and cite yet other stuff that doesn’t even exist. Thus, they are actually able to construct a report with a few graphs here and there and lots of footnotes, without ever validating a single major (albeit foregone) conclusion (see, for example, this one, by the same author under a different organizational umbrella, or this one).

This report starts with the foregone conclusion (drawn from the oft-misguided and always ill-informed rhetoric of Andy Smarick) that:

“Simply put, the traditional urban school system does not work. It is not stable. It does not serve the needs of its students. It does not, nor has it ever, produced the kind of results all children, families, and taxpayers deserve. And it does not create the conditions that research shows enables great urban schools to thrive. It is time to think outside the box and have a robust community conversation about how to build a new and different school system that is structured for success.” (p. 7)

With this hypothesis – actually, foregone conclusion – firmly established, the authors need merely connect the dots back to the woes of Kansas City and how to fix them. Here’s a synopsis – call it an advance organizer – of the story line crafted in the report:

  • Urban districts don’t work (and aren’t stable)
  • Kansas City is an urban district, therefore, it doesn’t work (even though we find it has stabilized)
  • Privately operated charter schools in Newark, New Jersey, New York City, Texas and New Orleans are producing miracles – yielding incredible graduation rates and high test scores while serving comparably low income and otherwise needy children (even though they really aren’t serving similar kids, and many have far more resources)
  • Thus, the same can – no, must – work in Kansas City (even though it hasn’t)
  • Somewhat tangentially, decentralized financing – driving money to schools for site based control – is necessarily good (even though reviews of the research suggest otherwise)

Therefore, the only solution is to deconstruct the entire failed urban district, turn control over to a non-government authority which shall loosely govern a confederation of private non-profit entities that shall compete with one another for students, choose which market niche and geographic space within KC they wish to serve and be evaluated on the test scores and graduation rates they ultimately produce.

Are you following? If not, let’s take a stroll through some of the “facts” provided to support their end-game, along with some of the actual facts about the Kansas City Missouri Public School District.

Justification for Intervention?

The authors’ primary justification for the bold transformation of Kansas City Public Schools is that they have low average test scores. And everyone knows that’s bad and can’t be tolerated, whatever the root causes.

Specifically, the evidence they provide is that:

  • 70 percent of KCPS students are below proficient in math and English Language Arts (ELA).
  • ELA proficiency rates have declined in some recent years, despite improved management and operations.
  • Very, very few students graduating from KCPS are ready for college based on their ACT scores.
  • While science and social studies scores have improved this past year, proficiency rates are still below 30 percent.
  • And average KCPS student achievement growth is lower than state predictions based on similar districts’ results, meaning that KCPS students could fall further behind their peers over time.

While some argue that the system has been stabilized after years of dysfunction, one must ask: what good is stability if most students still cannot read, write, or do math proficiently, or graduate from high school ready for college or careers? (p. 7)

Okay, but really, how does that stack up against expectations? Not that we should succumb to low expectations. But certainly, any credible report summarizing student outcomes in a major urban district should summarize some of the background and context for these figures.  But alas, not this one!

Well, let’s take a look. First, Kansas Citians know that their fine city and their fine school district aren’t by any stretch one and the same. Perhaps that right there is an issue to explore. KCPS, formerly KCMSD, was crafted through a massive boundary gerrymandering effort in the immediate post-Brown era. Portions of the city limits were consumed by the reorganization and mergers of predominantly white neighborhoods and “suburbs” (which are really now all part of the city) at the time. In many areas, less poor, whiter (though increasingly poor and minority) sections of the city still remain in other school districts to the south and east. The poorest areas of the city, where blacks were relegated to live for decades, were included in KCMSD, along with the western edge of Independence, Missouri, which remained the most “integrated” portion of the city district until the past decade (when clever legislators passed a law allowing that section to vote itself out of KCMSD and into Independence). I bring this all up because KCMSD itself was gerrymandered from the start as a district for poor minority neighborhoods in the city, and because that gerrymandering persisted as recently as 2007-08!

The district really didn’t have much of a chance. Concurrent trends led to additional pressures. Charter schools began popping up in the late 1990s and grew throughout the 2000s. Figure 1 below shows total enrollment for schools within city limits, non-charter enrollments, KCMSD enrollments and charter enrollments, from 1999 to 2011. A really important point here is that KCMSD’s share of enrollment within the city limits was relatively small to begin with, because of the way the city was carved up in the post-Brown period – a carving exacerbated in 2008. And enrollments have been on a slow, steady decline in the past decade. Charter enrollments have climbed, and while they represent a significant share of KCMSD’s geographic space, they are a much smaller share of the city limits as a whole.

Figure 1

Slide1

Source: NCES Common Core of Data, Public School Universe Survey [error in 2005 data]

Figure 2 shows the shares of low income children (% qualified for free lunch, or below 130% of the income level for poverty) by group. Notably, shares for KCMSD and for charters within KCMSD are much higher than for other schools in those carved-out, formerly suburban spaces within city limits (this includes Center, Hickman, a portion of Lee’s Summit, etc.).

Figure 2

Slide2

Source: NCES Common Core of Data, Public School Universe Survey [error in 2005 data]

Much has been made of the desegregation litigation that, as the story goes, made Kansas City the highest spending school district in the world… for decades on end… all for naught. Figure 3 walks through the relative state and local revenues of KCMSD compared with the average for its surrounding labor market from 1993 to 2011. Funding really started scaling up around 1988, toward a peak around 1993. But after the U.S. Supreme Court indicated in the 1990s that the remedies then in place went a bit too far (the remedies tried to attract suburban residents into the city’s magnet schools, because the judge really had no other way to achieve integration), the relative funding for KCMSD schools fell precipitously over time (actually, what happened is that it stagnated, and others caught up).

For nearly a decade now, KCMSD state and local revenue per pupil has been only marginally above the average for the labor market.

Figure 3

Slide5

Source: U.S. Census Fiscal Survey of Local Governments (F33)

But as Figure 4 shows, the average poverty rate of children in the district, compared to its surroundings, is anything but average. KCMSD’s student population has remained 2x to nearly 3x as poor as surrounding areas – even Wyandotte! One certainly can’t expect to achieve stellar outcomes with a population this needy and only relatively average resource levels to serve them (yes, money matters, and even more so for needy kids!).

Figure 4

Slide6

Source: U.S. Census Small Area Income and Poverty Estimates

So… to summarize… what we have here is not a simple case of inexcusable bad test scores that simply have to be “fixed” by dismantling the district and replacing it with a miraculous new structure – without changing any of the underlying causes or conditions.

What we have here is a complex, long running case, of disadvantageous housing development, boundary gerrymandering, high poverty and declining resources.

For any report on the future of KCMSD schools to miss all that is completely inexcusable. It’s downright ridiculous, amateur, sloppy and unprofessional.

Justification for Using Chartering as Replacement?

Given that the report’s authors have missed entirely most of the relevant context and history of Kansas City schools, how then do they arrive at their proposed solution – to replace the “failed urban district” with a loosely governed confederation of benevolent non-profit providers?

The answers, of course, can be found in the many miracle charter schools that grace great American cities like Newark, New Jersey (hey… Newark and KC have a lot in common), New York City and New Orleans.

Among their chosen miracles, the authors point to the Uncommon Schools network as proving that one can simply put a non-profit manager in charge and whamo…. kablam! You’ve got transformation of student outcomes! The authors explain:

Across the schools, the average student population is 98% black or Hispanic, and 78% receives free or reduced-price lunch. Uncommon Schools was awarded the 2013 Broad Prize for Public Charter Schools for demonstrating the most outstanding overall improvement in the nation for low-income students and students of color.25 Uncommon Schools closed 56% of achievement gaps between its low-income schools and the state’s non-low-income students.26

And they even provide a nifty graph showing that Uncommon Schools in Newark (uh… that’s just North Star Academy) not only beats the citywide average, but also beats the state wide average on performance measures.

Figure 5:

Slide7

This is so laughable it hurts. Really.

What they totally neglect to point out is that:

To summarize, North Star’s overall performance is mediocre at best (given their attrition, lack of special needs students, etc.) and deeply disturbing at worst, when one looks beyond average test scores among those who stay. Choosing North Star as a model of beating the odds, and representing the school as in this report, is either just plain ignorant or outright reckless.

Now, on the one hand, they simply might never have looked at any actual numbers on North Star. But that would be equally irresponsible. The choice to use North Star as proof of the value of chartering – as it applies to the current proposal for Kansas City – is bafflingly ignorant.

The report provides similarly crude information on New York City charter schools, including reference to New York City’s own Uncommon Schools. The reality is that New York City charters, like North Star in Newark, are anything but miraculous. They are heavily privately subsidized schools, serving low need student populations, providing them smaller classes and well paid teachers, and yielding less than astounding results.

New Orleans in particular might best be described not as some positive shining star miracle brought on by Hurricane Katrina, but rather as an unmitigated disaster of education policy. This is perhaps best documented in the work of Kristen Buras in the Harvard Educational Review. I have written about the spotty/questionable performance of New Orleans charter schools here. See also this critique of attempts at selling the supposed NOLA miracle.

What about Existing Chartering in Kansas City?

What I find most interesting about the proposals in the CEE report is that they justify the shift to a 100% non-profit, loosely coupled charter confederation based on the supposed (albeit completely unfounded) great successes achieved by charters in New Jersey, New York and New Orleans. But if full-scale charterization is going to be the savior of Kansas City, then why hasn’t it saved the city already? Why are Kansas City’s own charter school results so lukewarm at best? And why haven’t Kansas City charter operators stepped up to fill the void of serving those children most in need, in the city’s poorest neighborhoods?

The report uses the following deceptively simple figure of average performance:

Figure 6.

Slide8

But what does KC charter performance look like in context? Here are a few figures focused on lower grade (up to 8th) schools in Kansas City, including charter, magnet and regular district schools (though some are special emphasis schools). Figure 7 shows the MAP Index for schools by percent free lunch. The average for charters is slightly higher (as in the figure above), but charter performance, like district school performance, varies, with the charters serving the highest poverty populations really struggling, as one might expect.

Figure 7

Slide9

Figure 8 shows a similar pattern for proficiency rates.

Figure 8

Slide10

The presumption in the proposal is that the diamonds here can simply take over the green circles and make them more like the diamonds. One problem with that is that in many cases, that change would be a downgrade. But of course, the real presumption of this report is that one can take the charters of New York City and Newark, NJ and transport them onto the circles in this graph and… WHAMO…. miracle cure for the failed urban district?????

Importantly, these graphs don’t even account for likely differences in special education populations.

The bottom line is that charters are certainly no panacea for solving the woes of KCMSD. Rather, like district schools, their performance varies, around a similar average, with those serving higher poverty populations having the most difficulty.

Does Decentralized Budgeting Lead to Better Outcomes?

Next, there’s the somewhat tangential focus in this report on making sure that as much money as possible is allocated to school sites for school site control. This argument is made with full confidence that it is entirely uncontroversial – that bringing control over budgets down to individual school sites can only, and has only ever, yielded positive outcomes. Well, if only there were actually legitimate empirical research to support that contention. Not that it’s an awful idea. But to suggest that it’s necessarily a solution is, well, a bit of… no… a huge stretch.

These same authors have made this claim on more than one occasion, without any particular citation to support the contention that the share of the budget allocated down to school site control meaningfully improves any form of measured outcomes. As I explain here (in a critique of a report on a similar topic):

In a comprehensive review of literature on school-site management (SSM) and budgeting, Plank and Smith (2008) in the Handbook of Education Finance and Policy present mixed findings at best, pointing out that while SSM may lead to a greater sense of involvement and efficacy, it seems to result in “little direct impact on teaching behaviors or student outcomes.”

That is, it sounds good, and can feel good, but there’s little evidence to back the approach as effective or efficient. In fact, there are many reasons to question the efficiency of fully decentralized budgeting, including the increased likelihood that building level administrators and planning teams will be required to divert more of their time and effort to budget planning issues that might better be handled centrally, the reduced rate at which efficiencies might be diffused and adopted across schools, and lost efficiencies in purchasing and contracts.

The Totally Ignored Issue of Student, Employee and Taxpayer Rights

Finally, there is a really, really big issue that the authors of this report, and others promoting similar reform strategies, completely disregard.

The shift from traditional public governance of schools to mixed public/private relationships substantively alters the rights of students, employees and taxpayers. I have a forthcoming article in the Emory Law Journal on this topic, with coauthors Preston Green (UCONN) and Joseph Oluwole (Montclair State).

In our forthcoming article we explain that:

Children’s rights under school discipline policies may be treated as private contractual agreements with their provider, thus potentially forgoing many constitutional protections (including due process protections related to dismissal, protections of their right to free speech and right not to be compelled to speak, among others).

Employees’ rights, too, may be limited, including their rights to organize as public employees would.

And taxpayers may increasingly find that documents, information and meetings they perceived as publicly accessible are not, as organizations shift key roles and responsibilities under private governance in order to shield them from public disclosure.

In a model where no true public provider exists, like the one proposed here, parents may be required to choose which rights to forgo (disclosure, discipline, etc.). This is simply bad public policy, and its worst aspect is that we are selectively reducing the rights available to our most vulnerable children and families (no one is asking the children of Johnson County to forgo their rights in the same way).

Conclusions

While the authors of this report so confidently conclude that the obvious solution is to replace the failed urban district with an under-regulated, loosely governed confederation of benevolent non-profit actors, one might just as easily conclude from the evidence herein that, simply put, large scale chartering in urban centers like Kansas City simply doesn’t work. It never has and likely never will. It fails to serve the neediest children because “market forces” and accountability measures favor avoiding those children and the neighborhoods in which they live.

Further, large scale chartering leads to deprivation of important constitutional and statutory rights for children, primarily low income and minority children. Meanwhile, suburban white peers are not being asked to forgo constitutional protections in order to access elementary and secondary schooling.

Finally, large scale chartering has made financial and governance accountability far more opaque, as governing institutions have created more complex private structures in order to shield their operations, records and documents from full public view.

One can only hope that this report and its aftermath have the potential to rile up Kansas City as much as Robbie Cano! (baseball)

Really miss Oklahoma Joe’s (bbq)… and Jack Stack (Martin City)

Additional Readings on Kansas City

Green III, P. C., & Baker, B. D. (2006). Urban Legends, Desegregation and School Finance: Did Kansas City Really Prove That Money Doesn’t Matter. Mich. J. Race & L., 12, 57.

Gotham, K. F. (2000). Urban space, restrictive covenants and the origins of racial residential segregation in a US city, 1900–50. International Journal of Urban and Regional Research, 24(3), 616-633.

On School Funding Myths vs Realities

Baker, B. D., & Welner, K. G. (2011). School Finance and Courts: Does Reform Matter, and How Can We Tell? Teachers College Record, 113(11), 2374-2414.

Baker, B.D. (2012) Revisiting the Age Old Question: Does Money Matter in Education.  Shanker Institute. http://www.shankerinstitute.org/images/doesmoneymatter_final.pdf

On Charter Schooling Myths and Miracles

Baker, B.D., Libby, K., & Wiley, K. Charter School Expansion & Within District Equity: Confluence or Conflict? Education Finance and Policy.

Baker, B.D. (2012). Review of “New York State Special Education Enrollment Analysis.” Boulder, CO: National Education Policy Center. Retrieved [date] from http://nepc.colorado.edu/thinktank/review-ny-special-ed.

Baker, B.D., Libby, K., & Wiley, K. (2012). Spending by the Major Charter Management Organizations: Comparing charter school and local public district financial resources in New York, Ohio, and Texas. Boulder, CO: National Education Policy Center. Retrieved [date] from http://nepc.colorado.edu/publication/spending-major-charter.

On Charter Schools and Public/Private Distinctions

Green, P.C., Baker, B.D., & Oluwole, J. (in press). Having it Both Ways: How Charter Schools Try to Obtain Funding of Public Schools and the Autonomy of Private Schools. Emory Law Journal.

Critiques of Shoddy Work by Public Impact and Public Impact Authors

Baker, B. D. (2011). Review of “Spend Smart: Fix Our Broken School Funding System.” Boulder, CO: National Education Policy Center. Retrieved [date] from http://nepc.colorado.edu/thinktank/review-spend-smart

NEPC Bernie Madoff Award Winner!

Baker, B.D. & Ferris, R. (2011). Adding Up the Spending: Fiscal Disparities and Philanthropy among New York City Charter Schools. Boulder, CO: National Education Policy Center. Retrieved [date] from http://nepc.colorado.edu/publication/NYC-charter-disparities.

See discussion of Ball State/Public Impact charter funding disparity study

Garcia, D. (2011). Review of “Going Exponential: Growing the Charter School Sector’s Best.” Boulder, CO: National Education Policy Center. Retrieved [date] from http://nepc.colorado.edu/thinktank/review-going-exponential.

Winner: the Cancer is Under-rated Award!