The Willful Ignorance of the NJ Star Ledger

After having a series of conversations with Star Ledger reporter Julie O’Connor about her desire to write a cover story about how TEAM Academy is producing miracles in Newark, I wrote this post:

https://schoolfinance101.wordpress.com/2015/01/30/ed-writers-try-looking-beyond-propaganda-press-releases-for-success-stories/

The reason for this post is explained in this paragraph:

Well, one reason I’m going there is that I’m sick of getting e-mail and phone inquiry after inquiry about the same charter schools – and only charter schools – asking how/why are they creating miracle outcomes. I try to explain that there may be more to the story. The reporter then says that the charter school’s data person says I’m wrong – validating their miracle outcomes (despite their own data not being publicly available/replicable, etc. and often with reference to awesome outcomes reported in popularly cited studies of totally different charter schools).

For a while after writing this, I figured that the NJ Star Ledger reporter who was so insistent on writing her rah-rah TEAM article had simply given up. But alas, no. The puff piece finally arrived today: http://www.nj.com/opinion/index.ssf/2015/05/beating_newarks_odds_kipp_charter_network_is_poise.html#incart_river

Now, it’s written as an editorial, so I guess that means it’s okay to make stuff up, ignore lots of stuff, and just generally roll with a combination of propaganda provided to you by the school and your own personal predisposition.

What's so disturbing about all of this is that the title of the editorial itself is directly refuted by the statewide analysis I provided. TEAM only marginally beats expectations, and in fact several Newark public schools and a few other charter schools in Newark "beat the odds," so to speak, by much more. AND THE AUTHOR OF THE EDITORIAL WAS FULLY AWARE OF THIS.

I refused to call the reporter in part because I wanted there to be a full, complete transcript of our e-mail conversations. I’m sick of banging my head against this wall.

Below is a transcript of the conversation that started with an inquiry to Diane Ravitch from Julie O’Connor. Others were included on the e-mail chain and jump in at various points.

Reporter Inquiry

Prof. Ravitch,

I’m on the editorial board at The Star-Ledger in New Jersey, and I’m working on a cover story for our Perspective section about the KIPP schools in our state. The college attendance stats of KIPP seniors in Newark seem pretty impressive, and I was wondering if you have the same reaction, and what you think of KIPP’s forays into Camden.

Would really appreciate it if you could give me a call at []. Would like to discuss KIPP in the context of your criticisms of the broader charter school movement, and whether or not you think it is an exception.

Many thanks,

Julie O’Connor

The hand-off

Julie,

I suggest you talk to Mark Weber and Bruce Baker at Rutgers, who have studied charters in NJ. I lean on their research. The question is not whether one chain can produce successful graduates, but whether charters in general are helping the most vulnerable schools, whether they are reducing the funding and capacity of public schools, and whether their success, when it exists, is the result of selection and attrition.

Diane Ravitch

Reporter

Ok, thanks for your prompt reply.

Prof. Baker emailed me his report on free/reduced lunch and the TEAM schools, but I have been unable to reach him on the phone to discuss KIPP or my follow up questions.

Basically, I am looking for a reaction to two claims from KIPP that seem impressive: the college attendance rates (last year, 95 percent of KIPP seniors went to college, 89 percent to a 4-year school and 6 percent to a 2-year), and the fact that KIPP kids in elementary and high school equal or outperform the average for the state of NJ (some years they do in middle school, too, though this year they didn't).

KIPP kids are 87% free/reduced lunch and the state is in the 30s. I understand that Baker and others are skeptical about comparing KIPP kids to their peers in the Newark district. But what about comparing them to the state average? And what about their college attendance rates?

I would like to discuss the criticisms of the charter school movement and whether you view KIPP as an exception, or more of the same. Prof. Baker, can you please give me a call as soon as you get a chance? []. We are hoping to run the story in the next week or so.

Many thanks,

Julie

Baker to Reporter

My point is, and shall continue to be, that news stories on education should NOT be driven by some PR prompt from specific schools touting their "successes" through anecdotes. Thus, my only reaction is the reaction I posted previously about school performance, given analyses across all schools, using comparable, publicly available data:

https://schoolfinance101.wordpress.com/2015/01/30/ed-writers-try-looking-beyond-propaganda-press-releases-for-success-stories/

The bottom line is that KIPP schools' performance on comparable measures of student growth, controlling for demography, resources, etc., is relatively average (marginally above average). Many district schools, including ones in Newark, far outperform them.

Reporter to Baker

Ok. Even if KIPP students aren’t representative of their district, isn’t it still impressive that they are beating the state average, given that their student population is significantly poorer?

KIPP says 93 percent of their students stay with them (7 percent leave their schools each year for any reason).

If what this tells us is that KIPP students have high scores and go to college, how do they fit into criticisms of the larger charter school movement? And what do you think of KIPP’s expansion into Camden?

Prof. Baker, read your blog post and would like to discuss. I am not sure how you are measuring growth in these ranked schools. Are you skeptical about the accuracy of the college attendance rates and performance numbers reported by KIPP? If so, why? Please give me a call. []

Thanks.

Baker to Reporter

Not without running a model of demographics against the same outcome measures across all schools, to see how/whether they truly deviate, statistically, from expectations. Anecdotes of this type are unhelpful for understanding what’s “impressive” statistically or not.

For measuring growth, I’m using the state’s own reported school Median Growth Percentile – for 2012, 2013 and 2014.

Skeptical or not, context is what’s needed for them to really mean anything. The context of all other schools, and their demographics, to evaluate statistically whether the KIPP schools actually deviate from what would otherwise be expected (given enough schools to estimate a model of expectations).

Reporter to Baker

Ok. Is the state average not considered a good measure of how schools are doing?

Is your central point in creating your own measurement for whether schools deviate from expectations that KIPP schools have more resources and classroom time and better class sizes, and that’s why their students are doing so well?

Are you trying to account for those factors in your outcome measure, since you might not find such conditions in traditional district schools? That seems to be your argument in this blog post:

https://schoolfinance101.wordpress.com/2013/03/01/the-non-reformy-lessons-of-kipp/

Trying to understand your general view of KIPP’s performance.

Baker to Reporter (w/head banging against desk)

No. State average is NOT a useful comparison.  Given the number of things that vary across schools, one needs to look at any given school in the context of all schools, with all available measures. Not just compare one school to the state average and say, for example, “it’s got higher poverty, and higher outcomes than the state average.” That comparison misses a lot of other factors that may vary across schools. One needs to see how those factors affect the outcome measure across schools and then compare against the overall pattern.

Second – I’m not “creating” my own measurement. I’m doing what I describe above. Taking the state’s measures, and making comparisons among “otherwise similar” schools along the trend of schools, given their various attributes. That is, how much higher, or lower than expected, does a school score (on growth) given all of those factors that vary.

Now, I also use the state’s growth measure,  because, for all its shortcomings, it is actually the best available New Jersey measure of what a school might be contributing to student outcomes (rather than what kids come in with, or who leaves and when). But that measure too is ONLY useful if you control for/account for the various factors. Quite simply, this is how credible analysis of this type is done, knowing full well that even this approach can’t capture some factors that affect outcomes that really aren’t about how good/bad a school is.

Their performance tends to be marginally above average to about average, considering all schools, including district schools. For that matter, several Newark district schools have higher performance. Discovery Charter School is the standout among charters. North Star seems to do well, but I believe the model isn't really capturing the effect of their substantially greater attrition, or their different student population. But who knows. Then again, Robert Treat has a very different student population and tends to show very weak gains with adjustment for the included factors.

Reporter (who clearly never bothered to read the original post)

What factors that vary are you trying to account for? It is things like resources, classroom time and class sizes?

Baker to Reporter (direct response to ignorant question)

They are all listed in the blog post!

https://schoolfinance101.wordpress.com/2015/01/30/ed-writers-try-looking-beyond-propaganda-press-releases-for-success-stories/

Outcome is Growth

Corrected for:

  1. prior average scale score level
  2. % free lunch
  3. % disability (because I can't break out by severity, charters like TEAM actually get an advantage here)
  4. % ELL
  5. total staffing expense per pupil
  6. school grade range served
  7. school size
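For readers who want to see mechanically what "corrected for" means, the sketch below fits a toy version of such a model: regress a growth measure on school characteristics, then treat the residual (actual minus predicted growth) as performance relative to expectations. All of the data, and the two predictors used, are invented for illustration; the actual analysis in the linked post uses the state's Median Growth Percentiles and all seven factors above.

```python
# Toy illustration of an "adjusted growth" model: fit OLS of growth on
# school characteristics, then compare each school's actual growth to
# the model's prediction. All numbers below are made up.

def ols(X, y):
    """Least squares via the normal equations (X includes an intercept column)."""
    k = len(X[0])
    # Build the augmented system (X'X | X'y) and solve by Gauss-Jordan.
    A = [[sum(row[i] * row[j] for row in X) for j in range(k)]
         + [sum(row[i] * yi for row, yi in zip(X, y))] for i in range(k)]
    for col in range(k):
        pivot = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        A[col] = [v / A[col][col] for v in A[col]]
        for r in range(k):
            if r != col:
                A[r] = [rv - A[r][col] * cv for rv, cv in zip(A[r], A[col])]
    return [A[i][k] for i in range(k)]

# (growth percentile, % free lunch, staffing spend per pupil in $1000s) -- invented
data = [
    (55, 30, 12), (48, 60, 11), (42, 85, 10), (60, 20, 14),
    (50, 50, 12), (46, 75, 13), (58, 40, 15), (44, 90, 11),
]
X = [[1, fl, spend] for _, fl, spend in data]
y = [g for g, _, _ in data]
b0, b_fl, b_spend = ols(X, y)

# A school "beats expectations" only if its actual growth exceeds what the
# model predicts given its demographics and resources.
for growth, fl, spend in data:
    predicted = b0 + b_fl * fl + b_spend * spend
    print(f"FL={fl:2d}% spend={spend}k growth={growth} vs expected {predicted:.1f}")
```

The point of the exercise: a raw comparison to the state average ignores everything on the right-hand side of that regression.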

More Exasperated Baker to Reporter

Schools in Newark: https://schoolfinance101.com/wp-content/uploads/2015/01/slide18.jpg

Charter Schools Statewide: https://schoolfinance101.com/wp-content/uploads/2015/01/slide24.jpg

So again, I ask: why do you feel the necessity to write a story on KIPP schools? And why the apparent obsession with trying to find a miracle in KIPP? How do these supposed miracles (that generally aren't) come across your desk?

An objective statistical run of all schools in the state, using the state’s own best available measure as the outcome, finds TEAM in Newark to be a decent – relatively above average – school, but no miracle. There are no miracles in this complex endeavor. That’s fine. They do a pretty good job, and seem to do a better job of serving a more representative student population than some others (see also: https://schoolfinance101.wordpress.com/2013/11/27/where-are-the-most-economically-segregated-charter-schools-why-does-it-matter/)

I’m not trying to rain on their parade. I’m just pointing out that if we take all of the data from schools around the state and try to figure out who’s actually “doing better than expected” given who they serve and the resources they have, we don’t identify KIPP as the standout.

Weber to Reporter

Julie, I am going to encourage you to read Bruce’s entire post, as it is far more sophisticated and comprehensive than what I am going to include here.

That said, let me put this in very simple — admittedly, TOO simple — terms:

This is a very quick and very dirty scatterplot that shows the average scores on the NJASK Grade 8 English Language Arts (ELA) exam from last year for every school in the state. I’ve highlighted TEAM on this graph.

The NJASK score is on the vertical or y-axis. On the horizontal or x-axis is the percentage of students who qualify for free or reduced-price lunch, a proxy measure for student economic disadvantage (a student's family has to be at or below 185% of the poverty line to qualify for FRPL).

The first and most obvious thing to notice is the relationship between how many FRPL kids a school has and its average test score. Clearly, when FRPL goes up, test scores go down. 70% of the variation in these scores can be statistically explained by the percentage of FRPL kids at the school.

We all know this. Poverty matters.

The green line through the middle is called a regression line: it’s a kinda-sorta “average” that predicts how well a school will do given its FRPL percentage. If you’re above the line, you’re doing better than prediction; if you’re below the line, you’re doing worse.

TEAM is above the line – hooray for them. But how many other schools do you see across the state that are at least as far above the line as TEAM? How many are way, way further above that line compared to TEAM?
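Weber's exercise can be reproduced in a few lines. Everything below is invented toy data, including the "TEAM" point, standing in for the statewide NJASK file; the goal is only to show what "above the line" means and why one above-the-line school is not automatically the standout.

```python
# A toy version of the scatterplot exercise described above: fit a line of
# test score on %FRPL across schools, then ask which schools sit furthest
# above that line. All school names and numbers here are made up.

def fit_line(xs, ys):
    """Ordinary least squares for one predictor: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# (school, % free/reduced-price lunch, average ELA scale score) -- hypothetical
schools = [
    ("School A", 10, 235), ("School B", 25, 228), ("School C", 40, 215),
    ("School D", 60, 205), ("TEAM (hypothetical)", 87, 198),
    ("School E", 90, 208), ("School F", 95, 185),
]

slope, intercept = fit_line([s[1] for s in schools], [s[2] for s in schools])

# Residual = actual score minus the score the line predicts from %FRPL.
# Positive residual = "above the line" = beating the poverty-based prediction.
residuals = {name: score - (intercept + slope * frpl)
             for name, frpl, score in schools}

for name, r in sorted(residuals.items(), key=lambda kv: -kv[1]):
    print(f"{name:22s} residual: {r:+.1f}")
```

In this made-up data the "TEAM" point lands above the line, but another high-poverty school beats its prediction by far more, which is exactly the question Weber poses next.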

Again: what Bruce did in his post was far more sophisticated than this, because he's using a statistical model to account for other things that will affect student outcomes, like percentages of special education kids and how much a school spends per pupil on staff (yes, money does matter). He's also judging outcomes on SGPs, which are arguably a better measure of a school's effectiveness.

I’m boiling this down, however, to reinforce his point: yes, TEAM is a better-than-average school. Again, good for them… but why all the outsized attention? Why are you writing a story about them and not the many, many other schools that “beat prediction” much better than TEAM? How many district schools could be considered “miracles” relative to TEAM that get ignored by the op-ed pages of your newspaper?

Julie, you and I both know I have been the Star-Ledger Editorial Page's harshest critic on education. I've admitted before that sometimes I have gone too far... but can you understand my frustration? Can you understand how unfair it appears, to those of us who have taken the time to study Bruce's work, that TEAM gets all the accolades while many schools that, by TEAM's own standards, are doing a BETTER job continue to be ignored?

I am asking you to listen to Bruce carefully and take the time to understand what he is saying. This stuff matters. You control arguably the most important space for punditry in the state. You owe it to your readers to get this stuff right.

If I can help further, let me know.

Mark Weber

Reporter (still not bothering to read, and returning to anecdotes provided by school)

What about the 95 percent of KIPP seniors that went to college last year? That seems impressive to me.

Also, when you say comparing KIPP to the state average doesn’t mean anything without “running a model of demographics against the same outcome measures across all schools, to see how/whether they truly deviate, statistically, from expectations” — isn’t that what the Mathematica study does? Control for any differences in student population?

Baker (even more exasperated) to Reporter

Why don't you write it that way, then: that it seems impressive to you. I'm not going there with your representation of data, passed along to you most likely by the school, without the opportunity to run appropriate models on the data. And I don't have time to be doing that right now, or to quibble with you over your strange, incessant desire to write a story on how awesome you think these schools are without ever bothering to look at the schools in the context of all schools, where many others may, in fact, be even more impressive.

And are you speaking of some Mathematica study of TEAM Academy specifically, and their graduation and college matriculation rates? Or Mathematica studies of KIPP schools generally/nationally? [I believe only the latter exists: http://www.mathematica-mpr.com/~/media/publications/PDFs/education/kipp_middle.pdf] Yes, the network's results are solid. Not miraculous. But solid. Driven in part, perhaps, by selection issues (see methods critiques below), and in part by resources. KIPP schools in many contexts substantially outspend their "competition," offering higher salaries, much smaller classes, longer days/years, etc. I certainly won't deny that those types of resources matter.

Comments on related methods here: https://schoolfinance101.wordpress.com/2012/12/20/thoughts-on-randomized-vs-randomized-charter-school-studies/

and: https://schoolfinance101.wordpress.com/2013/07/12/thinking-writing-about-educational-research-policy-implications/

There are indeed limitations to these methods.

Some information here on where TEAM fits on resource/demographics, etc in Newark: https://njedpolicy.wordpress.com/2015/01/13/research-note-resource-equity-student-sorting-across-newark-district-charter-schools/

Weber to Reporter

Related to the issue of resources:

Find attached the 2012 tax forms for TEAM, Friends of TEAM, and KIPP. You can access these easily at guidestar.org.

You will notice on page 42 of the KIPP 990 that TEAM received $1,053,147 in direct support from KIPP. This likely does not include all sorts of administrative, logistical, marketing, lobbying, etc. activity KIPP undertakes on behalf of TEAM.

On page 21 of the Friends of TEAM 990, you’ll find a $1,005,332 grant to TEAM. On page 9, you’ll see the group took a rental income loss of $1,813,501, likely to the school’s benefit (were I you, I’d certainly ask them about this).

In 2011-12, TEAM enrolled 1,504.5 students. If you take the grants from KIPP and FOT together, that comes to $1,368 in additional expenditures per child, not including the rental loss that FOT took. So far as I know, this extra funding is not reported in the NJ Taxpayers' Guide to Education Spending.
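Weber's per-pupil figure is straightforward to verify from the numbers he cites:

```python
# Verifying the arithmetic above: the two grants divided by TEAM's
# reported 2011-12 enrollment.
kipp_grant = 1_053_147   # direct support from KIPP (p. 42 of the KIPP 990)
fot_grant = 1_005_332    # grant from Friends of TEAM (p. 21 of the FOT 990)
enrollment = 1_504.5     # TEAM enrollment, 2011-12

extra_per_pupil = (kipp_grant + fot_grant) / enrollment
print(f"${extra_per_pupil:,.0f} in extra spending per pupil")
```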

Let me be clear: it is my opinion (an opinion backed by a substantial and growing body of research) that spending this extra money on behalf of these students will help their academic growth. This is a good thing.

But it is exactly the sort of issue that is not addressed by the Mathematica report, nor by any number of other “studies” that purport to show the superiority of KIPP’s methods by holding all things constant.

So how does TEAM spend all this extra money? Well, here’s one way:

At all stages of a teacher's career, TEAM pays a higher salary, even when adjusted for experience, than NPS (and far more than Newark's "local" charters). When you pay more and offer better working conditions, you can attract people who are willing to work longer hours (to a point).

But they manage to keep salary costs low by also doing this:

Notice the high number of teachers with only one year of experience at TEAM? Notice how they barely have any teachers with more than 15 years of experience? That’s when the NPS salary guide gives veteran teachers a big boost.

Is this a smart strategy? Absolutely. Is it sustainable? I say almost certainly not. Does TEAM really think they can keep recycling their staff AND expand the number of students enrolled? Are there really that many young people out there willing to make teaching at TEAM a temporary career? And is that really good for the city and its students?

As Bruce says: TEAM does a good job. They are, by the numbers, a good school. But I would argue KIPP’s methods are not replicable at a large scale. In fact, THEY’D probably agree with me, because they have said over and over again that they are not interested in taking over an entire district.

Julie, if you are willing to dig into this and go behind the talking points the KIPP publicity machine feeds the press, I think you will find TEAM’s “success” raises more questions than it answers:

– If more money is good for charter schools, why isn't it good for public schools?

– Is it good for the teaching profession to encourage the growth of schools that appear to run on a policy of churning much of their staff?

– When we get past the issues of different student populations, attrition, extra resources, hiring practices, test prep, etc., what, exactly, is so special about KIPP/TEAM?

Mark

Five Steps to Cagebusting Relinquishment and the Suburban School District of the Future!

As I explained in my previous post, relinquishment in the form of "chartering" has taught us much about how to "fix" urban school districts. But why should urban districts be the only ones to benefit from the wisdom of emergent "disruptive" models of school organization? Here, I provide an overview/preview of what may eventually become my defining academic contribution: how to fix the suburban school district. How to relinquish the leafy 'burbs! So, here it is, in all its glory, the rough outline of my forthcoming manifesto on Cagebusting Relinquishment and the Suburban School District of the Future!

Step 1 – Hire a private management company(ies) to manage 100% of district operating funds and any/all subcontracted service agreements, including those addressed under Step 5 below. This will include all employee contracts as well as all additional vendor contracts. What’s really cool about this is that the Local Board of Education’s reported budget and annual financial report become one single line of expense – Contracted Services – to the management company. Nothing else need be disclosed to the public/taxpayers. The rest is at the discretion of the private management company, whose finances and contractual arrangements may not be subject to public access/review. The public only gets to see that one line – that lump sum payment by the board of education to the manager(s). In other words – Step 1 – Relinquish! ‘cuz relinquishment rocks!

Step 2 – The local board of education and private manager quickly concoct a new school rating system that allows them to declare all schools to be failing, requiring that the schools be closed and reconstituted under the private manager. This bold “disruptive” step permits the private manager to establish its own employee contracts and recruit its own employees to fill the roles of the (crappy, self-interested, tenured, government employed) teachers and administrators immediately dismissed by the local board of education because their schools technically no longer exist. The private manager might, for example, choose to establish a feeder/pipeline relationship with an emergency/expedited training program for young suburban saviors (on the expectation of significant turnover to hold long run staffing costs down), or establish its own H1B visa processing entity to enable the schools to employ foreign teachers paid modest stipends. Because these are all employees of the private manager, and not “public” employees, reshuffling them, dismissing them (if they don’t leave fast enough to keep costs down), etc. is easy because many constitutional and statutory protections of public employees are irrelevant.

Step 3 – The private manager establishes rigid no-excuses discipline policies and written contractual agreements with parents and their children to abide by those discipline policies or face immediate dismissal. Like the employees, students/parents may be forgoing constitutional and statutory protections they would have in a government operated institution. Rather, discipline policy may be evaluated by the courts as a contractual agreement with the private provider, giving that provider wide latitude to impose draconian requirements if they so choose.

Step 4 – The private manager and local board of education would probably want to declare the system to be one of district-wide open choice, where each year all families in the district rank their school choices and students are sorted by computer algorithm into assigned schools, regardless of proximity to home (or transportation costs systemwide). This way, when draconian school discipline policies get called into question, the board of education and private manager(s) can assert that the children chose the school to which they were assigned and were not forced to enter into this contractual agreement. The private manager might wish to operate a single building in a remote corner of the district for all kids who are dismissed from the various "no excuses" schools. Additional overflow facilities may be required over time. Costs might be held down in these facilities by making them online learning centers with 100/1 pupil/teacher ratios. No rules. No goals. Just a bunch of computers in cubicles where kids can, if they see fit, log on to K12.com. Children dismissed from any of the "schools of choice" may not require any due process, since they can land in one of the handy-dandy holding pens.

Step 5 – Raise short-term cash by selling off all of your facilities (& major capital assets) to a Real Estate Investment Trust. Imagine what you could do with all of that cash! Besides, annual maintenance and operations of a large district's aging capital stock might be running you about $1,200 per pupil per year. Instead, you can lease the same buildings back from the REIT on a triple net lease (paying lease + property tax* + maintenance) for an expense of, oh, around $3,000 per pupil, with expected annual increases. This comes on top of the general management fee paid to the private school management company.

*that is, if these properties become taxable when owned/leased by a for profit REIT
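To make Step 5's (satirical) economics concrete, here is a back-of-the-envelope sketch of the swap it describes. The district size and the 3% annual lease escalator are assumptions for illustration; the post itself only gives the $1,200 and $3,000 per-pupil figures.

```python
# Rough cost of trading ~$1,200/pupil in district-run maintenance for a
# ~$3,000/pupil triple-net lease. District size and escalator are assumed.
pupils = 5_000          # hypothetical district enrollment
own_cost = 1_200        # per-pupil maintenance & operations if district owns
lease_cost = 3_000      # per-pupil triple-net lease, year 1
escalator = 0.03        # assumed annual lease increase

extra = 0
for year in range(10):
    extra += pupils * (lease_cost * (1 + escalator) ** year - own_cost)
print(f"Extra cost over 10 years: ${extra:,.0f}")
```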

Any takers? Scarsdale? Millburn? Blue Valley (KS)?

Chartering for Thee [& all that comes with it] but Not for Me?

I’m currently in the middle of several research projects which, as they sit on my plate, are not directly related, but intersect. In one set of projects, I have worked with colleagues Preston Green and Joseph Oluwole to better understand how the increasingly complex public-private structures emerging in the charter school sector alter the rights of various constituents – parents, taxpayers, students and employees. We have published two law review articles on these topics:

Separately, I’m involved in a number of projects more central to my own primary area of research involving better understanding variations in schooling resources across schools, districts and children. This includes past work on New York City, Texas and Ohio charter schools, where one of the products of that work will finally appear in the journal Education Finance and Policy this summer. Currently, I continue to explore variations in school site expenditures, including staffing and instructional staffing expenditures by operator type and location – within and across the mix of charter and district providers.

Finally, I’m also trying to finally get a better handle on capital financing issues related to charter schooling in order, if nothing else, to be able to provide instructive summaries of the various mechanisms commonly used, comparison to traditional district municipal financing, and the cost, efficiency and legal rights issues across the various mechanisms. This turns out to be an ugly, messy endeavor, which is why I’ve tried so hard to avoid it so far.

I also love making maps. It’s just fun to see how policies play out in geographic space and with respect to demography.  This post is mainly about maps. But it’s those maps that really heightened my concern over questions I’ve raised previously regarding the distribution of lost rights (law review papers above).

Charter schooling – or more specifically "chartering" – is pitched most specifically as a solution for long-failed "urban" (in quotes for a reason) schools. A point of clarification: I consider "charter schooling" a phrase that represents the original "movement," which, through various state statutory structures, permitted the start-up of independently governed and operated, publicly financed schools. "Chartering" is a more aggressive policy intervention whereby state and local policy makers engage more directly in promoting the expansion of charter schooling by converting district schools to charters, closing district schools to pave the way for charter expansion, transferring district capital assets to charter operators, and generally dismantling the public district in order to expedite its replacement with a "portfolio" of charter operators.

The assumption of the most aggressive "chartering" advocates (or "relinquishers," a particularly twisted/warped framing) is that aggressive steps are needed, with all deliberate speed (no time to worry about understanding the law, history, or even why current problems exist), to "save 'urban' lives":

Again, a core assumption of the movement is that we’ve tried everything, including pouring massive sums of money into urban districts – more than they could ever possibly even need – to achieve reasonable outcomes. But we haven’t.

To the extent we ever have put in effort to improve resources, it has actually produced positive results – more broadly and consistently positive than "chartering" as a movement. Evidence consistently points to the importance of financial resources for improving schooling quality.

Yes, some specific charter operators have produced impressive test score gains. Interestingly, these also tend to be very well resourced charter operators, often spending 50% more than district schools, providing substantially longer school days and years, paying their teachers more and providing them smaller class sizes and much smaller total student loads. That is, those highly successful charter operators (as opposed to the dreadfully failing ones) may in fact be providing greater support for the assertion that money matters than for the assertion that “chartering” matters.

As I’ve explained time and time again on this blog, there are many features of “chartering” that require much closer scrutiny – and more systematic evaluation (more so than media or blog reports of “scandal”).  Here are but a few “features” of chartering (and to an extent, “charter schooling”) that I’ve either discussed previously, or are emerging as part of my (or others) current studies.

Feature 1: Compromising the Legal Rights of Taxpayers, Employees and Children

As I’ve explained in previous posts, these particular issues vary by state, due both to differences in language of state charter school laws and to relevant case law. But, as we explain in our law review articles above, there remain, in nearly every circumstance, significant differences in rights of various constituents under traditional Local Education Agency governance than under mixed-private-public governance. [more on this table in this post]

Chartering vs. Traditional District Schooling

Governance
- Local Education Agency: Governed by public officials (with all rights & immunities), elected or appointed; necessarily subject to open public records & open meetings laws; necessarily required to comply with public bidding requirements; necessarily required to publicly disclose employee contracts.
- Privately Governed Charter (Non-State Actor): Governed by an appointed (self-appointed) board of private citizens; may not be subject to open records or open meetings laws; may not be required to engage in public contract/bidding requirements; the private appointed board may hire a private management firm.

Finance
- Local Education Agency: Required to disclose finances (reported relatively consistently in most state data systems, including detailed AFRs (annual financial reports) & public posting of budgets).
- Privately Governed Charter: Usually required to report expenditure of public funding; state data systems are spotty and inconsistent on charter school revenue/spending data (may be required to disclose IRS filings [Form 990]).

Disclosure
- Local Education Agency: Public officials subject to open meetings laws; all documents, employee contracts, financial documents, and communications between officials subject to open records laws.
- Privately Governed Charter: Board members & managers may not be subject to open meetings laws; many documents/contracts with the private manager, etc., considered private/proprietary.

Employees
- Local Education Agency: Public employees with key constitutional and statutory protections.
- Privately Governed Charter: Private employees, forgoing certain rights to bring legal challenges against their employer.

Students
- Local Education Agency: Retain rights to not have their government (school) infringe on various constitutional and statutory rights, and to uphold key statutory obligations.
- Privately Governed Charter: Students may forgo numerous rights under privately governed discipline codes.

Recent Evidence on Children’s Rights in New York City Charter Schools

Specifically pertaining to the treatment of children under charter school discipline policies, Advocates for Children found the following:

  1. 107 of the 164 NYC charter school discipline policies we reviewed permit suspension or expulsion as a penalty for any of the infractions listed in the discipline policy, no matter how minor the infraction. By contrast, the New York City Department of Education’s (DOE) Discipline Code aligns infractions with penalties, limiting suspension to certain violations and prohibiting expulsion for all students under age 17 and for all students with disabilities.
  2. 82 of the 164 NYC charter school discipline policies we reviewed permit suspension or expulsion as a penalty for lateness, absence, or cutting class, in violation of state law.
  3. 133 of the 164 NYC charter school discipline policies we reviewed fail to include the right to written notice of a suspension prior to the suspension taking place, in violation of state law.
  4. 36 of the 164 NYC charter school discipline policies we reviewed fail to include an opportunity to be heard prior to a short-term suspension, in violation of the U.S. Constitution, New York State Constitution, and state law.
  5. 25 of the 164 NYC charter school discipline policies we reviewed fail to include the right to a hearing prior to a long-term suspension, in violation of the U.S. Constitution, New York State Constitution, and state law.
  6. 59 of the 164 NYC charter school discipline policies we reviewed fail to include the right to appeal charter school suspensions or expulsions, even though state law establishes a distinct process for charter school appeals.
  7. 36 of the 164 NYC charter school discipline policies we reviewed fail to include any additional procedures for suspending or expelling students with disabilities, in violation of federal and state law.
  8. 52 of the 164 NYC charter school discipline policies we reviewed fail to include the right to alternative instruction during the full suspension period, in violation of state law.

http://www.advocatesforchildren.org/sites/default/files/library/civil_rights_suspended.pdf?pt=1

Indeed, many of these policies were found to be non-compliant, and corrective action may be in order. Perhaps a review of district schools’ policies would also turn up violations. But it seems likely that the expansion of charter schooling in the city has led to a proliferation of non-compliant, student-rights-trampling discipline policies – policies that may well explain (indeed, likely explain) disproportionate suspension rates.

Given the prevalence of these policies in NYC charter schools (which are among the better resourced, more well-established charter schools in the nation), one might easily argue that such policies are a “feature” of “urban chartering,” not an outlier or a bug.

Selling off Public Assets & Draining Operating Resources

This is an area I’m just beginning to get a handle on, and much of the evidence in this area is anecdotal, but as it comes together, it points to a handful of common models involving charter governance, land deals and facilities lease arrangements. One of my big concerns is that, among other things, public assets including valuable land and school facilities are being “relinquished” as district school enrollments drop – often these days because district officials themselves are forcibly closing their schools and handing them over to charter operators – or, sending out pamphlets to parents telling them that the charter schools are better, so choose them, not us. In many cases, citywide enrollments are remaining relatively constant. That is, the number of children that need to be served isn’t changing. Children are being shifted from district schools to charter schools. District facilities (land and buildings representing the investment of taxpayers over decades) are being sold at bargain rates, and there’s no turning back. Many urban districts now lack the capital assets to serve the children they would be responsible for serving, were the charter sector to suddenly collapse. (2013_njeda_teamacademycharterschool_pos , http://njparcels.com/property/0714/1801/15 , http://njparcels.com/sales/0714_2570_1 , http://njparcels.com/property/0714/2569/1 , http://njparcels.com/property/0714/2570/1 , and Elsewhere).

Then there are these particularly suspect (and illegal) examples, which involve complicated intersections of governance and land/real estate and facilities financing.

Imagine/Renaissance Deal

In a case decided in Federal District Court in August 2014, it was found that Imagine Schools Inc. had engaged in several suspect governance and finance arrangements. In terms of governance, the court explained:

Imagine Schools recruited the board members, arranged for the board members to apply for the charter and then entered into an Operating Agreement with the Renaissance Board that required the Board to give Imagine Schools all of the tax revenues that the Board was entitled to receive as a charter school. Under Missouri law, Imagine Schools could not obtain that revenue stream itself absent the formation of the Renaissance Board.

In short, there is no evidence that Imagine Schools made any effort to recruit an independent board or to strengthen the independence of the Renaissance Board once selected. In fact, it is the policy of Imagine Schools to control the board rather than vice versa, as evidenced by the statement of Dennis Bakke, the owner and founder of Imagine Schools. Mr. Bakke clearly believed that the Renaissance Academies belonged to Imagine Schools and that the job of the Renaissance Board was to go along with Imagine Schools’ decisions unless Imagine Schools was engaging in illegal activity. In fact, Mr. Bakke encouraged his executives to limit and discourage board member control of “Imagine’s” charter schools by obtaining pre-signed, undated resignation letters from board members at the time they joined the board, so that a board member could be expelled any time he or she asserted too much authority. Id. It is therefore not a surprise that Mr. Rogers, with all his experience as a public school administrator, did not understand that, in contrast to the status of the Renaissance Board, Imagine Schools is one of the nation’s largest charter school management companies and specializes in managing the operations of charter schools.

This case also involved a facilities leasing twist whereby the initial property owner (SchoolHouse Finance) of the facility leased to the school was an arm of the management company (Imagine) itself. Among other things, the court found that the property owner had gouged the charter school, charging a 2% higher rate than appropriate. The court explained that Imagine used its SchoolHouse Finance arm to flip the property: “SchoolHouse Finance sold the buildings to EPR Properties, a real estate investment trust, in order to free itself up to make more real estate purchases for other charter schools it was starting. EPR Properties then leased the properties back to SchoolHouse Finance for an annual rental rate of approximately 10 percent of the total development cost of the properties.” SchoolHouse itself had been charging 12%. The court mandated repayment of the additional 2% that had been cumulatively charged by SchoolHouse, accepting as reasonable the rate charged by EPR.

Chester Community Charter School

An audit of Chester Community Charter School in Pennsylvania revealed similar issues.

“Chester Community Charter School (Charter School) improperly received $1,276,660 in state lease reimbursements for buildings that were ineligible for those payments. We question these buildings’ eligibility since one of the Charter School’s Founders previously owned them and later transferred them to a related nonprofit (Nonprofit) established for the sole purpose of supporting the Charter School. We also found that the Charter School’s Founder was the buildings’ landlord until October 2010. Furthermore, this same individual started a for-profit Management Company of which he is currently the Chief Executive Officer (CEO). This Management Company runs the Charter School, and the Management Company and the Nonprofit are located at the same address. These ownership transfers and questionable transactions among associated individuals and entities created circular lease arrangements among related parties sharing ownership interest in the buildings.” (p. 12)

“In October 2010, the Charter School Founder/Management Company CEO sold the buildings to a newly created Nonprofit that he and some associates created with the primary purpose of leasing the properties back to the Charter School. The buildings were sold to the Nonprofit for $50.7 million and financed through a municipal bond.” (p. 12-13)

At that time, a new 30-year lease agreement was created between the Charter School and the Nonprofit, effective October 9, 2010 to August 31, 2040. According to the Nonprofit’s Internal Revenue Service tax returns (2010, 2011, and 2012), all of the Nonprofit’s reported income and expenses have been related to the Charter School’s leased buildings. (p. 13)

Yes, these are cases where institutions went beyond the scope of permissible behavior, got caught and ended up paying a price. But it is through these cases, litigation and audits that we better understand the legally employable mechanisms that set the stage for these actions.

A common mechanism in each case is that some private entity is created to take on bond debt to acquire land and a facility (be it previously public property or not). That entity renovates the property to be minimally suitable for running a charter school, and it may also serve as the leasing agent for the first few years in which the charter operates. Amazingly, in the Kansas City (Renaissance charter) case, that leasing agent was found to have gouged the school even more than the for-profit entity to which the property was later flipped.

Notably, several charter schools around the country have lease arrangements with EPR. [http://www.eprkc.com/portfolio-overview/public-charter-schools-list/ ] As an example, one of the few schools in New Jersey with an EPR lease (or at least listed on the EPR site) is Camden Community Charter School (affiliated with Chester Community Charter School), the school that reports by far the highest administrative expenses of any in the state of New Jersey – likely influenced significantly by contracted lease payments. CCCS spends a reported $5,325 per pupil on administration, or 43.6% of its total spending. Similar cases/arrangements have been reported widely, from Michigan to Ohio to St. Louis. These are simply the emerging models for facilities acquisition and management in charter schooling in many places. They are what they are. It is conceivable that these mechanisms can be used in mutually beneficial ways. And it’s possible that they can be abused, as in the examples above. It’s also possible that a traditional district could sell its own buildings to EPR to raise cash, and then lease them back from EPR on a triple-net lease. I’d love to know whether any actually do.

These capital financing deals are pretty much a “feature” of the system, not an outlier, or bug.

Again, my concern is that we are allowing these practices to become the standard for “chartering”, largely unchecked if not endorsed and promoted, by policy. But would other communities allow the same? Do they want the same? Or is this some grand experiment we are willing to test out only on certain communities? On other people’s children?

And Who’s Lucky Enough to Lose those Rights & Assets?

Let’s take a spatial and racial look at the distribution of “chartering,” using the 2013 Common Core of Data Public School Universe Survey. In the following maps of major metropolitan areas:

  • Large Red Circles = Schools with >80% Black Enrollment
  • Large Blue Circles = Schools with >80% White Enrollment
  • Large Green Circles = Schools with >80% Hispanic Enrollment
  • Smaller dots = more integrated schools
  • Yellow Stars = Charter Schools

Baltimore and Washington, DC

[map: Balt_DC]

Detroit

[map: Detroit]

Philadelphia

[map: Philly]

New York City/Newark, NJ

[map: NY Metro]

Ohio

[map: Ohio]

Lots of yellow stars on big red dots. Not so many on big blue dots.

What Needs to be Done Right Now

I will have much more to say about these issues as my current batch of research projects progresses. I’m still trying to sort through it all. But certainly even in the short run, these issues need a closer look.

They need a closer look in part because they so disproportionately affect low income, urban and minority communities.  At the very least in the short term:

1. states must tighten charter school laws to guarantee that local constituents – including parents, students, taxpayers and employees – retain the rights that would be afforded to anyone by their relationship to a predominantly publicly financed elementary/secondary school.

2. states must scrutinize carefully any new/forthcoming or recent past transfers of public assets (land, buildings) by local public districts and establish policies to protect taxpayers against future such transfers and ensure that local public districts retain the capacity to serve the public good.

3. states must also scrutinize any/all facilities lease arrangements.

That’s a short list for now. There will be much more to come later this Summer.

Cheers.

What about those high income families that opted out long before the school year started?

Pro-Annual-Testing-of-Everyone pundits are all in a tizzy about Opt-Out. In their view, parents who opt out are severely compromising accountability for our public education system. They are eroding the public interest in the most selfish possible way. What seems to irk these pundits as much as anything is the possibility that the recent pattern of opting out appears (an empirical question for a later day) to be disproportionately occurring in upper-middle to upper income communities – a group over which pundits have little control or possible leverage [little opportunity for punitive policy – which drives them crazy].

So, the pundits say, the disproportionate opting out of upper income white children from testing will severely compromise the ability of policy makers to accurately measure achievement gaps between those children and the poor and minority children who more compliantly sit down, shut up and fill in the bubbles (ok… point and click).

If the affluent families opt out, we really won’t know how far behind those who are less affluent really are.

Or so the argument goes.

But do we really know anyway?

This whole line of reasoning is yet another example of the lack of demographic/contextual understanding and related number sense of those making these arguments. The edu-pundit-innumerati strike again!

These same innumerate pundits previously claimed that annual testing of everyone is absolutely necessary for accurately measuring within school and within district achievement gaps among student subgroups, totally failing to understand that few schools and districts – even when everyone is tested – actually have sufficient populations of subgroups for measuring gaps – and further that the approach most often used for measuring gaps is total BS – statistically that is. Actually, measuring within school and within district gaps and using those measures to penalize schools and districts ends up selectively penalizing only those schools for whom the gaps can be measured – integrated schools.

So then, why is this new argument equally statistically and demographically bankrupt? Certainly, if those not taking the test were disproportionately of a certain race or of higher income, average scores would be biased – likely biased downward – for any data aggregation that would/should include these families. So then, of course it’s a problem, right?

Well, yes… and no.

What the edu-pundit-innumerati fail to realize is that there already exist larger shares of disproportionately higher income kids in nearly every state who are already opting out of these assessments by opting into schools that generally don’t give them. Are these kids somehow not an issue of public policy concern, merely because they attend private schools (or homeschooling)?

If parents in Scarsdale, NY or Millburn, NJ opting out of state assessments matters toward our understanding of gaps in educational opportunity across children of the state by income and race, then so too do the unmeasured outcomes of children opting out of the public education system as a whole.

Here are the numbers for children between the ages of 8 and 17 (those who might fall in tested grades) for New Jersey and New York.

In New Jersey, over 110,000 children between the ages of 8 and 17 attend private schools (just under 150,000 when summing enrollments for k-12 private schools).

Slide1

In New York, over 300,000 children attend private schools (just under 350,000 when summing enrollments for k-12 private schools).

Slide2

In each state, over 10% of children in this age range do not attend the public schools.

Slide3

In New Jersey, the average Total Family Income of those in private schools is about $160k, compared to about $110k for those attending public schools.

Slide4

In New York, the average Total Family Income of those in private schools is about $140k, compared to $87k for those attending public schools.

Slide5

In other words, these states, among others, have relatively large shares of kids outside the system entirely, and the average income of their families is much higher than the average income of those inside the system.

That is, there already exists substantial bias – due to omitted data – in our measurement of gaps in educational outcomes!

Should we try to mitigate any additional bias? Perhaps. But can we pretend that if we do – if we reduce opt outs among affluent public school attendees, we’ve adequately measured outcome equity? Uh… no.

Here’s the breakout of those enrollments by primary affiliation of school, based on the most recent Private School Universe Survey from NCES.

Slide6

Slide7

So, is the National Catholic Education Association on board yet [w/CCSS perhaps, but the tests?]? Are they fully adopting/implementing annual testing of everyone?

How about the most (economically) elite schools in this mix, most of whom are members of the National Association of Independent Schools?

The reason why our School Funding Fairness report includes measures of “coverage” and income gaps by coverage is to make clear that even our measures of fiscal disparity across children attending public schools in any state suffer from the bias resulting from our inability to capture the resources available to the relatively large shares of children not in the public system at all, which, for 5 to 17 year olds, exceeds 20% in states like Delaware and Louisiana.

So, to those in a tizzy about opt out.

Chill.

Annual testing of everyone really isn’t annually testing everyone anyway, and as a result, really isn’t serving the public interest as well as you might think!

Innumerati: Blatantly, belligerently mathematically and statistically inept political hacks who really like to use numbers and statistical arguments to make their case. Almost always out of context and/or simply wrong.

Friday Graphs: Bad Teachers? or Bad Policy & Crappy Measures in New York?

A while back, I wrote this post explaining the problems of using measures of student achievement growth to try to sort out “where the bad teachers go.”

The gist of the post was to explain that when we have estimates of student achievement growth linked to teachers, and when those estimates show that average growth is lower in schools serving more low income children, or schools with more children with disabilities, we really can’t tell the extent to which these patterns indicate that weaker teachers are sorting into higher need settings, or that teachers are receiving lower growth ratings because they are in high need settings. The reformy line of argument is that it’s 100% the former. That bad teachers are in high poverty schools, and that it’s because of bad teachers that these schools underperform. Fire those bad teachers. Hire all of the average ones waiting in line.

Even the best measures of student growth, linked to teachers and addressing as thoroughly as possible numerous contextual factors beyond teachers’ control, can’t totally get the job done – isolating only the teacher… well… classroom-level… effect. And, as I’ve noted in previous posts, many if not most state- and district-adopted measures are far from the best… or even respectable attempts.

I explain in this policy brief, that in New Jersey, factors including student population characteristics, average resource levels available in schools, competitive wages of teachers (relative to surrounding districts) and other factors are significant predictors of differences in school average “growth” ratings. Schools with more resources and less needy students, and higher average scores to begin with, in New Jersey, get significantly higher growth ratings.

I also showed in this post that, in both Massachusetts and New Jersey, teachers in schools where larger shares of the student population are female are less likely to receive bad ratings (Massachusetts) or, correspondingly, tend to be in schools receiving higher growth scores (New Jersey). The implication, accepting reformy dogma about what these measures mean, is that our best teachers are teaching the girls.

So then, what about those New York teacher ratings I addressed in the previous post? We saw, for example, that teachers rated “Ineffective” on the growth measure tend to be in high-poverty schools:

Slide6

Tend to be in schools with larger classes:

Slide5

And those really effective teachers tend to be in schools with lower poverty and smaller classes.

So, does that mean that the “great” teachers are just getting the cushy jobs? Or is the rating system simply labeling them as such?  While there may indeed be some sorting, especially in a state with one of the least equitable funding systems in the nation, it certainly seems likely that the estimates of teacher effect on student achievement growth… well… simply suck! They don’t measure what they purport to measure.

They measure, to a large extent, the conditions into which teachers are placed, and NOT the effect of teachers on student outcomes.

Combining the above factors into a logistic regression analysis to predict how a handful of conditions affect the likelihood that a teacher is rated either “ineffective” (you really suck) or “developing” (you kinda suck, and we’ll tell you you really suck next year), we get the following:

NY Ratings Bias

So, even when these factors are considered together (holding the others “constant”), teachers in schools with larger classes (at constant low-income share and funding gap) have a greater likelihood of being rated “bad.” Teachers in schools with higher low-income concentrations, even if class sizes and funding gaps are the same, are much more likely to be rated “bad.” And teachers in districts that have smaller state aid gaps are less likely to be rated “bad.”
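The mechanics of reading such a model can be sketched as follows. The coefficients below are purely illustrative (made up for the sketch), NOT the actual New York estimates – the point is only how placement conditions shift the predicted probability of a “bad” rating for an otherwise identical teacher:

```python
import math

# Hypothetical logistic model of Pr(rated "bad"); coefficients are
# illustrative only, not estimates from the actual New York data.
# logit(p) = b0 + b1*class_size + b2*pct_low_income + b3*aid_gap
b0, b1, b2, b3 = -3.0, 0.05, 2.0, 0.8

def prob_bad(class_size, pct_low_income, aid_gap):
    """Predicted probability of a 'bad' rating given school conditions."""
    z = b0 + b1 * class_size + b2 * pct_low_income + b3 * aid_gap
    return 1.0 / (1.0 + math.exp(-z))

# Same hypothetical teacher, two different placements:
low_need = prob_bad(class_size=20, pct_low_income=0.20, aid_gap=0.0)
high_need = prob_bad(class_size=28, pct_low_income=0.80, aid_gap=0.5)
```

With these made-up coefficients, the predicted probability of a “bad” rating rises from roughly 17% in the low-need placement to roughly 60% in the high-need placement – a difference driven entirely by conditions, not by anything about the teacher.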

So, on the one hand, we can stick to the King’s grand plan….

  • Step 1 – Disproportionately label as “bad” those teachers in schools serving more low income kids, and doing so with fewer resources, including larger class sizes, and dump those lazy failing teachers out on the street…
  • Step 2 – Wait for that long line of “average” teachers to sign up to take their place… stepping into the very same working conditions of their predecessors, which likely led, at least in part, to those bad ratings….
  • Step 3 – Repeat

And the cycle continues, until a) those conditions are improved and b) the measures for rating teacher effect are also improved (if they even can be).

Alternatively, maybe the actual policy implication here is to a) reduce aid gaps and b) use that funding to improve class sizes?

UPDATE –

I figured I’d go check out that gender bias issue I found in NJ and MA. And wow – there it is again. I’ve rescaled the low-income concentration and female concentration effects to relate odds changes (of being labeled “bad”) to a 10-percentage-point shift in enrollment (e.g. from 50% to 60% low income, or female). Here are the updated model results:

NY Ratings Bias

So once again – is it that all of the “bad” teachers are teaching in schools with higher percentages of boys? or is something else going on here? Are teachers really sorting this way? Are they being assigned by central office this way? Or is there something about a class with a larger share of boys that makes it harder to generate comparable gains on fill in the bubble, standardized tests? Why do the girls get all the good teachers? or do they?
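One technical note on the rescaling described above: because logistic coefficients are additive on the log-odds scale, converting a per-percentage-point effect into a 10-point effect is just exponentiation. A minimal sketch, with a made-up coefficient (not the actual estimate):

```python
import math

# Hypothetical coefficient on percent-female enrollment, per 1
# percentage point, in a logistic model of Pr(rated "bad").
b_female = -0.02  # illustrative value only

or_per_point = math.exp(b_female)       # odds ratio for a 1-point shift
or_per_10pts = math.exp(10 * b_female)  # odds ratio for a 10-point shift (e.g. 50% -> 60%)
```

The 10-point odds ratio is simply the 1-point odds ratio raised to the 10th power, which is all the rescaling in the updated model amounts to.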

Relinquishing Efficiency: NOLA Data Snapshots

There’s always plenty of bluster about the post-Katrina NOLA miracle. I’ve done a few posts on the topic, but none recently.

See:

The NOLA model of “relinquishment” continues to be pitched as a handy-dandy reformy solution for dismantling the dysfunctional urban school district and achieving miraculous gains in overall student outcomes (like those reported by CREDO) – of course, at little or no increased expense. Indeed, this latter piece is merely implied by the complete and utter silence on the question of just how much money is being thrown at this alternative model in order to prove that it “works.”

The purpose of this post is merely to put some of this NOLA bluster into context, using readily available data sources, including the NCES Common Core Public School Universe and NCES Fiscal Survey of Local Governments, along with CRDC/Ed Facts data released for states to conduct equity analyses to support their “teaching equity” plans.

First off, here are the pre, to post Katrina enrollment patterns for district and charter schools identified as within city boundaries of New Orleans:

Slide1

City enrollments remain far lower than they were pre-Katrina, and any comparisons of the present to that era – or even to the immediate post-Katrina era, when nearly all students remained displaced – are not useful. Most students are now in charter schools, meaning that establishing a “counterfactual” comparison of charter students against “non-charter” students, as in the typical charter pissing-match studies, is, well, rather difficult if not implausible.

As one might expect, once you’ve got most kids in charter schools, then the charters must somewhat mirror the population that had been in district schools, and remain in the few non-charters as of the final year of these data.

Slide2

Really, no surprises here. Of course, we might find a different story if I had readily available data on children with disabilities, by the severity of those disabilities.

This next graph shows the per pupil current spending over time.

Slide3

Now, that spike in 2006 is NOT because NOLA schools all of a sudden spent a whole lot more, but rather because the denominator – pupils – nearly disappeared. Per-pupil spending goes up when pupil counts decline, if spending does not decline commensurately.

It’s a simple math thing. But, even after the system stabilized at its new level, the state of Louisiana has seen fit to boost spending for the Recovery School District to 55% higher than state average spending. Prior to Katrina, NOLA schools were merely at parity with state averages. That’s a substantive boost. And one I’m certainly not complaining about, given the needs of these children. But certainly any claims of NOLA miracles, if they do exist, must include conversation about the “massive infusion of funding” in relative terms associated with this “relinquishment” experiment.
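The denominator effect is worth making explicit. With made-up round numbers (not actual NOLA figures), holding total spending constant while enrollment collapses:

```python
# Illustrative figures only -- the point is the denominator effect,
# not the actual NOLA totals.
spending = 500_000_000   # total current spending, held constant
pupils_pre = 65_000      # enrollment before the displacement
pupils_post = 26_000     # enrollment after

pp_pre = spending / pupils_pre    # per-pupil spending before
pp_post = spending / pupils_post  # per-pupil spending after
```

Per-pupil spending more than doubles here even though not one additional dollar was spent – which is exactly why the 2006 spike tells us nothing about resources actually flowing to schools.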

This increase (relative to surroundings) is greater than the boost received by Newark, NJ at any point during school funding litigation in NJ.

And where has some of that money gone? Well, this graph shows transportation expenditures per pupil over time.

Slide4

While a bit volatile from year to year, the NOLA experiment seems to be leading to at least DOUBLE the state average (non-rural) transportation spending per pupil – AND this is occurring in the most population-dense part of the state, where one would expect average transportation costs to be lower. To put these figures into context, taking the margin of difference in transportation spending as about $600 per pupil in the most recent year, that figure is about 6% of the state average $10k per-pupil operational expense (that is, consuming the first 6 points of the 55% elevated spending on RSD, for a non-transportation RSD margin of 49% – still a healthy boost).
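The margin arithmetic in that last sentence, stated explicitly (using the approximate figures from the paragraph above):

```python
# Approximate figures from the discussion above.
transport_margin = 600   # extra transportation spending per pupil
base_spending = 10_000   # approximate state average operational spending per pupil
rsd_boost = 0.55         # RSD spending is ~55% above the state average

transport_share = transport_margin / base_spending   # share of base consumed by transportation
non_transport_margin = rsd_boost - transport_share   # remaining elevated-spending margin
```

The extra transportation spending eats the first 6 points of the 55% boost, leaving a 49% non-transportation margin.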

But what had been going on at ground level – within the “district” across schools – when there still existed district and charter schools? Here are some snapshots of total staffing expenditures per pupil by school organized first by low income concentration and then by special education.

Slide5

Slide6

Visually, it would certainly appear that the edge was being given to charter schools in terms of resources. In which case, any policy inferences based on assertions that charter schools yielded better outcomes should certainly consider the influence of the additional resources. To clarify, the following table shows the output of a regression comparing per-pupil staffing expenditures across charter and “other” schools in New Orleans, for schools serving similar shares of low-income children and children with disabilities, and serving similar grade-range distributions.

Slide7

On average, the CRDC/Ed Facts data indicate that charter schools in New Orleans were spending $1,604 per pupil more than were “other” schools serving similar student populations. And that’s a hefty boost, given that staffing spending ranged from about $4,000 to $8,000 per pupil for most schools – $1,604 is roughly 40% of the $4,000 figure.

Again, any interpretation of differential effectiveness of charters versus other schools in New Orleans should consider the potential relevance of a 40% differential in staffing expenditure per pupil.
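For intuition on what that regression coefficient means: with a single charter dummy and no other controls, the OLS coefficient on the dummy reduces to the simple difference in group means (the controls in the actual model adjust that difference for student population and grade range). A toy sketch with hypothetical per-pupil figures, chosen only to mimic the kind of gap reported above:

```python
import statistics

# Hypothetical per-pupil staffing expenditures -- NOT the actual
# New Orleans school-site data.
charter_pp = [9200, 9800, 10400]  # charter schools
other_pp = [7800, 8000, 8400]     # "other" (non-charter) schools

# With a lone charter dummy, the OLS coefficient is the mean difference:
charter_premium = statistics.mean(charter_pp) - statistics.mean(other_pp)
```

The coefficient in the actual table plays the same role: the average dollar gap between otherwise comparable charter and non-charter schools.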

Setting aside the HUGE ACCOUNTABILITY concerns associated with this model (which no one should ever set aside), and significant concerns over the legal rights of children and taxpayers (again, which should never be set aside), there are some potential lessons for pundits and policymakers here. If there is even a success story to be told in NOLA (of which I’m unconvinced), that success isn’t free, and it isn’t cheap.

Many pundits over the years have ridiculed the Kansas City desegregation plan of the 1990s as the most inefficient experiment in social engineering of all time. Now, there’s much misguided bluster – urban legend – in those characterizations, as I’ve written in the past. Perhaps one of my greatest fears about the NOLA experiment is that it will provide more fodder for the assertion that money doesn’t matter. Heck, they’ve thrown a lot of money at this so far. They’re just not talking about it. It’s being spent on exorbitant transportation costs, among other things.

Strangely, for now, all I hear is silence from the anti-spending, efficiency warriors of the ed policy world when it comes to NOLA.  Does that mean that money really matters (accepting the NOLA miracle characterization), or, alternatively, is NOLA proving (by not substantively improving outcomes with a 55% boost in funding) that the inefficiencies of a 100% charter/choice/unified enrollment system are equal to or greater than those of the urban school district of the past?

Data notes:

The original data sources for the above analysis are:

  1. enrollment data: http://nces.ed.gov/ccd/pubschuniv.asp
  2. fiscal data (PPCSTOT – or current operating expenditure per pupil) http://www.census.gov/govs/school/
  3. CRDC/Ed Facts School Site staffing expenditure data: http://www2.ed.gov/programs/titleiparta/equitable/laepd.xlsx

For current operating expenditure comparisons, the State of Louisiana reports different per pupil spending figures, combining RSD operated schools and Type 5 charters [whereas NCES reports RSD operated schools separately, with students shifting over time from RSD operated schools to Type 5 charters, both under RSD governance]. Both, as far as I can tell from the relevant notations, exclude short term emergency funds, and both are current spending (excluding capital investment) figures. State data are reported below. Notably, the margin of difference is smaller than in the operating expenditure figures above. But, interestingly, as more students shift to Type 5 charters, the margin of spending difference increases.

This is a trend worth watching over time. This margin, which is still substantial (and growing), might be consumed almost entirely by increased transportation expense, but may also continue to rise (or not?).

Note that these differences are unrelated to the school level CRDC/Ed Facts analyses above, which include independently reported staffing expenditure data on individual school sites, where charter schools have sufficient additional resources to substantially outspend (+40%) non-charters. These large differentials (huge for some schools) are likely a function of privately contributed resources which may not show up in either the State or NCES data.

Finally, there’s rarely need to speculate or make anecdotal claims about data being “wrong” or “different,” or whatever, when one can simply look up the relevant data and make the relevant comparison. Tables w/relevant URL citations can even be conveyed via twitter!

District(s) | 2011-12 | 2012-13 | 2013-14
Other Parish Schools | $10,543 | $10,368 | $10,611
Orleans Parish School Board | $14,273 | $14,601 | $13,527
Recovery School District (Operated & Type 5 Charters)* | $11,420 | $11,665 | $11,998
RSD Margin over “Other” | 8.3% | 12.5% | 13.1%
https://www.louisianabelieves.com/resources/library/fiscal-data
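The margin row in the table above is straightforward to reproduce from the posted figures – a quick sketch of the arithmetic:

```python
# Recompute the "RSD Margin over Other" row from the Louisiana per pupil
# spending table above (dollar figures as posted; margins rounded to 0.1%).
other = {"2011-12": 10543, "2012-13": 10368, "2013-14": 10611}
rsd   = {"2011-12": 11420, "2012-13": 11665, "2013-14": 11998}

margins = {yr: round((rsd[yr] / other[yr] - 1) * 100, 1) for yr in other}
print(margins)  # {'2011-12': 8.3, '2012-13': 12.5, '2013-14': 13.1}
```

The growing margin (8.3% to 13.1%) is the trend worth watching noted below.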

NYSED Recommends “Teacher Effectiveness Gnomes” to Fix Persistent Inequities

I guess I knew, when ED released their “teacher equity” regs in late fall of 2014, that we were in for a whole lot of stupid.

You see, there was some good in those regulations and the data released to accompany them. There was discussion of teacher salary and qualifications parity, and some financial measures were provided that would allow states to do cursory analyses, based on 2011-12 data, of the extent to which objectionable inequities existed in either cumulative salary expenditures per child across schools or average salary expenditures. The idea was that states would set out plans to evaluate these disparities, using the data provided and their own data sources. And then states would provide plans of action for mitigating the disparities. This is where I knew it could get silly.

But state officials in New York have far surpassed my wildest expectations. Here’s their first cut at this issue: http://www.regents.nysed.gov/meetings/2015Meetings/April/415p12hed2.pdf

In this memo, NYSED officials identify the following inequities:

According to the USED published equity profile, the average teacher in a highest poverty quartile school in New York earns $66,138 a year, compared to $87,161 for the average teacher in the lowest poverty quartile schools. (These numbers are adjusted to account for regional differences in the cost of living.) Information in the New York profile also suggests that students in high poverty schools are nearly three times more likely to have a first-year teacher, 22 times more likely to have an unlicensed teacher, and 11 times more likely to have a teacher who is not highly qualified.

& you know what? They’re right. Here’s the full continuum of average salaries and low income concentrations across NY state schools, first with, and then without NYC included.

Slide1

Slide2

As I’ve pointed out over, and over and over again on this blog, NY State maintains one of the least equitable educational systems in the nation. See, for example:

  1. On how New York State crafted a low-ball estimate of what districts needed to achieve adequate outcomes and then still completely failed to fund it.
  2. On how New York State maintains one of the least equitable state school finance systems in the nation.
  3. On how New York State’s systemic, persistent underfunding of high need districts has led to significant increases of numbers of children attending school with excessively large class sizes.
  4. On how New York State officials crafted a completely bogus, racially and economically disparate school classification scheme in order to justify intervening in the very schools they have most deprived over time.

Ah, but I’m just blowin’ hot air again, about that funding stuff, and the fact that NY State continues to severely underfund the highest need districts in the state, like this:

Slide2

But I digress. Who needs all of this silly talk (and actual data) about funding disparities anyway? And what do funding disparities possibly have to do with teacher equity problems, or salary disparities like those identified above by NYSED using USED data?

Well: https://www.youtube.com/watch?feature=player_detailpage&v=wfgnNI9-ImY&list=PLuzsMod17tiHrlaBvDcm2us_k68uxZcSy#t=801

Of course, NYSED officials know better – much better – what’s behind those ugly salary and, ultimately, teacher qualification disparities plaguing NY State schools. The ED regs require that states first identify problems/disparities. Then, ROOT CAUSES, thus leading to logical policy interventions – Strategery at its finest!

PROBLEM –> ROOT CAUSE –> STRATEGERY

So what then are the root causes of the disparities identified above by NYSED?

Through the collaborative sharing of lessons learned through the STLE program and research, the Department has determined that the following five common talent management struggles contribute significantly to equitable access:

  1. Preparation
  2. Hiring and recruitment
  3. Professional development and growth
  4. Selective retention
  5. Extending the reach of top talent to the most high-need students

Although the Department believes the challenges described here are reflective of broad “root causes” for the statewide equity gaps, it is still important for each LEA to examine their unique equity issues and potential root causes. In talking with superintendents, principals, and teachers involved in STLE, the Department was able to see that equity gaps that appear similar across contexts may in fact stem from different root causes in various LEAs. For example, one district struggling with inequitable access for low-performing students may find that inequities stem from a pool of low quality applicants, whereas a second district may find that they have a large pool of high quality applicants but tend to lose top talent early in their careers to neighboring districts who offer more leadership opportunities for teachers.

Ah… okay… I thought equitable funding to actually pay equitable salaries might have had something to do with it. How silly of me! It’s about bad teacher preparation programs which somehow produce bad teachers who ask for lower salaries in high poverty districts? And high poverty districts selectively retaining only their bad teachers, intentionally, by just not paying well? It’s a conspiracy that can be fixed by clever talent development strategies. No money needed, except some chump change in competitive grants.

And thus, if we know that bad teacher prep and crappy local management of talent is the root cause, the solutions are really easy?

The Department believes the overall quality of teaching and learning can be raised through the implementation of comprehensive systems of talent management, including sound implementation of the teacher and principal evaluation system.

Key Component 1 (Educator Preparation): The Department will continue to support and monitor improvements to access and entry into the profession, such as the redesign of teacher and principal preparation programs through performance-based assessments, clinically grounded instruction, and innovative new educator certification pathways.

Key Component 2 (Educator Evaluation): With the foundation laid by Education Law §3012-c, the Department will continue to provide support and monitoring to LEAs as they implement enhanced teacher and principal evaluation systems that meaningfully differentiate the effectiveness of educators and inform employment decisions.

Key Component 3 (The TLE Continuum): The Department will provide resources and support to LEAs utilizing evaluation results in the design and implementation of robust career ladder…

All that’s missing from this brilliant plan are the teacher effectiveness gnomes.

So yeah… it all comes down to the state’s brilliant model for rating, ranking and dumping “bad” teachers to open the door to all the really good teachers who are currently waiting in line to work in schools that …

serve high concentrations of low income and minority students,

Slide6

have larger class sizes,

Slide5

and still (and moving forward) have the largest state aid shortfalls!

Slide4

What’s really great about all of this is that these teachers – all chomping at the bit to work in these schools for low pay – can have it all! Funding gaps and greater needs. Note that the majority of “ineffective” teachers (as so declared by growth rating alone) are clustered in schools with high low income concentrations and big aid gaps. Interestingly, even those in districts with fewer low income children are also in districts with big aid gaps.

CRDC Ed Facts Data – NY State 2011-12

To summarize – the framework laid out by ED, was:

PROBLEM –> ROOT CAUSE –> STRATEGERY

The brilliant application of that framework by NYSED was:

Problem=Huge salary & teacher qualification disparities by school poverty

Root Cause=Bad teachers, Teacher Prep & Administration

Strategery=Talent Development (fire bad teachers)

Are you kidding me? Really? In my wildest dreams…

To clarify – if it wasn’t already sufficiently clear – I do not at all accept that the patterns above represent the actual distribution of teacher effectiveness, but rather, that the crappy measures adopted by NYSED for rating teacher effect on growth systematically disadvantage those teachers serving needier students, in larger classes and schools with more scarce resources.

Yeah… I get it… NYSED and the Regents don’t pull the budget strings. The Gov has done that damage. But that doesn’t make the logic of the NYSED brief any less ridiculous!

Head… desk…

Angry Andy’s not so generous state aid deal: A look at the 2015-16 Aid Runs in NY

Not much time to write about this, but I finally got my hands on the state aid runs for NY state school districts, which were, in an unprecedented and utterly obnoxious move by the Gov, held hostage throughout the budget “negotiations” (if we can call it that).

Quick review – NY operates a state aid calculation formula built on the premise that each district, given its geographic location (labor costs) and pupil needs, requires a certain target level of funding to achieve desired outcomes.

Target = Base x Pupil Needs x Regional Cost

The state then determines what share of that target shall be paid by local districts, the rest to be allocated in state aid.

State Aid = Target – Local Contribution
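The two-step logic above can be sketched in a few lines. All the inputs below are hypothetical round numbers – the actual formula involves many more adjustments (enrollment counts, phase-ins, minimum aid ratios, etc.) – but the structure is as described:

```python
# Sketch of the foundation-aid logic described above. Inputs are invented
# round numbers, not actual NY formula parameters.

def target_funding(base, need_index, regional_cost_index):
    """Target = Base x Pupil Needs x Regional Cost."""
    return base * need_index * regional_cost_index

def state_aid(target, local_contribution):
    """State Aid = Target - Local Contribution (floored at zero)."""
    return max(0.0, target - local_contribution)

# Example: a higher-need district in a higher-cost labor market
target = target_funding(base=6500, need_index=1.35, regional_cost_index=1.2)
aid = state_aid(target, local_contribution=4000)
print(round(target), round(aid))  # 10530 6530
```

Note how the two levers discussed below operate: lowering `base` shrinks the target (and thus the apparent gap) across the board, and raising the assumed `local_contribution` shrinks the state aid obligation without any new money.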

A few really important points are in order before I move forward with the updated estimates. First, those targets are supposed to be aligned with costs of achieving desired outcomes. Higher outcomes cost more to achieve, with greater marginal cost effects where student needs are higher. As I’ve explained previously, the state has continued to increase those outcome targets, but has continued to lower the funding target. This is a formula for failure!

And, in 2015-16, they’ve done it again. The “base cost” figure which drives the formula has again been decreased, thus leveling down target funding across the board, all else equal.

Slide1

So, with this in mind, any/all funding gaps I discuss below should be considered only funding gaps with respect to what the state would like to pretend is its full funding obligation – in reality, a low-balled, manipulated figure that substantially downplays the true obligation with respect to current outcome goals. The actual full funding obligation, given increased standards over time, is likely much higher… much higher. There’s no excuse for lowering the target – and continuing, year after year, to push the date for hitting that target further out. None.

However, from the state perspective, this manipulative game of lowering the outcome target can make it appear that they are getting closer to hitting it. Separately, as I explained on another recent post, one can make the state aid shortfalls look less bad if one requires a higher local contribution, another game used in previous budget years.

Let’s start with the positive. Yes, the adopted state budget does, on average, increase per pupil state aid and does so in higher amounts in districts serving needier pupils:

Slide6

Not bad. We’ve got districts getting what would appear to be hundreds of dollars per pupil in increased state aid. But, remember, this is only a small dent in the funding gaps. Let’s first look at the funding gaps for 2015-16 for those districts Angry Andy called miserable failures who should be subjected to the death penalty.

Slide2

Here, we’ve got districts that, in the best case, are still being shorted around $1,500 per pupil in state aid. Every one of Angry Andy’s failing districts will continue to be substantially underfunded – against the state’s own low-ball estimates – for yet another year. All in the name of Angry Andy’s Awesome Austerity Experiment. Regarding a similar “experiment” in Kansas, a 3-judge panel noted that the state “is experimenting with our children which have no recourse from a failure of the experiment.”

And what about the small city school districts, whose case was recently heard in Albany? Well, first off, some of them are among the Angry Andy failures.

Slide3

And generally, their state aid gaps remain large – really large. And again, these are gaps with respect to low-balled targets – and after jacking up the supposed local responsibility to fund those targets.

So, who’s to blame here? Well, obviously, it’s not the funding gaps – it’s those lazy teachers and the complicit administrators who give those teachers good ratings even when they can’t produce test score gains.

I close with an update of the 50 districts with the largest funding gaps going into 2015-16. And here they are:

Slide4

Slide5

  For previous reports/lists, see:

  1. Statewide Policy Brief with NYC Supplement: BBaker.NYPolicyBrief_NYC
  2. 50 Biggest Funding Gaps Supplement: 50 Biggest Aid Gaps 2013-14_15_FINAL

On School Finance Equity & Money Matters: A Primer

Conceptions of Equity, Equal Opportunity and Adequacy

Reforms across the nation to state school finance systems have been focused on simultaneously achieving equal educational opportunity and educational adequacy. While achieving and maintaining educational adequacy requires a school finance system that consistently and equitably meets a certain level of educational outcomes, it is important to maintain equal education opportunity in those cases where the funding provided falls below adequacy thresholds. That is, whatever the outcome currently attained across the system, that outcome should be equally attainable regardless of where a child resides or attends school and regardless of his or her background.

Conceptions of school finance equity and adequacy have evolved over the years. Presently, the central assumption is that state finance systems should be designed to provide children, regardless of where they live and attend school, with equal opportunity to achieve some constitutionally adequate level of outcomes.[i] Much is embedded in this statement and it is helpful to unpack it, one layer at a time.

The main concerns of advocates, policymakers, academics and state courts from the 1960s through the 1980s were to a) reduce the overall variation in per-pupil spending across local public school districts; and b) disrupt the extent to which that spending variation was related to differences in taxable property wealth across districts. That is, the goal was to achieve more equal dollar inputs – or nominal spending equity – coupled with fiscal neutrality – or reducing the correlation between local school resources and local property wealth. While modern goals of providing equal opportunity and achieving educational adequacy are more complex and loftier than mere spending equity or fiscal neutrality, achieving the more basic goals remains relevant and still elusive in many states.
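Fiscal neutrality, as described above, is commonly gauged by how strongly local spending tracks local property wealth. A minimal sketch, on invented district figures, of the kind of correlation an equity analyst would compute (a coefficient near zero indicates fiscal neutrality; near 1.0 indicates wealth-driven spending):

```python
# Fiscal-neutrality check: correlation of per pupil spending with local
# property wealth. District figures below are invented for illustration.
wealth   = [250, 400, 600, 900, 1400]            # property wealth per pupil ($000s)
spending = [9800, 10400, 11500, 12800, 14900]    # per pupil spending ($)

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(wealth, spending)
print(f"wealth-spending correlation: {r:.2f}")  # near 1.0 -> poor fiscal neutrality
```

In practice such analyses use wealth-neutrality regressions and elasticities rather than a raw correlation, but the underlying question is the same one posed by the 1960s–1980s reformers.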

An alternative to nominal spending equity is to look at the real resources provided across children and school districts: the programs and services, staffing, materials, supplies and equipment, and educational facilities provided. (Still, the emphasis is on equal provision of these inputs.)[ii] Providing real resource equity may, in fact, require that per-pupil spending not be perfectly equal if, for example, resources such as similarly qualified teachers come at a higher price (competitive wage) in one region than in another. Real resource parity is more meaningful than mere dollar equity. Further, if one knows how the prices of real resources differ, one can better compare the value of the school dollar from one location to the next.

Modern conceptions of equal educational opportunity and educational adequacy shift emphasis away from schooling inputs and onto schooling outcomes and more specifically equal opportunity to achieve some level of educational outcomes. References to broad outcome standards in the school finance context often emanate from the seven standards[iii] articulated in Rose v. Council for Better Education,[iv] a school funding adequacy case in 1989 in Kentucky argued by scholars to be the turning point from equity toward adequacy in school finance legal theory.[v] There are two separable but often integrated goals here – equal opportunity and educational adequacy. The first goal is achieved where all students are provided the real resources to have equal opportunities to achieve some common level of educational outcomes. Because children come to school with varied backgrounds and needs, striving for common goals requires moving beyond mere equitable provision of real resources. For example, children with disabilities and children with limited English language proficiency may require specialized resources (personnel), programs, materials, supplies, and equipment. Schools and districts serving larger shares of these children may require substantively more funding to provide these resources. Further, where poverty is highly concentrated, smaller class sizes and other resource-intensive interventions may be required to strive for those outcomes commonly achieved by the state’s average child.

Meanwhile, conceptions of educational adequacy require that policymakers determine the desired level of outcome to be achieved. Essentially, adequacy conceptions attach a “level” of outcome expectation to the equal educational opportunity concept. Broad adequacy goals are often framed by judicial interpretation of state constitutions. It may well be that the outcomes achieved by the average child are deemed to be sufficient. But it may also be the case that the preferences of policymakers or a specific legal mandate are somewhat higher (or lower) than the outcomes achieved by the average child. The current buzz phrase is that schools should ensure that children are “college ready.” [vi]

One final distinction, pertaining to both equal educational opportunity and adequacy goals, is between striving to achieve equal or adequate outcomes and providing the resources that yield equal opportunity to achieve those outcomes for children regardless of their backgrounds or where they live. Achieving equal outcomes is statistically unlikely at best, and of suspect policy relevance, given that perfect equality of outcomes requires leveling down (actual outcomes) as much as leveling up. The goal of school finance policy in particular is to provide resources that offset pre-existing inequalities, so that no child has a greater chance than any other of achieving the desired outcome levels.

[i] Baker, B. D., Green, P. C. (2009) Conceptions, Measurement and Application of Educational Adequacy Standards. In D.N. Plank (Ed.) AERA Handbook on Education Policy. New York: Routledge.

Baker, B., & Green, P. (2014). Conceptions of equity and adequacy in school finance. Handbook of research in education finance and policy, 203-221.

Baker, B., & Green, P. (2008). Conceptions of equity and adequacy in school finance. Handbook of research in education finance and policy, 203-221.

[ii] While often treated as a newer approach to equity analysis than measuring pure fiscal inputs, equity evaluations of real resources pre-date modern school finance equity analyses, having been used, for example, to evaluate the uniformity of segregated black and white schools operating in the pre-Brown, “separate but equal” era.

Baker, B. D., & Green, P. C. (2009). Does increased state involvement in public schooling necessarily increase equality of educational opportunity? The Rising State: How State Power is Transforming Our Nation’s Schools, 133.

[iii]              As per the court’s declaration: “an efficient system of education must have as its goal to provide each and every child with at least the seven following capacities: (i) sufficient oral and written communication skills to enable students to function in a complex and rapidly changing civilization; (ii) sufficient knowledge of economic, social, and political systems to enable the student to make informed choices; (iii) sufficient understanding of governmental processes to enable the student to understand the issues that affect his or her community, state, and nation; (iv) sufficient self-knowledge and knowledge of his or her mental and physical wellness; (v) sufficient grounding in the arts to enable each student to appreciate his or her cultural and historical heritage; (vi) sufficient training or preparation for advanced training in either academic or vocational fields so as to enable each child to choose and pursue life work intelligently; and (vii) sufficient levels of academic or vocational skills to enable public school students to compete favorably with their counterparts in surrounding states, in academics or in the job market.

Rose v. Council for Better Educ., Inc., 790 S.W.2d 186, 212(Ky. 1989).

http://law-apache.uky.edu/wordpress/wp-content/uploads/2012/06/Thro-II.pdf

[iv] Rose v. Council for Better Educ., Inc., 790 S.W.2d 186 (Ky. 1989).

[v] Clune, W. H. (1994). The shift from equity to adequacy in school finance. Educational Policy, 8(4), 376-394.

[vi] http://www.parcconline.org/pennsylvania

School Finance Reforms & Student Outcomes

There exists an increasing body of evidence that substantive and sustained state school finance reforms matter for improving both the level and distribution of short-term and long-run student outcomes. A few studies have attempted to tackle school finance reforms broadly applying multi-state analyses over time. Card and Payne (2002) found “evidence that equalization of spending levels leads to a narrowing of test score outcomes across family background groups.”[i] (p. 49) Most recently, Jackson, Johnson & Persico (2015) evaluated long-term outcomes of children exposed to court-ordered school finance reforms, finding that “a 10 percent increase in per-pupil spending each year for all twelve years of public school leads to 0.27 more completed years of education, 7.25 percent higher wages, and a 3.67 percentage-point reduction in the annual incidence of adult poverty; effects are much more pronounced for children from low-income families.”(p. 1) [ii]

Numerous other researchers have explored the effects of specific state school finance reforms over time, applying a variety of statistical methods to evaluate how changes in the level and targeting of funding affect changes in outcomes achieved by students directly affected by those funding changes. Figlio (2004) argues that the influence of state school finance reforms on student outcomes is perhaps better measured within states over time, because national studies of the type attempted by Card and Payne confront problems of a) the enormous diversity in the nature of state aid reform plans, and b) the paucity of national level student performance data.[iii]

Several such studies provide compelling evidence of the potential positive effects of school finance reforms. Studies of Michigan school finance reforms in the 1990s have shown positive effects on student performance in both the previously lowest spending districts, [iv] and previously lower performing districts. [v] Similarly, a study of Kansas school finance reforms in the 1990s, which also involved primarily a leveling up of low-spending districts, found that a 20 percent increase in spending was associated with a 5 percent increase in the likelihood of students going on to postsecondary education.[vi]

Three studies of Massachusetts school finance reforms from the 1990s find similar results. The first, by Thomas Downes and colleagues, found that the combination of funding and accountability reforms “has been successful in raising the achievement of students in the previously low-spending districts.”(p. 5)[vii] The second found that “increases in per-pupil spending led to significant increases in math, reading, science, and social studies test scores for 4th- and 8th-grade students.”[viii] The most recent of the three, published in 2014 in the Journal of Education Finance, found that “changes in the state education aid following the education reform resulted in significantly higher student performance.”(p. 297)[ix] Such findings have been replicated in other states, including Vermont.[x]

Indeed, the role of money in improving student outcomes is often contested. Baker (2012) explains the evolution of assertions regarding the unimportance of money for improving student outcomes, pointing out that these assertions emanate in part from misrepresentations of the work of Coleman and colleagues in the 1960s, which found that school factors seemed less associated with student outcome differences than did family factors. This was not to suggest, however, that school factors were entirely unimportant, and more recent re-analyses of the Coleman data using more advanced statistical techniques than available at the time clarify the relevance of schooling resources.[xi]

Hanushek (1986) ushered in the modern-era “money doesn’t matter” argument, in a study in which he tallied studies reporting positive and negative correlations between spending measures and student outcome measures, proclaiming as his major finding:

“There appears to be no strong or systematic relationship between school expenditures and student performance.” (p. 1162)[xii]

Baker (2012) summarized re-analyses of the studies tallied by Hanushek, wherein authors applied quality standards to determine study inclusion, finding that more of the higher quality studies yielded positive findings with respect to the relationship between schooling resources and student outcomes.[xiii] While Hanushek’s above characterization continues to permeate policy discourse over school funding, often used as evidence that “money doesn’t matter,” it is critically important to understand that this statement is merely one of uncertainty about the direct correlation between spending measures and outcome measures, based on studies prior to 1986. Neither this statement, nor the crude tally behind it ever provided any basis for assuming with certainty that money doesn’t matter.

A separate body of literature challenges the assertion of positive influence of state school finance reforms in general and court ordered reforms in particular. Baker and Welner (2011) explain that much of this literature relies on anecdotal characterizations of lagging student outcome growth following court ordered infusions of new funding. Hanushek and Lindseth (2009) provide one example of this anecdote-driven approach in a book chapter which seeks to prove that court-ordered school funding reforms in New Jersey, Wyoming, Kentucky, and Massachusetts resulted in few or no measurable improvements. However, these conclusions are based on little more than a series of descriptive graphs of student achievement on the National Assessment of Educational Progress in 1992 and 2007 and an undocumented assertion that, during that period, each of the four states infused substantial additional funds into public education in response to judicial orders. That is, the authors merely assert that these states experienced large infusions of funding, focused on low income and minority students, within the time period identified. They necessarily assume that, in all other states which serve as a comparison basis, similar changes did not occur. Yet they validate neither assertion.

Baker and Welner (2011) explain that Hanushek and Lindseth failed to measure whether substantive changes had occurred to the level or distribution of school funding, as well as when and for how long. In New Jersey, for example, the infusion of funding occurred from 1998 to 2003 (or 2005), thus Hanushek and Lindseth’s window includes 6 years on the front end where little change occurred. Kentucky reforms had largely faded by the mid to late 1990s, yet Hanushek and Lindseth measure post-reform effects in 2007. Further, in New Jersey, funding was infused into approximately 30 specific districts, but Hanushek and Lindseth explore overall changes to outcomes among low-income children and minorities using NAEP data, where some of these children attended the districts receiving additional support but many did not.[xiv] Finally, the authors concede that Massachusetts did, in fact, experience substantive achievement gains, but attribute those gains to changes in accountability policies rather than funding.

In an equally problematic analysis, Neymotin (2010) set out to show that court ordered infusions of funding in Kansas following Montoy v. Kansas led to no substantive improvements in student outcomes. However, Neymotin evaluated changes in school funding from 1997 to 2006, while the first additional funding infused following the January 2005 Supreme Court decision arrived in the 2005-06 school year – the end point of Neymotin’s outcome data.[xv] Finally, Greene and Trivitt (2008) present a study in which they claim to show that court ordered school finance reforms led to no substantive improvements in student outcomes. However, the authors test only whether the presence of a court order is associated with changes in outcomes, and never measure whether substantive school finance reforms followed the court order – yet they still conclude that court ordered funding increases had no effect.[xvi]

To summarize, there exist no methodologically competent analyses yielding convincing evidence that significant and sustained funding increases provide no educational benefits, and relatively few that fail to show decisively positive effects.[xvii] On balance, it is safe to say that a sizeable and growing body of rigorous empirical literature validates that state school finance reforms can have substantive, positive effects on student outcomes, including reductions in outcome disparities or increases in overall outcome levels.[xviii]

Schooling Resources & Student Outcomes

The premise that money matters for improving school quality is grounded in the assumption that having more money provides schools and districts the opportunity to improve the qualities and quantities of real resources. The primary resources involved in the production of schooling outcomes are human resources – or quantities and qualities of teachers, administrators, support and other staff in schools. Quantities of school staff are reflected in pupil to teacher ratios and average class sizes. Reduction of class sizes or reductions of overall pupil to staff ratios require additional staff, thus additional money, assuming the wages and benefits for additional staff remain constant. Qualities of school staff depend in part on the compensation available to recruit and retain them – specifically salaries and benefits, in addition to working conditions. Notably, working conditions may be reflected in part through measures of workload, like average class sizes, as well as the composition of the student population.

A substantial body of literature has accumulated to validate the conclusion that both teachers’ overall wages and relative wages affect the quality of those who choose to enter the teaching profession, and whether they stay once they get in. For example, Murnane and Olsen (1989) found that salaries affect the decision to enter teaching and the duration of the teaching career,[xix] while Figlio (1997, 2002) and Ferguson (1991) concluded that higher salaries are associated with more qualified teachers.[xx] Loeb and Page (2000) tackled the specific issue of relative pay noted above. They showed that:

“Once we adjust for labor market factors, we estimate that raising teacher wages by 10 percent reduces high school dropout rates by 3 percent to 4 percent. Our findings suggest that previous studies have failed to produce robust estimates because they lack adequate controls for non-wage aspects of teaching and market differences in alternative occupational opportunities.”[xxi]

In short, while salaries are not the only factor involved, they do affect the quality of the teaching workforce, which in turn affects student outcomes.

Research on the flip side of this issue – evaluating spending constraints or reductions – reveals the potential harm to teaching quality that flows from leveling down or reducing spending. For example, David Figlio and Kim Rueben (2001) note that, “Using data from the National Center for Education Statistics we find that tax limits systematically reduce the average quality of education majors, as well as new public school teachers in states that have passed these limits.”[xxii]

Salaries also play a potentially important role in improving the equity of student outcomes. While several studies show that higher salaries relative to labor market norms can draw higher quality candidates into teaching, the evidence also indicates that relative teacher salaries across schools and districts may influence the distribution of teaching quality. For example, Ondrich, Pas and Yinger (2008) “find that teachers in districts with higher salaries relative to non-teaching salaries in the same county are less likely to leave teaching and that a teacher is less likely to change districts when he or she teaches in a district near the top of the teacher salary distribution in that county.”[xxiii]

Others have argued that the dominant structure of teacher compensation – which ties salary growth to years of experience and degrees obtained, despite weak correlations between those measures and student achievement gains – creates inefficiencies that negate the overall relationship between school spending and school quality.[xxiv] This argument is built on the assertion that existing funds could instead be used to compensate teachers according to (measures of) their effectiveness, while dismissing high-cost “ineffective” teachers and replacing them with better ones, thus achieving better outcomes with the same or less money.[xxv]

This argument depends on three assumptions. First, that adopting a pay-for-performance rather than step-and-lane salary model would dramatically improve performance at the same or lower expense. Second, that shedding the “bottom 5%” of teachers according to statistical estimates of their “effectiveness” can lead to dramatic improvements at equal or lower expense. Third, both the incentive-pay argument and the deselection argument depend on sufficiently accurate and precise measures of teaching effectiveness, across settings and children.

Existing studies of pay-for-performance compensation models fail to provide empirical support for this argument – either that these alternatives can substantially boost outcomes, or that they can do so at equal or lower total salary expense.[xxvi] Simulations purporting to validate the long-run benefits of deselecting “bad” teachers depend on the pool of replacements lining up to take those jobs being substantively better than those who were let go (average replacing “bad”). Simulations promoting the benefits of “bad teacher” deselection assume this to be true, without empirical basis and without consideration of the potential labor market consequences of the deselection policy itself.[xxvii] Finally, existing measures of teacher “effectiveness” fall well short of these demands.[xxviii]

Most importantly, arguments about the structure of teacher compensation miss the bigger point – the average level of compensation matters with respect to the average quality of the teacher labor force. To whatever degree teacher pay matters in attracting good people into the profession and keeping them around, it’s less about how they are paid than how much. Furthermore, the average salaries of the teaching profession, with respect to other labor market opportunities, can substantively affect the quality of entrants to the teaching profession, applicants to preparation programs, and student outcomes. Diminishing resources for schools can constrain salaries and reduce the quality of the labor supply. Further, salary differentials between schools and districts might help to recruit or retain teachers in high need settings. In other words, resources used for teacher quality matter.

Ample research indicates that children in smaller classes achieve better outcomes, both academic and otherwise, and that class size reduction can be an effective strategy for closing racial or socioeconomic achievement gaps.[xxix] While it’s certainly plausible that other uses of the same money might be equally or even more effective, there is little evidence to support this. For example, while we are quite confident that higher teacher salaries can lead to increases in the quality of applicants to the teaching profession and increases in student outcomes, we do not know whether the same money spent on salary increases would achieve better or worse outcomes than if it were spent on class size reduction. Some have raised concerns that large-scale class size reductions can lead to unintended labor market consequences that offset some of the gains attributable to class size reduction (such as the inability to recruit enough fully qualified teachers). For example, studies of California’s statewide class size reduction initiative suggest that as districts across the socioeconomic spectrum reduced class sizes, fewer high quality teachers were available in high poverty settings.[xxx]

Many have argued over time for more precise cost-benefit analysis of the tradeoffs between applying funding to class size reduction versus increased compensation.[xxxi] Still, the preponderance of existing evidence suggests that the additional resources expended on class size reductions do result in positive effects. Both reductions to class sizes and improvements to competitive wages can yield improved outcomes, but the efficiency gains of choosing one strategy over the other are unclear, and local public school districts rarely have complete flexibility to make such tradeoffs.[xxxii] Class size reduction may be constrained by available classrooms. Smaller class sizes and reduced total student loads are also a relevant working condition, simultaneously influencing teacher recruitment and retention.[xxxiii] That is, providing smaller classes may partly offset the need for higher wages in recruiting or retaining teachers. High poverty schools require a both/and rather than either/or strategy when it comes to smaller classes and competitive wages.
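The dollars involved in that tradeoff can be made concrete with a quick sketch. The figures below are entirely hypothetical (my illustration, not drawn from any study cited here), but they show why the choice is not obvious: cutting average class size from 25 to 20 requires 25% more teachers, the same outlay as a 25% across-the-board raise for the existing staff.

```python
# Illustrative class-size-vs-salary tradeoff. All numbers hypothetical.
pupils = 1_000
salary = 60_000          # assumed average teacher salary + benefits

teachers_at_25 = pupils / 25   # 40 teachers at class size 25
teachers_at_20 = pupils / 20   # 50 teachers at class size 20

# Cost of the class size reduction: 10 additional teachers
extra_cost = (teachers_at_20 - teachers_at_25) * salary

# The same spending could instead fund an across-the-board raise of:
equivalent_raise_pct = 100 * extra_cost / (teachers_at_25 * salary)
```

Under these assumptions the same $600,000 buys either the smaller classes or a 25% raise – which is precisely why the literature calls for cost-benefit analysis rather than treating either strategy as self-evidently superior.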

As discussed above, achieving equal educational opportunity requires leveraging additional real resources – lower class sizes and more intensive support services – in high need settings. Merely achieving equal qualities of real resources, including equally qualified teachers, likely requires higher competitive wages, not merely equal pay in a given labor market. As such, higher need settings may require substantially greater financial inputs than lower need settings. Lacking sufficient financial inputs to do both, districts must choose one or the other. In some cases, higher need districts may lack sufficient resources to do either.

Notes

[i] Card, D., and Payne, A. A. (2002). School Finance Reform, the Distribution of School Spending, and the Distribution of Student Test Scores. Journal of Public Economics, 83(1), 49-82.

[ii] Jackson, C. K., Johnson, R., & Persico, C. (2014). The Effect of School Finance Reforms on the Distribution of Spending, Academic Achievement, and Adult Outcomes (No. w20118). National Bureau of Economic Research.

Jackson, C. K., Johnson, R., & Persico, C. (2015). The Effects of School Spending on Educational and Economic Outcomes: Evidence from School Finance Reforms (No. w 20847) National Bureau of Economic Research.

[iii] Figlio, D. N. (2004) Funding and Accountability: Some Conceptual and Technical Issues in State Aid Reform. In Yinger, J. (Ed.) p. 87-111 Helping Children Left Behind: State Aid and the Pursuit of Educational Equity. MIT Press.

[iv] Roy, J. (2011). Impact of school finance reform on resource equalization and academic performance: Evidence from Michigan. Education Finance and Policy, 6(2), 137-167.

Roy (2011) published an analysis of the effects of Michigan’s 1990s school finance reforms which led to a significant leveling up for previously low-spending districts. Roy, whose analyses measure both whether the policy resulted in changes in funding and who was affected, found that “Proposal A was quite successful in reducing interdistrict spending disparities. There was also a significant positive effect on student performance in the lowest-spending districts as measured in state tests.” (p. 137)

[v] Papke, L. (2005). The effects of spending on test pass rates: evidence from Michigan. Journal of Public Economics, 89(5-6). 821-839.

Hyman, J. (2013). Does Money Matter in the Long Run? Effects of School Spending on Educational Attainment. http://www-personal.umich.edu/~jmhyman/Hyman_JMP.pdf.

Papke (2005), also evaluating Michigan school finance reforms from the 1990s, found that “increases in spending have nontrivial, statistically significant effects on math test pass rates, and the effects are largest for schools with initially poor performance.” (p. 821)

Most recently, Hyman (2013) also found positive effects of Michigan school finance reforms in the 1990s, but raised some concerns regarding the distribution of those effects. Hyman found that much of the increase was targeted to schools serving fewer low income children. But the study did find that students exposed to an additional “12% more spending per year during grades four through seven experienced a 3.9 percentage point increase in the probability of enrolling in college, and a 2.5 percentage point increase in the probability of earning a degree.” (p. 1)

[vi] Deke, J. (2003). A study of the impact of public school spending on postsecondary educational attainment using statewide school district refinancing in Kansas, Economics of Education Review, 22(3), 275-284. (p. 275)

[vii] Downes, T. A., Zabel, J., and Ansel, D. (2009). Incomplete Grade: Massachusetts Education Reform at 15. Boston, MA. MassINC.

[viii] Guryan, J. (2001). Does Money Matter? Estimates from Education Finance Reform in Massachusetts. Working Paper No. 8269. Cambridge, MA: National Bureau of Economic Research.

“The magnitudes imply a $1,000 increase in per-pupil spending leads to about a third to a half of a standard-deviation increase in average test scores. It is noted that the state aid driving the estimates is targeted to under-funded school districts, which may have atypical returns to additional expenditures.” (p. 1)

[ix] Nguyen-Hoang, P., & Yinger, J. (2014). Education Finance Reform, Local Behavior, and Student Performance in Massachusetts. Journal of Education Finance, 39(4), 297-322.

[x] Downes had conducted earlier studies of Vermont school finance reforms in the late 1990s (Act 60). In a 2004 book chapter, Downes noted “All of the evidence cited in this paper supports the conclusion that Act 60 has dramatically reduced dispersion in education spending and has done this by weakening the link between spending and property wealth. Further, the regressions presented in this paper offer some evidence that student performance has become more equal in the post-Act 60 period. And no results support the conclusion that Act 60 has contributed to increased dispersion in performance.” (p. 312)

Downes, T. A. (2004). School Finance Reform and School Quality: Lessons from Vermont. In Yinger, J. (Ed.), Helping Children Left Behind: State Aid and the Pursuit of Educational Equity. Cambridge, MA: MIT Press.

[xi] Konstantopolous, S., Borman, G. (2011) Family Background and School Effects on Student Achievement: A Multilevel Analysis of the Coleman Data. Teachers College Record. 113 (1) 97-132

Borman, G.D., Dowling, M. (2010) Schools and Inequality: A Multilevel Analysis of Coleman’s Equality of Educational Opportunity Data. Teachers College Record. 112 (5) 1201-1246

[xii] Hanushek, E.A. (1986) Economics of Schooling: Production and Efficiency in Public Schools. Journal of Economic Literature 24 (3) 1141-1177. A few years later, Hanushek paraphrased this conclusion in another widely cited article: “Variations in school expenditures are not systematically related to variations in student performance.”

Hanushek, E.A. (1989) The impact of differential expenditures on school performance. Educational Researcher. 18 (4) 45-62

Hanushek describes the collection of studies relating spending and outcomes as follows:

“The studies are almost evenly divided between studies of individual student performance and aggregate performance in schools or districts. Ninety-six of the 147 studies measure output by score on some standardized test. Approximately 40 percent are based upon variations in performance within single districts while the remainder look across districts. Three-fifths look at secondary performance (grades 7-12) with the rest concentrating on elementary student performance.” (fn #25)

[xiii] Baker, B. D. (2012). Revisiting the Age-Old Question: Does Money Matter in Education?. Albert Shanker Institute.

Relevant re-analyses include:

Greenwald, R., Hedges, L., Laine, R. (1996) The Effect of School Resources on Student Achievement. Review of Educational Research 66 (3) 361-396

Wenglinsky, H. (1997) How Money Matters: The effect of school district spending on academic achievement. Sociology of Education 70 (3) 221-237

[xiv] Hanushek (2006) goes so far as to title a concurrently produced volume on the same topic “How School Finance Lawsuits Exploit Judges’ Good Intentions and Harm Our Children.” [emphasis added] The premise that additional funding for schools – often leveraged toward class size reduction, additional course offerings or increased teacher salaries – causes harm to children is, on its face, absurd. The book which implies as much in its title never once validates that such reforms cause observable harm. Rather, the title is little more than a manipulative attempt to instill fear of pending harm in the mind of the uncritical spectator. The book also includes two examples of a type of analysis that occurred with some frequency in the mid-2000s, likewise intended to show that school funding doesn’t matter. These studies cherry-pick anecdotal information on either or both a) poorly funded schools that have high outcomes or b) well-funded schools that have low outcomes (see Evers & Clopton, 2006; Walberg, 2006).

[xv] Baker, B. D., & Welner, K. G. (2011). School finance and courts: Does reform matter, and how can we tell. Teachers College Record, 113(11), 2374-2414.

Hanushek, E. A., and Lindseth, A. (2009). Schoolhouses, Courthouses and Statehouses. Princeton, N.J.: Princeton University Press., See also: http://edpro.stanford.edu/Hanushek/admin/pages/files/uploads/06_EduO_Hanushek_g.pdf

Hanushek, E. A. (ed.). (2006). Courting failure: How school finance lawsuits exploit judges’ good intentions and harm our children (No. 551). Hoover Press.

Evers, W. M., and Clopton, P. (2006). “High-Spending, Low-Performing School Districts,” in Courting Failure: How School Finance Lawsuits Exploit Judges’ Good Intentions and Harm our Children (Eric A. Hanushek, ed.) (pp. 103-194). Palo Alto, CA: Hoover Press.

Walberg, H. (2006) High Poverty, High Performance Schools, Districts and States. in Courting Failure: How School Finance Lawsuits Exploit Judges’ Good Intentions and Harm our Children (Eric A. Hanushek, ed.) (pp. 79-102). Palo Alto, CA: Hoover Press.


[xvi] Greene, J. P., & Trivitt, J. R. (2008). Can Judges Improve Academic Achievement? Peabody Journal of Education, 83(2), 224-237.

Neymotin, F. (2010) The Relationship between School Funding and Student Achievement in Kansas Public Schools. Journal of Education Finance 36 (1) 88-108.

[xvii] Baker, B. D., & Welner, K. G. (2011). School finance and courts: Does reform matter, and how can we tell. Teachers College Record, 113(11), 2374-2414.

[xviii] Baker, B. D., & Welner, K. G. (2011). School finance and courts: Does reform matter, and how can we tell. Teachers College Record, 113(11), 2374-2414.

Two reports from Cato Institute are illustrative (Ciotti, 1998, Coate & VanDerHoff, 1999).

Ciotti, P. (1998). Money and School Performance: Lessons from the Kansas City Desegregations Experience. Cato Policy Analysis #298.

Coate, D. & VanDerHoff, J. (1999). Public School Spending and Student Achievement: The Case of New Jersey. Cato Journal, 19(1), 85-99.

[xix] Richard J. Murnane and Randall Olsen (1989) The effects of salaries and opportunity costs on length of stay in teaching: Evidence from Michigan. Review of Economics and Statistics 71 (2) 347-352

[xx] David N. Figlio (2002) “Can Public Schools Buy Better-Qualified Teachers?” Industrial and Labor Relations Review 55, 686-699. David N. Figlio (1997) Teacher Salaries and Teacher Quality. Economics Letters 55 267-271. Ronald Ferguson (1991) Paying for Public Education: New Evidence on How and Why Money Matters. Harvard Journal on Legislation. 28 (2) 465-498.

[xxi] Loeb, S., Page, M. (2000) Examining the Link Between Teacher Wages and Student Outcomes: The Importance of Alternative Labor Market Opportunities and Non-Pecuniary Variation. Review of Economics and Statistics 82 (3) 393-408

[xxii] Figlio, D.N., Rueben, K. (2001) Tax Limits and the Qualifications of New Teachers. Journal of Public Economics. April, 49-71

See also:

Downes, T. A., Figlio, D. N. (1999) Do Tax and Expenditure Limits Provide a Free Lunch? Evidence on the Link Between Limits and Public Sector Service Quality. National Tax Journal 52 (1) 113-128

[xxiii] Ondrich, J., Pas, E., Yinger, J. (2008) The Determinants of Teacher Attrition in Upstate New York. Public Finance Review 36 (1) 112-144

[xxiv] Hanushek, E. A. (2011). The economic value of higher teacher quality. Economics of Education Review, 30(3), 466-479.

[xxv] Hanushek, E. A. (2009). Teacher deselection. Creating a new teaching profession, 168, 172-173.

[xxvi] Springer, M. G., Ballou, D., Hamilton, L., Le, V. N., Lockwood, J. R., McCaffrey, D. F., … & Stecher, B. M. (2011). Teacher Pay for Performance: Experimental Evidence from the Project on Incentives in Teaching (POINT). Society for Research on Educational Effectiveness.

Yuan, K., Le, V. N., McCaffrey, D. F., Marsh, J. A., Hamilton, L. S., Stecher, B. M., & Springer, M. G. (2013). Incentive Pay Programs Do Not Affect Teacher Motivation or Reported Practices: Results From Three Randomized Studies. Educational Evaluation and Policy Analysis, 35(1), 3-22.

Goodman, S. F., & Turner, L. J. (2013). The design of teacher incentive pay and educational outcomes: Evidence from the New York City bonus program. Journal of Labor Economics, 31(2), 409-420.

Goodman, S., & Turner, L. (2011). Does Whole-School Performance Pay Improve Student Learning? Evidence from the New York City Schools. Education Next, 11(2), 67-71.

[xxvii] Baker, B. D., Oluwole, J. O., & Green III, P. C. (2013). The Legal Consequences of Mandating High Stakes Decisions Based on Low Quality Information: Teacher Evaluation in the Race-to-the-Top Era. education policy analysis archives, 21(5), n5.

[xxviii] Baker, B. D., Oluwole, J. O., & Green III, P. C. (2013). The Legal Consequences of Mandating High Stakes Decisions Based on Low Quality Information: Teacher Evaluation in the Race-to-the-Top Era. education policy analysis archives, 21(5), n5.

[xxix] See http://www2.ed.gov/rschstat/research/pubs/rigorousevid/rigorousevid.pdf;

Jeremy D. Finn and Charles M. Achilles, “Tennessee’s Class Size Study: Findings, Implications, Misconceptions,” Educational Evaluation and Policy Analysis, 21, no. 2 (Summer 1999): 97-109;

Jeremy Finn et al., “The Enduring Effects of Small Classes,” Teachers College Record, 103, no. 2, (April 2001): 145–183; http://www.tcrecord.org/pdf/10725.pdf;

Alan Krueger, “Would Smaller Class Sizes Help Close the Black-White Achievement Gap.” Working Paper #451 (Princeton, NJ: Industrial Relations Section, Department of Economics, Princeton University, 2001) http://www.irs.princeton.edu/pubs/working_papers.html;

Henry M. Levin, “The Public Returns to Public Educational Investments in African American Males,” Dijon Conference, University of Bourgogne, France. May 2006. http://www.u-bourgogne.fr/colloque-iredu/posterscom/communications/LEVIN.pdf;

Spyros Konstantopoulos and Vicki Chun, “What Are the Long-Term Effects of Small Classes on the Achievement Gap? Evidence from the Lasting Benefits Study,” American Journal of Education 116, no. 1 (November 2009): 125-154.

[xxx] Jepsen, C., Rivkin, S. (2002) What is the Tradeoff Between Smaller Classes and Teacher Quality? NBER Working Paper # 9205, Cambridge, MA. http://www.nber.org/papers/w9205

“The results show that, all else equal, smaller classes raise third-grade mathematics and reading achievement, particularly for lower-income students. However, the expansion of the teaching force required to staff the additional classrooms appears to have led to a deterioration in average teacher quality in schools serving a predominantly black student body. This deterioration partially or, in some cases, fully offset the benefits of smaller classes, demonstrating the importance of considering all implications of any policy change.” p. 1

For further discussion of the complexities of evaluating class size reduction in a dynamic policy context, see:

David Sims, “A Strategic Response to Class Size Reduction: Combination Classes and Student Achievement in California,” Journal of Policy Analysis and Management, 27(3) (2008): 457–478

David Sims, “Crowding Peter to Educate Paul: Lessons from a Class Size Reduction Externality,” Economics of Education Review, 28 (2009): 465–473.

Matthew M. Chingos, “The Impact of a Universal Class-Size Reduction Policy: Evidence from Florida’s Statewide Mandate,” Program on Education Policy and Governance Working Paper 10-03 (2010).

[xxxi] Ehrenberg, R.G., Brewer, D., Gamoran, A., Willms, J.D. (2001) Class Size and Student Achievement. Psychological Science in the Public Interest 2 (1) 1-30

[xxxii] Baker, B., & Welner, K. G. (2012). Evidence and rigor scrutinizing the rhetorical embrace of evidence-based decision making. Educational Researcher, 41(3), 98-101.

[xxxiii] Loeb, S., Darling-Hammond, L., & Luczak, J. (2005). How teaching conditions predict teacher turnover in California schools. Peabody Journal of Education, 80(3), 44-70.

Isenberg, E. P. (2010). The Effect of Class Size on Teacher Attrition: Evidence from Class Size Reduction Policies in New York State. US Census Bureau Center for Economic Studies Paper No. CES-WP-10-05.

Angry Andy’s Failing Schools & the Finger of Blame

NY Governor Andrew Cuomo’s office has released a report in which it identifies what it refers to in bold type on the cover as “Failing Schools.”

Report here: https://www.governor.ny.gov/sites/governor.ny.gov/files/atoms/files/NYSFailingSchoolsReport.pdf

Presumably, these are the very schools on which Angry Andy would like to impose death penalties – or so he has opined in the past.

The report identifies 17 districts in particular that are home to failing schools. The point of the report is to assert that the incompetent bureaucrats, high paid administrators and lazy teachers in these schools simply aren’t getting the job done and must be punished/relieved of their duties. Angry Andy has repeatedly and vociferously asserted that he and his less rabid predecessors have poured obscene sums of funding into these districts for decades. Thus it’s their fault – certainly not his – that they stink!


I have addressed over and over again on this blog the plight of high need, specifically small city school districts under Governor Cuomo.

  1. On how New York State crafted a low-ball estimate of what districts needed to achieve adequate outcomes and then still completely failed to fund it.
  2. On how New York State maintains one of the least equitable state school finance systems in the nation.
  3. On how New York State’s systemic, persistent underfunding of high need districts has led to significant increases in the number of children attending school in excessively large classes.
  4. On how New York State officials crafted a completely bogus, racially and economically disparate school classification scheme in order to justify intervening in the very schools they have most deprived over time.

I have also written reports on New York State’s underfunding of the school finance formula – a formula adopted to comply with prior court order in CFE v. State.

  1. Statewide Policy Brief with NYC Supplement: BBaker.NYPolicyBrief_NYC
  2. 50 Biggest Funding Gaps Supplement: 50 Biggest Aid Gaps 2013-14_15_FINAL

Among my reports is one in which I identified the 50 districts with the biggest state aid shortfalls with respect to what the state itself says these districts require for providing a sound basic (constitutional standard) education.  Districts across NY state have funding gaps for a variety of reasons, but I have shown in the past that it is generally districts with greater needs – high poverty concentrations & more children with limited English language proficiency, as well as more minority children – which tend to have larger funding gaps.

I have also pointed out very recently on this blog that some high need upstate cities in NY have had persistently inequitable/inadequate funding for decades, including this one from Angry Andy’s hit list.


Personally, even I was shocked to see the relationship between my 50 most underfunded districts list and Angry Andy’s 17 districts that suck.

NY State has over 650 school districts, many of which may be showing relatively low test scores for a variety of reasons, including & especially due to serving high concentrations of needy students.

Based on my updated 2015 runs (final adopted budget) of the 50 biggest state aid shortfalls, 12 of Angry Andy’s sucky 17 were among the 50 largest state aid shortfalls.

Yeah… that’s right… 12 of 17 had really big funding shortfalls.

5 of the top 10 biggest funding shortfall districts are on Angry Andy’s list. Yeah… the list of schools that have supposedly been subjected to obscene amounts of support and additional funding, but, due only to their own ineptitude, have failed.
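Just how un-coincidental is that overlap? A back-of-envelope check (my own illustration, not part of the state’s report) asks: if 17 districts were picked at random from roughly 650, how likely is it that 12 or more would happen to land on a 50-district biggest-shortfall list? The hypergeometric distribution gives the answer.

```python
from math import comb

def hypergeom_pmf(N, K, n, k):
    """Probability that exactly k of n districts drawn without replacement
    from N total districts fall within a fixed list of K districts."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

N = 650  # approximate number of NY school districts
K = 50   # districts on the biggest-shortfall list
n = 17   # districts on the "failing" list

expected_overlap = n * K / N  # about 1.3 districts expected by chance

# Probability of an overlap of 12 or more districts by chance alone
p_at_least_12 = sum(hypergeom_pmf(N, K, n, k) for k in range(12, n + 1))
```

Under these (rounded) assumptions, the expected chance overlap is about 1.3 districts, and the probability of seeing 12 or more by luck alone is vanishingly small. The “failing” list and the underfunded list are, statistically speaking, close to the same list.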

So how big are those funding shortfalls? How much state aid is supposed to be allocated to these districts to provide a sound basic education? Here are a few cuts at the numbers. First, here are the failing 17, by their state aid gap rank for 2014 and 2015. Included also are their state aid gaps per Total Aidable Foundation Pupil Unit (TAFPU). Note that their gaps per actual warm body – enrolled pupil – are larger (TAFPU includes some additional “weighted” pupils).

But even with this conservative figure, Hempstead’s gap – the amount of state aid they are not getting with respect to their calculated target – is over $6,000 per pupil. Yes – OVER $6,000 PER PUPIL!  (where’s that NY lottery guy when you need him?). Note that the apparent reduction in gaps from 2014 to 2015 occurs due to a manipulation by the state of funding targets and required local contributions – with a smaller share of that reduction actually coming from new state aid.
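Why is the per-TAFPU figure the conservative one? Simple arithmetic: TAFPU adds need-based pupil weightings on top of enrollment, so the same dollar gap gets divided by a larger denominator. A sketch with hypothetical numbers (illustrative only, not Hempstead’s or any district’s actual data):

```python
# Hypothetical district; all figures illustrative, not actual NY data.
total_gap = 50_000_000        # total state aid shortfall, in dollars
enrolled_pupils = 7_000       # actual "warm body" enrollment
tafpu = 9_100                 # enrollment plus need-based pupil weightings

# Same dollar gap, two denominators:
gap_per_tafpu = total_gap / tafpu                 # the conservative figure
gap_per_enrolled = total_gap / enrolled_pupils    # larger whenever tafpu > enrollment
```

Because weighted pupil counts exceed raw enrollment wherever student needs add weight, the per-enrolled-pupil gap is always the larger number – which is why a $6,000-per-TAFPU gap understates what each actual child is shorted.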

All of these are high need districts, having Pupil Need Index values well above 1.5.

Here’s what it looks like in graph form, with local contribution, actual state aid and the gap identified.

In some cases, the actual state aid received is not a whole lot more than the gap. All of Angry Andy’s failing districts have substantial shortfalls from the funding targets – targets that were specifically identified as the funding needed to achieve desired outcome levels.

Notably, as I’ve explained in the past – the outcome levels used for determining those funding targets were much lower than the outcome levels expected under the state’s current testing and accountability system.

Even then, the state’s approach to estimating the cost of achieving those (much lower) outcomes results in a low-ball manipulated number. (I actually have a book chapter that explains this as an exemplar of classic school finance manipulation)

So, where should that finger of blame point here? 

Or is this just how things work these days – slash the funding of the highest need districts – call them failing – close their schools – give their property and their teacher’s jobs to someone else – and claim victory – leaving others, years down the line to clean up your mess?

Angry Andy – this is your mess. Now do the right thing and fix it!


Disclaimer: Yes, I spent all day Monday this week testifying at trial about the funding shortfalls for New York State districts, specifically Small City districts with a pending lawsuit against the state. My opinions are the same here as they were there, and have been for several years as reflected in numerous published sources. That’s because my opinions here merely reflect the factual status of the state school finance system in New York, as represented by the state’s own formula calculations and data.