Blog

Will Kiryas Joel Finally Get its Way? Who really benefits from NY’s “Invest in Ed” Tax Credit?

Tuition tax credit programs establish privately governed entities that provide scholarships, typically to “lower income” families, for their children to attend private schools. The idea is to provide tax credits to corporations and individuals who give money to these tuition scholarship entities.

The compelling governmental interest for these policies, as we often hear, is that low income kids are trapped in failing urban public schools and that these tuition scholarships will help them attend either outstanding, elite private independent schools or established Catholic schools. Certainly, as I illustrated in a recent blog post, a large share of private schooled children in New York State do attend Catholic schools. And more perhaps might, if provided with tuition scholarships.

But, as I’ve shown in other contexts, the primary beneficiaries of these scholarships – where the money actually lands – aren’t necessarily who we might assume, given the framing of the policy.

Which brings us to Kiryas Joel (and other similar contexts in New York State, like the area served by the East Ramapo School District). Years ago, the tiny village of Kiryas Joel, in the town of Monroe, sought to establish itself as its own independent public school district – an independent public school district that would, in effect, serve an exclusively religious, homogeneous community. The New York legislature, on their behalf, actually passed a districting law, just for them, that would create these boundaries. Why? They just wanted a fully publicly subsidized school system where they could serve – the way they saw fit – their specific religious community. To which the high court of this land said – nay nay.

Here’s a quick summary from oyez.org:

BOARD OF EDUC. OF KIRYAS JOEL VILLAGE SCHOOL DIST. v. GRUMET

Facts of the Case 

In 1989, the New York legislature passed a school districting law that intentionally drew its boundaries in accordance with the boundaries of the Village of Kiryas Joel, a religious enclave of Satmar Hasidim who practice a strict form of Judaism. Shortly before the new district commenced operations, the taxpayers and the association of state school boards embarked on a lawsuit claiming that the statute created a school district that limited access only to residents of Kiryas Joel.

Question 

Did the 1989 statute violate the First Amendment’s Establishment Clause?

Yes. In a 6-to-3 decision, the Court held that the statute’s purpose was to exclude all but those who lived in and practiced the village enclave’s extreme form of Judaism. This exclusionary intent failed to respect the Establishment Clause’s requirement that states maintain a neutral position with respect to religion, because it clearly created a school zone which excluded those who were non-religious and/or did not practice Satmar Hasidism. Indeed, the very essence of the Establishment Clause is that government should not demonstrate a preference for one religion over another, or religion over non-religion in general.

http://www.oyez.org/cases/1990-1999/1993/1993_93_517

But alas, those were different times. Since then, our high court has determined, for example, that if a tuition voucher program is established by a government entity, and if that voucher program is based on the choices of individual, private actors (parents and students), then even if the majority of those students/families take their voucher to religious institutions (as they did in Cleveland), the policy, being neutral to those choices, does not violate the Establishment Clause.

This has led me to ponder, in recent years: what if New York State had simply established a voucher model for the region surrounding Kiryas Joel, rather than declaring a separate government entity that happened to be homogeneously religious? That might work. The kids/families would just so happen to choose, 100%, their village yeshivas. The policy on its face would be “neutral.” Except perhaps that the legislation creating the policy would have chosen this religious community specifically for this “pilot” program. That might again tip the scales against Kiryas Joel (unless perhaps, as in the failed, proposed NJ legislation, the policy just happened to include, but not exclusively, the religious community).

But alas, we need not even worry about that, because our high court a few years back created an immunity shield for policies that indirectly rather than directly allocate those tuition scholarships! You see, if a state instead creates a tax credit for individuals and corporations to give to a scholarship granting entity, rather than directly allocating those same tax dollars, it appears that resident/citizen/taxpayers don’t have a right to bring legal challenges to the policy to begin with. [you see, in a case like Zelman/Cleveland Vouchers, taxpayers can challenge the use of their tax dollars for religious institutions on an objection of conscience basis. But, taxpayers can’t challenge the distribution of someone else’s “untaxed” contributions, even if the fiscal effect is the same]. To summarize:

Arizona Christian School Tuition Organization v. Winn

Facts of the Case

Arizona taxpayers challenged the constitutionality of Arizona’s tuition tax credit in an Arizona federal district court. They alleged the tax credit violated the Establishment Clause of the First Amendment because it funneled money to private religious schools. The district court dismissed the case. On appeal, the U.S. Court of Appeals for the Ninth Circuit reversed, holding that the taxpayers had standing to bring their suit and had alleged a viable Establishment Clause claim.

Question 

Do the plaintiffs lack standing because they cannot allege that the Arizona tuition tax credit involves the appropriation or expenditure of state funds?

Yes. The Supreme Court overturned the lower court in an opinion by Justice Anthony Kennedy. The majority held that the challengers to the tax credit in Arizona lack standing under Article III. Justice Elena Kagan filed a dissenting opinion joined by Justices Ruth Bader Ginsburg, Stephen Breyer and Sonia Sotomayor. “State sponsorship of religion sometimes harms individuals only (but this ‘only’ is no small matter) in their capacity as contributing members of our national community,” Kagan wrote for the dissenters.

http://www.oyez.org/cases/2010-2019/2010/2010_09_987

Which all brings us to the here and now in New York State. There now exist (at least) two versions of a tuition tax credit program being pitched in New York State. The pitch is the same as usual. The faces of that pitch are the same as usual – including Cardinal Dolan of the NY Archdiocese [perhaps on the assumption that this policy will a) help firm up the financial condition of NY’s Catholic Schools while b) providing low income NYC children “trapped” in failing schools the opportunity to attend Catholic schools].

Like previous New Jersey bills, the Assembly version of the bill (largely mirrored by the Governor’s budget language) places its emphasis on providing scholarships to children from families with income below a certain level. As stated in the bill:

Section three of this bill also provides definitions for terms such as “authorized contribution,” “public education entity,” “local education fund,” and “educational scholarship organization.” An “eligible student” who can receive a scholarship must reside in a household with not more than $250,000 in adjusted gross income; however, to ensure the needs of low-income communities are addressed, an educational scholarship organization must provide at least half its scholarships for students from households with income below 150% of the reduced-price lunch income thresholds.

http://assembly.state.ny.us/leg/?default_fld=&bn=A02551&term=2015&Summary=Y&Actions=Y&Votes=Y&Memo=Y&Text=Y

The NY Senate version doesn’t worry itself with allocating the scholarships to lower income children, as far as I can tell.

But as I have previously laid out in New Jersey, the casual observer might be surprised to learn which communities have the largest shares of children, already enrolled in private schools, who qualify by income status for these scholarships (150% of the 185%-of-poverty reduced-price lunch threshold, or 277.5% of the federal poverty level).
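To make the arithmetic behind that threshold explicit, here is a trivial sketch; the two multipliers come straight from the federal reduced-price lunch rule and the bill language quoted above:

```python
# Reduced-price lunch eligibility sits at 185% of the federal poverty level
reduced_lunch_multiple = 1.85
# The Assembly bill's low-income set-aside covers households up to 150% of that cutoff
setaside_multiple = 1.50

# Effective income ceiling, expressed as a multiple of the federal poverty level
effective_cutoff = reduced_lunch_multiple * setaside_multiple
# 1.85 * 1.50 = 2.775, i.e., 277.5% of the federal poverty level
```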

Let’s take a look first at private school enrollment in New York State by Public Use Microdata Area (PUMA), using 2011-2013 data.

Table 1 – Private School Enrollments in New York State for Public Use Micro Data Areas with HIGHEST % Low Income Enrolled in Private Schools


Interestingly, but not surprisingly (since I’ve seen this pattern elsewhere), the highest rates of “private school enrollment” among “income qualified families” are in places like Brooklyn, Rockland County and Orange County. The latter two public use microdata areas are home to two unique New York State school districts: Kiryas Joel Village (Orange County) and East Ramapo. The plight of the East Ramapo school district has been covered extensively in the media in recent years and is analogous to that of Lakewood, New Jersey in many respects.

Kiryas Joel is no stranger to media coverage either, with many stories specifically covering the village leadership’s creativity in accessing public subsidies and tax breaks.

Half to more than half of the children in these communities attend private schools, much like in Lakewood, NJ. That is, half of all children – from families claiming low income! Families that would/will qualify for preferential treatment under the scholarship program.

Now, one might say, this is all private school students. Surely they represent a mix of school types. They aren’t all in one type – one religion – of schools. Well, here are the private school enrollments in 2011-12 for towns in the Rockland area, and in Monroe which is home to Kiryas Joel. These enrollments are from the NCES Private School Universe Survey.

Table 2 – Private School Enrollments (total enrollments)


Well, okay, in fact most of them are in Orthodox schools, which is pretty well understood by anyone familiar with these communities. In fact, if the Census data and Private School Universe Survey data were precise enough to isolate the specific schools and enrolled children in the village of Kiryas Joel, I believe we’d still find 100% Orthodox enrollment.

So then, what’s the big deal? Well, what we have here, in New York State, is quite similar to what I found in New Jersey: the immediate big beneficiaries of the tax credit scholarships will likely NOT be low income minority children heading off to Horace Mann, Dalton, Trinity or NYC Catholic schools, but rather the vast population of already enrolled “low income” families in the state’s burgeoning Orthodox communities – those who disproportionately declare themselves as low income, yet already attend private schools.

So, in other words, Kiryas Joel finally gets its way after all these years – the opportunity to have their exclusively religious schools fully, or nearly fully subsidized at public/taxpayer expense, albeit indirectly.

And thanks to more recent Supreme Court decisions, no one even has legal standing to challenge the policy as designed, if it’s actually adopted.

Is this what “invest in ed” is really all about?

Unconstitutional by any other name is still Unconstitutional

Consortium on Education Policy & Finance

A Review of the Kansas “Block Grant” School Finance Legislation

BBaker.Hsub7.Kansas.4_9_15

Bruce D. Baker, Rutgers University

In this report, I present a brief review of Kansas House Substitute for Senate Bill 7, which has been labeled the “Block Grant” funding formula, adopted as a replacement to the School District Finance Act, and to be enacted through the 2016-17 fiscal year. Major features of the plan include:

  1. Calling it a “block grant;”
  2. Freezing in place prior year general fund aggregate (not per pupil) state aid through the next two fiscal years;
    1. To include a 0.4% across-the-board cut for FY2015-16 and FY2016-17 to generate an “extraordinary” needs aid pool;
  3. Imposing cuts to prior year supplemental fund state aid for the next fiscal year;
  4. Imposing cuts to prior year capital outlay state aid for the next fiscal year.

Put simply, a freeze by any other name is still a freeze and a…


The Willful Ignorance of the NJ Star Ledger

After having a series of conversations with Star Ledger reporter Julie O’Connor about her desire to write a cover story about how TEAM Academy is producing miracles in Newark, I wrote this post:

https://schoolfinance101.wordpress.com/2015/01/30/ed-writers-try-looking-beyond-propaganda-press-releases-for-success-stories/

The reason for this post is explained in this paragraph:

Well, one reason I’m going there is that I’m sick of getting e-mail and phone inquiry after inquiry about the same charter schools – and only charter schools – asking how/why are they creating miracle outcomes. I try to explain that there may be more to the story. The reporter then says that the charter school’s data person says I’m wrong – validating their miracle outcomes (despite their own data not being publicly available/replicable, etc. and often with reference to awesome outcomes reported in popularly cited studies of totally different charter schools).

For a while after writing this, I figured that the NJ Star Ledger reporter who was so insistent on writing her rah rah TEAM article had simply given up. But alas no. The puff piece finally arrived today: http://www.nj.com/opinion/index.ssf/2015/05/beating_newarks_odds_kipp_charter_network_is_poise.html#incart_river

Now, it’s written as an editorial, so I guess that means it’s okay to make stuff up, ignore lots of stuff, and just generally roll with a combination of propaganda provided to you by the school and your own personal predisposition.

What’s so disturbing about all this is that the title of the editorial is directly refuted by the statewide analysis I provided: TEAM only marginally beats expectations, and in fact several Newark public schools and a few other charter schools in Newark “beat the odds,” so to speak, by much more. AND THE AUTHOR OF THE EDITORIAL WAS FULLY AWARE OF THIS.

I refused to call the reporter in part because I wanted there to be a full, complete transcript of our e-mail conversations. I’m sick of banging my head against this wall.

Below is a transcript of the conversation that started with an inquiry to Diane Ravitch from Julie O’Connor. Others were included on the e-mail chain and jump in at various points.

Reporter Inquiry

Prof. Ravitch,

I’m on the editorial board at The Star-Ledger in New Jersey, and I’m working on a cover story for our Perspective section about the KIPP schools in our state. The college attendance stats of KIPP seniors in Newark seem pretty impressive, and I was wondering if you have the same reaction, and what you think of KIPP’s forays into Camden.

Would really appreciate it if you could give me a call at []. Would like to discuss KIPP in the context of your criticisms of the broader charter school movement, and whether or not you think it is an exception.

Many thanks,

Julie O’Connor

The hand-off

Julie,

I suggest you talk to Mark Weber and Bruce Baker at Rutgers, who have studied charters in NJ. I lean on their research. The question is not whether one chain can produce successful graduates, but whether charters in general are helping the most vulnerable schools, whether they are reducing the funding and capacity of public schools, and whether their success – when it exists – is the result of selection and attrition.

Diane Ravitch

Reporter

Ok, thanks for your prompt reply.

Prof. Baker emailed me his report on free/reduced lunch and the TEAM schools, but I have been unable to reach him on the phone to discuss KIPP or my follow up questions.

Basically, I am looking for a reaction to two claims from KIPP that seem impressive: The college attendance rates (last year, 95 percent of KIPP seniors went to college, 89% to a 4-year, 6 percent to a 2-year), and the fact that KIPP kids in elementary and high school equal or outperform the average for the state of NJ (some years they do in middle school, too, though this year they didn’t).

KIPP kids are 87% free/reduced lunch and the state is in the 30s. I understand that Baker and others are skeptical about comparing KIPP kids to their peers in the Newark district. But what about comparing them to the state average? And what about their college attendance rates?

I would like to discuss the criticisms of the charter school movement and whether you view KIPP as an exception, or more of the same. Prof. Baker, can you please give me a call as soon as you get a chance? []. We are hoping to run the story in the next week or so.

Many thanks,

Julie

Baker to Reporter

My point is, and shall continue to be that news stories on education should NOT be driven by some PR prompt from specific schools touting their “successes” through anecdotes. Thus, my only reaction is the reaction I posted previously about school performance, given analyses across all schools, using comparable, publicly available data:

https://schoolfinance101.wordpress.com/2015/01/30/ed-writers-try-looking-beyond-propaganda-press-releases-for-success-stories/

The bottom line is that KIPP schools performance on comparable measures of student growth, controlling for demography, resources, etc., are relatively average (marginally above average). Many district schools, including ones in Newark, far outperform them.

Reporter to Baker

Ok. Even if KIPP students aren’t representative of their district, isn’t it still impressive that they are beating the state average, given that their student population is significantly poorer?

KIPP says 93 percent of their students stay with them (7 percent leave their schools each year for any reason).

If what this tells us is that KIPP students have high scores and go to college, how do they fit into criticisms of the larger charter school movement? And what do you think of KIPP’s expansion into Camden?

Prof. Baker, read your blog post and would like to discuss. I am not sure how you are measuring growth in these ranked schools. Are you skeptical about the accuracy of the college attendance rates and performance numbers reported by KIPP? If so, why? Please give me a call. []

Thanks.

Baker to Reporter

Not without running a model of demographics against the same outcome measures across all schools, to see how/whether they truly deviate, statistically, from expectations. Anecdotes of this type are unhelpful for understanding what’s “impressive” statistically or not.

For measuring growth, I’m using the state’s own reported school Median Growth Percentile – for 2012, 2013 and 2014.

Skeptical or not, context is what’s needed for them to really mean anything. The context of all other schools, and their demographics, to evaluate statistically whether the KIPP schools actually deviate from what would otherwise be expected (given enough schools to estimate a model of expectations).

Reporter to Baker

Ok. Is the state average not considered a good measure of how schools are doing?

Is your central point in creating your own measurement for whether schools deviate from expectations that KIPP schools have more resources and classroom time and better class sizes, and that’s why their students are doing so well?

Are you trying to account for those factors in your outcome measure, since you might not find such conditions in traditional district schools? That seems to be your argument in this blog post:

https://schoolfinance101.wordpress.com/2013/03/01/the-non-reformy-lessons-of-kipp/

Trying to understand your general view of KIPP’s performance.

Baker to Reporter (w/head banging against desk)

No. State average is NOT a useful comparison.  Given the number of things that vary across schools, one needs to look at any given school in the context of all schools, with all available measures. Not just compare one school to the state average and say, for example, “it’s got higher poverty, and higher outcomes than the state average.” That comparison misses a lot of other factors that may vary across schools. One needs to see how those factors affect the outcome measure across schools and then compare against the overall pattern.

Second – I’m not “creating” my own measurement. I’m doing what I describe above. Taking the state’s measures, and making comparisons among “otherwise similar” schools along the trend of schools, given their various attributes. That is, how much higher, or lower than expected, does a school score (on growth) given all of those factors that vary.

Now, I also use the state’s growth measure,  because, for all its shortcomings, it is actually the best available New Jersey measure of what a school might be contributing to student outcomes (rather than what kids come in with, or who leaves and when). But that measure too is ONLY useful if you control for/account for the various factors. Quite simply, this is how credible analysis of this type is done, knowing full well that even this approach can’t capture some factors that affect outcomes that really aren’t about how good/bad a school is.

Their performance tends to be marginally above average, to about average, considering all schools including district schools. For that matter, several Newark district schools have higher performance. Discovery Charter school is the standout among charters. North Star seems to do well, but I believe that the model isn’t really capturing the effect of their substantially greater attrition, or different student population. But who knows.  But then again, Robert Treat has very different student population and tends to show very weak gains with adjustment for the included factors.

Reporter (who clearly never bothered to read the original post)

What factors that vary are you trying to account for? It is things like resources, classroom time and class sizes?

Baker to Reporter (direct response to ignorant question)

They are all listed in the blog post!

https://schoolfinance101.wordpress.com/2015/01/30/ed-writers-try-looking-beyond-propaganda-press-releases-for-success-stories/

Outcome is Growth

Corrected for:

  1. prior average scale score level
  2. % free lunch
  3. % disability (because I can’t break out by severity, charters like TEAM actually get an advantage here)
  4. % ELL
  5. total staffing expense per pupil
  6. school grade range served
  7. school size
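The adjustment described throughout this exchange – asking how far a school’s growth deviates from expectations given these factors – is essentially a regression of the growth measure on the covariates above, with the residual as the “beats expectations” metric. A minimal sketch with fabricated data follows; none of these numbers are the actual New Jersey figures, and the real analysis also includes grade range served, a categorical variable omitted here:

```python
import numpy as np

rng = np.random.default_rng(0)
n_schools = 200

# Fabricated school-level covariates mirroring the list above
prior_score = rng.normal(200, 15, n_schools)     # prior average scale score
pct_free    = rng.uniform(0, 1, n_schools)       # % free lunch
pct_disab   = rng.uniform(0, 0.25, n_schools)    # % disability
pct_ell     = rng.uniform(0, 0.30, n_schools)    # % ELL
staff_pp    = rng.normal(9000, 1500, n_schools)  # staffing expense per pupil
size        = rng.normal(500, 150, n_schools)    # school size

# Fabricated growth outcome with some dependence on the covariates plus noise
growth = (30 + 0.05 * prior_score - 10 * pct_free
          + 0.001 * staff_pp + rng.normal(0, 3, n_schools))

# OLS fit: growth regressed on the covariates (plus an intercept)
X = np.column_stack([np.ones(n_schools), prior_score, pct_free,
                     pct_disab, pct_ell, staff_pp, size])
beta, *_ = np.linalg.lstsq(X, growth, rcond=None)

# A school's deviation from expectation, given its attributes;
# large positive residuals mark the schools that "beat the odds"
residuals = growth - X @ beta
```

Ranking schools by `residuals` rather than by raw scores is what separates “better than expected, given who they serve and what they spend” from simply “high scoring.”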

More Exasperated Baker to Reporter

Schools in Newark: https://schoolfinance101.com/wp-content/uploads/2015/01/slide18.jpg

Charter Schools Statewide: https://schoolfinance101.com/wp-content/uploads/2015/01/slide24.jpg

So again, I ask, why do you feel the necessity to write a story on KIPP schools? And why the apparent obsession on trying to find a miracle in KIPP? How do these supposed miracles (that generally aren’t) come across your desk?

An objective statistical run of all schools in the state, using the state’s own best available measure as the outcome, finds TEAM in Newark to be a decent – relatively above average – school, but no miracle. There are no miracles in this complex endeavor. That’s fine. They do a pretty good job, and seem to do a better job of serving a more representative student population than some others (see also: https://schoolfinance101.wordpress.com/2013/11/27/where-are-the-most-economically-segregated-charter-schools-why-does-it-matter/)

I’m not trying to rain on their parade. I’m just pointing out that if we take all of the data from schools around the state and try to figure out who’s actually “doing better than expected” given who they serve and the resources they have, we don’t identify KIPP as the standout.

Weber to Reporter

Julie, I am going to encourage you to read Bruce’s entire post, as it is far more sophisticated and comprehensive than what I am going to include here.

That said, let me put this in very simple — admittedly, TOO simple — terms:

This is a very quick and very dirty scatterplot that shows the average scores on the NJASK Grade 8 English Language Arts (ELA) exam from last year for every school in the state. I’ve highlighted TEAM on this graph.

The NJASK score is on the vertical or y-axis. On the horizontal or x-axis is the percentage of students who qualify for free or reduced price lunch, a proxy measure for student economic disadvantage (a student’s family has to be at or below 185% of the poverty line to qualify for FRPL).

The first and most obvious thing to notice is the relationship between how many FRPL kids a school has and its average test score. Clearly, when FRPL goes up, test scores go down. 70% of the variation in these scores can be statistically explained by the percentage of FRPL kids at the school.

We all know this. Poverty matters.

The green line through the middle is called a regression line: it’s a kinda-sorta “average” that predicts how well a school will do given its FRPL percentage. If you’re above the line, you’re doing better than prediction; if you’re below the line, you’re doing worse.
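The above/below-the-line comparison can be reproduced in a few lines. This sketch uses invented (FRPL%, score) pairs, not the actual NJASK data:

```python
import numpy as np

# Invented (FRPL percentage, average score) pairs standing in for the real data
frpl   = np.array([10.0, 25.0, 40.0, 55.0, 70.0, 85.0])
scores = np.array([240.0, 228.0, 215.0, 205.0, 192.0, 185.0])

# Fit the "green line": predicted score as a linear function of FRPL%
slope, intercept = np.polyfit(frpl, scores, 1)
predicted = slope * frpl + intercept

# Positive residual = above the line (beating prediction); negative = below
residuals = scores - predicted

# Share of score variation explained by FRPL alone (the analogue of the
# "70% of the variation" figure computed on the real statewide data)
r_squared = 1 - residuals.var() / scores.var()
```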

TEAM is above the line – hooray for them. But how many other schools do you see across the state that are at least as far above the line as TEAM? How many are way, way further above that line compared to TEAM?

Again: what Bruce did in his post was far more sophisticated than this, because he’s using a statistical model to account for other things that will affect student outcomes, like percentages of special education kids and how much a school spends per pupil on staff (yes, money does matter). He’s also judging outcomes on SGPs, which is arguably a better measure of a school’s effectiveness.

I’m boiling this down, however, to reinforce his point: yes, TEAM is a better-than-average school. Again, good for them… but why all the outsized attention? Why are you writing a story about them and not the many, many other schools that “beat prediction” much better than TEAM? How many district schools could be considered “miracles” relative to TEAM that get ignored by the op-ed pages of your newspaper?

Julie, you and I both know I have been the Star-Ledger Editorial Page’s harshest critic on education. I’ve admitted before that sometimes I have gone too far… but can you understand my frustration? Can you understand how unfair it appears to those of us who have taken the time to study Bruce’s work that TEAM gets all the accolades while many schools that — by TEAM’s own standards — are doing a BETTER job than they are, yet continue to be ignored?

I am asking you to listen to Bruce carefully and take the time to understand what he is saying. This stuff matters. You control arguably the most important space for punditry in the state. You owe it to your readers to get this stuff right.

If I can help further, let me know.

Mark Weber

Reporter (still not bothering to read, and returning to anecdotes provided by school)

What about the 95 percent of KIPP seniors that went to college last year? That seems impressive to me.

Also, when you say comparing KIPP to the state average doesn’t mean anything without “running a model of demographics against the same outcome measures across all schools, to see how/whether they truly deviate, statistically, from expectations” — isn’t that what the Mathematica study does? Control for any differences in student population?

Baker (even more exasperated) to Reporter

Why don’t you write it that way then – that it seems impressive to you. I’m not going there, with your representation of data, passed along to you most likely by the school, without the opportunity to run appropriate models on the data. And I don’t have time to be doing that right now, or to keep quibbling with you over your strange, incessant desire to write a story on how awesome you think these schools are, without ever bothering to look at the schools in the context of all schools, where many others may, in fact, be even more impressive.

And are you speaking of some Mathematica study of TEAM Academy specifically, and their graduation and college matriculation rates? Or Mathematica studies of KIPP schools generally/nationally? [I believe only the latter exists – http://www.mathematica-mpr.com/~/media/publications/PDFs/education/kipp_middle.pdf] Yes, the network’s results are solid. Not miraculous. But solid. Driven in part, perhaps, by selection issues (see methods critiques below), and in part by resources. KIPP schools in many contexts substantially outspend their “competition,” offering higher salaries, much smaller classes, longer days/years, etc. Certainly won’t deny that those types of resources matter.

Comments on related methods here: https://schoolfinance101.wordpress.com/2012/12/20/thoughts-on-randomized-vs-randomized-charter-school-studies/

and: https://schoolfinance101.wordpress.com/2013/07/12/thinking-writing-about-educational-research-policy-implications/

There are indeed limitations to these methods.

Some information here on where TEAM fits on resource/demographics, etc in Newark: https://njedpolicy.wordpress.com/2015/01/13/research-note-resource-equity-student-sorting-across-newark-district-charter-schools/

Weber to Reporter

Related to the issue of resources:

Find attached the 2012 tax forms for TEAM, Friends of TEAM, and KIPP. You can access these easily at guidestar.org.

You will notice on page 42 of the KIPP 990 that TEAM received $1,053,147 in direct support from KIPP. This likely does not include all sorts of administrative, logistical, marketing, lobbying, etc. activity KIPP undertakes on behalf of TEAM.

On page 21 of the Friends of TEAM 990, you’ll find a $1,005,332 grant to TEAM. On page 9, you’ll see the group took a rental income loss of $1,813,501, likely to the school’s benefit (were I you, I’d certainly ask them about this).

In 2011-12, TEAM enrolled 1,504.5 students. If you take the grants from KIPP and FOT together, that comes to $1,368 additional expenditures per child, not including the rental loss that FOT took. So far as I know, this extra funding is not reported in the NJ Taxpayers Guide to Education Spending.
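The per-pupil figure is simple arithmetic on the two grants and the enrollment count quoted above:

```python
kipp_grant = 1_053_147  # direct support from KIPP (KIPP 990, p. 42)
fot_grant  = 1_005_332  # grant from Friends of TEAM (FOT 990, p. 21)
enrollment = 1504.5     # TEAM enrollment, 2011-12

# Additional outside spending per child, excluding FOT's rental-income loss
extra_per_pupil = (kipp_grant + fot_grant) / enrollment  # ≈ $1,368
```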

Let me be clear: it is my opinion (an opinion backed up by a substantial and growing body of research) that spending this extra money on behalf of these students will help their academic growth. This is a good thing.

But it is exactly the sort of issue that is not addressed by the Mathematica report, nor by any number of other “studies” that purport to show the superiority of KIPP’s methods by holding all things constant.

So how does TEAM spend all this extra money? Well, here’s one way:

At all stages of a teacher’s career, TEAM pays a higher salary, even when adjusted for experience, than NPS (and way more than Newark’s “local” charters). When you pay more and offer better working conditions, you can attract people who are willing to work longer hours (to a point).

But they manage to keep salary costs low by also doing this:

Notice the high number of teachers with only one year of experience at TEAM? Notice how they barely have any teachers with more than 15 years of experience? That’s when the NPS salary guide gives veteran teachers a big boost.

Is this a smart strategy? Absolutely. Is it sustainable? I say almost certainly not. Does TEAM really think they can keep recycling their staff AND expand the number of students enrolled? Are there really that many young people out there willing to make teaching at TEAM a temporary career? And is that really good for the city and its students?

As Bruce says: TEAM does a good job. They are, by the numbers, a good school. But I would argue KIPP’s methods are not replicable at a large scale. In fact, THEY’D probably agree with me, because they have said over and over again that they are not interested in taking over an entire district.

Julie, if you are willing to dig into this and go behind the talking points the KIPP publicity machine feeds the press, I think you will find TEAM’s “success” raises more questions than it answers:

– If more money is good for charter schools, why isn’t it good for public schools?

– Is it good for the teaching profession to encourage the growth of schools that appear to run on a policy of churning much of their staff?

– When we get past the issues of different student populations, attrition, extra resources, hiring practices, test prep, etc., what, exactly, is so special about KIPP/TEAM?

Mark

Five Steps to Cagebusting Relinquishment and the Suburban School District of the Future!

As I explained in my previous post, relinquishment in the form of “chartering” has taught us much about how to “fix” urban school districts. But why should urban districts be the only ones to benefit from the wisdom of emergent “disruptive” models of school organization? Here, I provide an overview/preview of what may eventually become my defining academic contribution! How to fix the suburban school district. How to relinquish the leafy ‘burbs! So, here it is, in all its glory: the rough outline of my forthcoming manifesto on Cagebusting Relinquishment and the Suburban School District of the Future!

Step 1 – Hire a private management company(ies) to manage 100% of district operating funds and any/all subcontracted service agreements, including those addressed under Step 5 below. This will include all employee contracts as well as all additional vendor contracts. What’s really cool about this is that the Local Board of Education’s reported budget and annual financial report become one single line of expense – Contracted Services – to the management company. Nothing else need be disclosed to the public/taxpayers. The rest is at the discretion of the private management company, whose finances and contractual arrangements may not be subject to public access/review. The public only gets to see that one line – that lump sum payment by the board of education to the manager(s). In other words – Step 1 – Relinquish! ‘cuz relinquishment rocks!

Step 2 – The local board of education and private manager quickly concoct a new school rating system that allows them to declare all schools to be failing, requiring that the schools be closed and reconstituted under the private manager. This bold “disruptive” step permits the private manager to establish its own employee contracts and recruit its own employees to fill the roles of the (crappy, self-interested, tenured, government employed) teachers and administrators immediately dismissed by the local board of education because their schools technically no longer exist. The private manager might, for example, choose to establish a feeder/pipeline relationship with an emergency/expedited training program for young suburban saviors (on the expectation of significant turnover to hold long run staffing costs down), or establish its own H1B visa processing entity to enable the schools to employ foreign teachers paid modest stipends. Because these are all employees of the private manager, and not “public” employees, reshuffling them, dismissing them (if they don’t leave fast enough to keep costs down), etc. is easy because many constitutional and statutory protections of public employees are irrelevant.

Step 3 – the private manager establishes rigid no excuses discipline policies and written contractual agreements with parents and their children to abide by those discipline policies or face immediate dismissal. Like the employees, students/parents may be forgoing constitutional and statutory protections they would have in a government operated institution. Rather, discipline policy may be evaluated by the courts as a contractual agreement with the private provider, giving that provider wide latitude to impose draconian requirements if they so choose.

Step 4 – the private manager and local board of education would probably want to declare the system to be one of district-wide, open choice where each year, all families in the district rank their school choices and students are sorted by computer algorithm into assigned schools, regardless of proximity to home (or transportation costs systemwide). This way, when draconian school discipline policies get called into question, the board of education and private manager(s) can assert that the children chose the school to which they were assigned and were not forced to enter into this contractual agreement. The private manager might wish to operate a single building in a remote corner of the district for all kids who are dismissed from the various “no excuses” schools. Additional overflow facilities may be required over time. Costs might be held down in these facilities by making them online learning centers with 100/1 pupil/teacher ratios. No rules. No goals. Just a bunch of computers in cubicles where kids can, if they see fit, log on to K12.com. Children dismissed from any of the “schools of choice” may not require any due process, since they can land in one of the handy-dandy holding pens.

Step 5 – Raise short-term cash by selling off all of your facilities (& major capital assets) to a Real Estate Investment Trust. Imagine what you could do with all of that cash! Besides, annual maintenance and operations of a large district’s aging capital stock might be running you about $1,200 per pupil per year. Instead, you can lease the same buildings back from the REIT on a Triple Net Lease (paying lease + property tax* + maintenance) for an expense of, oh, around $3,000 per pupil, with expected annual increases. This, on top of the general management fee paid to the private school management company.

*that is, if these properties become taxable when owned/leased by a for profit REIT
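The facilities math in Step 5 is worth spelling out. A quick sketch, using the rough per-pupil figures above and a hypothetical district enrollment:

```python
# Rough cost of the Step 5 sale-leaseback, per pupil and district-wide.
# Per-pupil figures are the ballpark numbers from the text; the
# enrollment of 5,000 students is an assumption for illustration.
own_cost_per_pupil = 1200    # annual M&O on district-owned buildings
lease_cost_per_pupil = 3000  # triple net lease back from the REIT

extra_per_pupil = lease_cost_per_pupil - own_cost_per_pupil
enrollment = 5000            # hypothetical district size
extra_annual_cost = extra_per_pupil * enrollment

print(extra_per_pupil)    # → 1800
print(extra_annual_cost)  # → 9000000
```

In other words, under these assumptions the leaseback roughly trebles annual facilities costs, an extra $9 million a year for a 5,000-student district, on top of the management fee.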

Any takers? Scarsdale? Millburn? Blue Valley (KS)?

Chartering for Thee [& all that comes with it] but Not for Me?

I’m currently in the middle of several research projects which, as they sit on my plate, are not directly related, but intersect. In one set of projects, I have worked with colleagues Preston Green and Joseph Oluwole to better understand how the increasingly complex public-private structures emerging in the charter school sector alter the rights of various constituents – parents, taxpayers, students and employees. We have published two law review articles on these topics:

Separately, I’m involved in a number of projects more central to my own primary area of research involving better understanding variations in schooling resources across schools, districts and children. This includes past work on New York City, Texas and Ohio charter schools, where one of the products of that work will finally appear in the journal Education Finance and Policy this summer. Currently, I continue to explore variations in school site expenditures, including staffing and instructional staffing expenditures by operator type and location – within and across the mix of charter and district providers.

Finally, I’m also trying to finally get a better handle on capital financing issues related to charter schooling in order, if nothing else, to be able to provide instructive summaries of the various mechanisms commonly used, comparison to traditional district municipal financing, and the cost, efficiency and legal rights issues across the various mechanisms. This turns out to be an ugly, messy endeavor, which is why I’ve tried so hard to avoid it so far.

I also love making maps. It’s just fun to see how policies play out in geographic space and with respect to demography.  This post is mainly about maps. But it’s those maps that really heightened my concern over questions I’ve raised previously regarding the distribution of lost rights (law review papers above).

Charter schooling – or more specifically “Chartering” – is pitched most specifically as a solution for long failed “urban” (in quotes for a reason) schools. Point of clarification. I consider “charter schooling” a phrase that represents the original “movement” which through various state statutory structures permitted the start up of independently governed and operated, publicly financed schools. “Chartering” is a more aggressive policy intervention whereby state and local policy makers engage more directly in promoting the expansion of charter schooling by converting district schools to charters, closing district schools to pave the way for charter expansion, transferring district capital assets to charter operators, and generally dismantling the public district in order to expedite its replacement with a “portfolio” of charter operators.

The assumption of the most aggressive “chartering” advocates (or relinquishers – a particularly twisted/warped framing) is that aggressive steps are needed and with all deliberate speed (no time to worry about understanding the law, history, or even why current problems exist) to “save ‘urban’ lives:”

Again, a core assumption of the movement is that we’ve tried everything, including pouring massive sums of money into urban districts – more than they could ever possibly even need – to achieve reasonable outcomes. But we haven’t.

To the extent we ever have put in effort to improve resources, it has actually produced positive results – more broadly and consistently positive than “chartering” as a movement. Evidence consistently points to the importance of financial resources for improving schooling quality.

Yes, some specific charter operators have produced impressive test score gains. Interestingly, these also tend to be very well resourced charter operators, often spending 50% more than district schools, providing substantially longer school days and years, paying their teachers more and providing them smaller class sizes and much smaller total student loads. That is, those highly successful charter operators (as opposed to the dreadfully failing ones) may in fact be providing greater support for the assertion that money matters than for the assertion that “chartering” matters.

As I’ve explained time and time again on this blog, there are many features of “chartering” that require much closer scrutiny – and more systematic evaluation (more so than media or blog reports of “scandal”).  Here are but a few “features” of chartering (and to an extent, “charter schooling”) that I’ve either discussed previously, or are emerging as part of my (or others) current studies.

Feature 1: Compromising the Legal Rights of Taxpayers, Employees and Children

As I’ve explained in previous posts, these particular issues vary by state, due both to differences in the language of state charter school laws and to relevant case law. But, as we explain in our law review articles above, there remain, in nearly every circumstance, significant differences between the rights of various constituents under traditional Local Education Agency governance and their rights under mixed public-private governance. [more on this table in this post]

Chartering vs. Traditional District Schooling

Governance

– Local Education Agency: Governed by public officials (with all rights & immunities), elected or appointed. Necessarily subject to open public records & open meetings laws. Necessarily required to comply with public bidding requirements. Necessarily required to publicly disclose employee contracts.

– Privately Governed Charter (Non-State Actor): Governed by an appointed (or self-appointed) board of private citizens. May not be subject to open records or meetings laws. May not be required to engage in public contract/bidding requirements. The private appointed board may hire a private management firm.

Finance

– Local Education Agency: Required to disclose finances (reported relatively consistently in most state data systems, including detailed AFRs (annual financial reports) & public posting of budgets).

– Privately Governed Charter: Usually required to report expenditure of public funding. State data systems spotty and inconsistent on charter school revenue/spending data (may be required to disclose IRS filings [Form 990]).

Disclosure

– Local Education Agency: Public officials subject to open meetings laws. All documents, employee contracts, financial documents & communications between officials subject to open records laws.

– Privately Governed Charter: Board members & managers may not be subject to open meetings laws. Many documents and contracts with the private manager, etc., considered private/proprietary.

Employees

– Local Education Agency: Public employees with key constitutional and statutory protections.

– Privately Governed Charter: Private employees, forgoing certain rights to bring legal challenges against their employer.

Students

– Local Education Agency: Retain rights to not have their government (school) infringe on various constitutional and statutory rights, and to uphold key statutory obligations.

– Privately Governed Charter: Students may forgo numerous rights under privately governed discipline codes.

Recent Evidence on Children’s Rights in New York City Charter Schools

Specifically pertaining to the treatment of children under charter school discipline policies, Advocates for Children found the following:

  1. 107 of the 164 NYC charter school discipline policies we reviewed permit suspension or expulsion as a penalty for any of the infractions listed in the discipline policy, no matter how minor the infraction. By contrast, the New York City Department of Education’s (DOE) Discipline Code aligns infractions with penalties, limiting suspension to certain violations and prohibiting expulsion for all students under age 17 and for all students with disabilities.6
  2. 82 of the 164 NYC charter school discipline policies we reviewed permit suspension or expulsion as a penalty for lateness, absence, or cutting class, in violation of state law.
  3. 133 of the 164 NYC charter school discipline policies we reviewed fail to include the right to written notice of a suspension prior to the suspension taking place, in violation of state law.
  4. 36 of the 164 NYC charter school discipline policies we reviewed fail to include an opportunity to be heard prior to a short-term suspension, in violation of the U.S. Constitution, New York State Constitution, and state law.
  5. 25 of the 164 NYC charter school discipline policies we reviewed fail to include the right to a hearing prior to a long-term suspension, in violation of the U.S. Constitution, New York State Constitution, and state law.
  6. 59 of the 164 NYC charter school discipline policies we reviewed fail to include the right to appeal charter school suspensions or expulsions, even though state law establishes a distinct process for charter school appeals.
  7. 36 of the 164 NYC charter school discipline policies we reviewed fail to include any additional procedures for suspending or expelling students with disabilities, in violation of federal and state law.
  8. 52 of the 164 NYC charter school discipline policies we reviewed fail to include the right to alternative instruction during the full suspension period, in violation of state law.

http://www.advocatesforchildren.org/sites/default/files/library/civil_rights_suspended.pdf?pt=1

Indeed, many of these policies were found to be non-compliant, and thus corrective action may be in order. Perhaps a review of district schools’ policies would also turn up violations. But it seems likely that the expansion of charter schooling in the city has led to a proliferation of non-compliant, student-rights-trampling discipline policies – policies that may explain (likely explain) disproportionate suspension rates.

Given the prevalence of these policies in NYC charter schools (which are among the better resourced, well established schools in the nation), one might easily argue that these policies are a “feature” of “urban chartering,” not an outlier or a bug.

Selling off Public Assets & Draining Operating Resources

This is an area I’m just beginning to get a handle on, and much of the evidence in this area is anecdotal, but as it comes together, it points to a handful of common models involving charter governance, land deals and facilities lease arrangements. One of my big concerns is that, among other things, public assets including valuable land and school facilities are being “relinquished” as district school enrollments drop – often these days because district officials themselves are forcibly closing their schools and handing them over to charter operators – or, sending out pamphlets to parents telling them that the charter schools are better, so choose them, not us. In many cases, citywide enrollments are remaining relatively constant. That is, the number of children that need to be served isn’t changing. Children are being shifted from district schools to charter schools. District facilities (land and buildings representing the investment of taxpayers over decades) are being sold at bargain rates, and there’s no turning back. Many urban districts now lack the capital assets to serve the children they would be responsible for serving, were the charter sector to suddenly collapse. (2013_njeda_teamacademycharterschool_pos , http://njparcels.com/property/0714/1801/15 , http://njparcels.com/sales/0714_2570_1 , http://njparcels.com/property/0714/2569/1 , http://njparcels.com/property/0714/2570/1 , and Elsewhere).

Then there are these particularly suspect (and illegal) examples, which involve complicated intersections of governance and land/real estate and facilities financing.

Imagine/Renaissance Deal

In a case decided in Federal District Court in August, 2014, it was found that Imagine Schools Inc. had engaged in several suspect governance and finance arrangements. Generally, in terms of governance, the court explained:

Imagine Schools recruited the board members, arranged for the board members to apply for the charter and then entered into an Operating Agreement with the Renaissance Board that required the Board to give Imagine Schools all of the tax revenues that the Board was entitled to receive as a charter school. Under Missouri law, Imagine Schools could not obtain that revenue stream itself absent the formation of the Renaissance Board.

In short, there is no evidence that Imagine Schools made any effort to recruit an independent board or to strengthen the independence of the Renaissance Board once selected. In fact, it is the policy of Imagine Schools to control the board rather than vice versa, as evidenced by the statement of Dennis Bakke, the owner and founder of Imagine Schools. Mr. Bakke clearly believed that the Renaissance Academies belonged to Imagine Schools and that the job of the Renaissance Board was to go along with Imagine Schools’ decisions unless Imagine Schools was engaging in illegal activity. In fact, Mr. Bakke encouraged his executives to limit and discourage board member control of “Imagine’s” charter schools by obtaining pre-signed, undated resignation letters from board members at the time they joined a board, so that board members could be expelled any time they asserted too much authority. Id. It is therefore not a surprise that Mr. Rogers, with all his experience as a public school administrator, did not understand that, in contrast to the status of the Renaissance Board, Imagine Schools is one of the nation’s largest charter school management companies and specializes in managing the operations of charter schools.

This case also involved a facilities leasing twist whereby the initial owner of the property leased to the school (SchoolHouse Finance) was an arm of the management company (Imagine) itself. Among other things, the court found that the property owner had gouged the charter school, charging a rate 2% higher than appropriate. The court explained that Imagine used its SchoolHouse Finance arm to flip the property: “SchoolHouse Finance sold the buildings to EPR Properties, a real estate investment trust, in order to free itself up to make more real estate purchases for other charter schools it was starting. EPR Properties then leased the properties back to SchoolHouse Finance for an annual rental rate of approximately 10 percent of the total development cost of the properties.” SchoolHouse itself had been charging 12%. The court mandated repayment of the additional 2% that had been cumulatively charged by SchoolHouse, accepting as reasonable the rate charged by EPR.
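Scaled against a hypothetical development cost, the overcharge the court unwound looks like this. Only the two lease rates (12% charged vs. 10% deemed reasonable) come from the case; the $10 million development cost is an illustrative round number:

```python
# Illustration of the SchoolHouse Finance overcharge found by the court.
# The 12% and 10% rates come from the case; the development cost is a
# hypothetical round number for scale.
development_cost = 10_000_000  # assumed total development cost

schoolhouse_rent = 0.12 * development_cost  # rate SchoolHouse charged
benchmark_rent = 0.10 * development_cost    # EPR rate the court accepted

annual_overcharge = schoolhouse_rent - benchmark_rent
print(round(annual_overcharge))  # → 200000
```

At that scale, a 2-point rate difference compounds to $200,000 per year flowing from the school’s operating funds to its own management company’s real estate arm.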

Chester Community Charter School

An audit of Chester Community Charter School in Pennsylvania revealed similar issues.

“Chester Community Charter School (Charter School) improperly received $1,276,660 in state lease reimbursements for buildings that were ineligible for those payments. We question these buildings’ eligibility since one of the Charter School’s Founders previously owned them and later transferred them to a related nonprofit (Nonprofit) established for the sole purpose of supporting the Charter School. We also found that the Charter School’s Founder was the buildings’ landlord until October 2010. Furthermore, this same individual started a for-profit Management Company for which he is currently its Chief Executive Officer (CEO). This Management Company runs the Charter School, and the Management Company and the Nonprofit are located at the same address. These ownership transfers and questionable transactions among associated individuals and entities created circular lease arrangements among related parties sharing ownership interest in the buildings.” (p. 12)

“In October 2010, the Charter School Founder/Management Company CEO sold the buildings to a newly created Nonprofit that he and some associates created with the primary purpose of leasing the properties back to the Charter School. The buildings were sold to the Nonprofit for $50.7 million and financed through a municipal bond.” (p. 12-13)

At that time, a new 30 year lease agreement was created between the Charter School and the Nonprofit effective October 9, 2010 to August 31, 2040. According to the Nonprofit’s Internal Revenue Service tax returns (2010, 2011, and 2012), all of the Nonprofit’s reported income and expenses have been related to the Charter School’s leased buildings. (p. 13)

Yes, these are cases where institutions went beyond the scope of permissible behavior, got caught and ended up paying a price. But it is through these cases, litigation and audits that we better understand the legally employable mechanisms that set the stage for these actions.

A common mechanism in each case is that some private entity is created to take on bond debt to acquire land and a facility (be it previously public property or not). That entity renovates the property to be minimally suitable for running a charter school, and it may also serve as the leasing agent for the first few years in which the charter operates. Amazingly, in the Kansas City (Renaissance charter) case, that leasing agent was found to have gouged the school even more than the for-profit entity to which the property was later flipped.

Notably, several charter schools around the country have lease arrangements with EPR. [http://www.eprkc.com/portfolio-overview/public-charter-schools-list/ ] As an example, one of the few schools in New Jersey with an EPR lease (or at least listed on the EPR site) is Camden Community Charter School (affiliated with Chester Community Charter School), the school that reports by far the highest administrative expenses (likely significantly influenced by contracted lease payments) of any in the state of New Jersey. CCCS spends a reported $5,325 per pupil on administration, or 43.6% of its total spending. Similar cases/arrangements have been reported widely, from Michigan to Ohio to St. Louis. These are simply the emerging models for facilities acquisition and management in charter schooling in many places. They are what they are. It is conceivable that these mechanisms can be used in mutually beneficial ways. And it’s possible that they can be abused, as in the examples above. It’s also possible that a traditional district could sell its own buildings to EPR to raise cash, and then lease them back from EPR on a triple net lease. I’d love to know if any actually do.
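For a sense of scale, the two reported CCCS figures imply a total budget of roughly $12,200 per pupil (a simple derivation from the numbers above, not an independently reported figure):

```python
# Implied total per-pupil spending at Camden Community Charter School,
# derived from the two administrative figures reported in the text.
admin_per_pupil = 5325   # reported administrative spending per pupil
admin_share = 0.436      # administration as a share of total spending

total_per_pupil = admin_per_pupil / admin_share
print(round(total_per_pupil))  # → 12213
```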

These capital financing deals are pretty much a “feature” of the system, not an outlier, or bug.

Again, my concern is that we are allowing these practices to become the standard for “chartering”, largely unchecked if not endorsed and promoted, by policy. But would other communities allow the same? Do they want the same? Or is this some grand experiment we are willing to test out only on certain communities? On other people’s children?

And Who’s Lucky Enough to Lose those Rights & Assets?

Let’s take a spatial and racial look at the distribution of “chartering” using the 2013 Common Core of Data Public School Universe Survey. In the following maps of major metropolitan areas:

  • Large Red Circles = Schools with >80% Black Enrollment
  • Large Blue Circles = Schools with >80% White Enrollment
  • Large Green Circles = Schools with >80% Hispanic Enrollment
  • Smaller dots = more integrated schools
  • Yellow Stars = Charter Schools

Baltimore and Washington, DC

[map]

Detroit

[map]

Philadelphia

[map]

New York City/Newark, NJ

[map]

Ohio

[map]

Lots of yellow stars on big red dots. Not so many on big blue dots.

What Needs to be Done Right Now

I will have much more to say about these issues as my current batch of research projects progresses. I’m still trying to sort through it all. But certainly even in the short run, these issues need a closer look.

They need a closer look in part because they so disproportionately affect low income, urban and minority communities.  At the very least in the short term:

1. states must tighten charter school laws to guarantee that local constituents, including parents, students, taxpayers and employees have the rights afforded them that would be afforded to anyone by their relationship to a predominantly publicly financed elementary/secondary school.

2. states must scrutinize carefully any new/forthcoming or recent past transfers of public assets (land, buildings) by local public districts and establish policies to protect taxpayers against future such transfers and ensure that local public districts retain the capacity to serve the public good.

3. states must also scrutinize any/all facilities lease arrangements.

That’s a short list for now. There will be much more to come later this Summer.

Cheers.

What about those high income families that opted out long before the school year started?

Pro-Annual-Testing-of-Everyone pundits are all in a tizzy about Opt-Out. In their view, parents who opt out are severely compromising accountability for our public education system. They are eroding the public interest in the most selfish possible way. What seems to irk these pundits as much as anything is the possibility that the recent pattern of opting out appears (an empirical question for a later day) to be disproportionately occurring in upper-middle to upper income communities – a group over which pundits have little control or possible leverage [little opportunity for punitive policy – which drives them crazy].

So the pundits say the disproportionate opting out of upper income white children from testing will severely compromise the ability of policy makers to accurately measure achievement gaps between those children and the poor and minority children who more compliantly sit down, shut up and fill in the bubbles (ok… point and click).

If the affluent families opt out, we really won’t know how far behind those who are less affluent really are.

Will we?

But do we know anyway?

This whole line of reasoning is yet another example of the lack of demographic/contextual understanding and related number sense of those making these arguments. The edu-pundit-innumerati strike again!

These same innumerate pundits previously claimed that annual testing of everyone is absolutely necessary for accurately measuring within school and within district achievement gaps among student subgroups, totally failing to understand that few schools and districts – even when everyone is tested – actually have sufficient populations of subgroups for measuring gaps – and further that the approach most often used for measuring gaps is total BS – statistically that is. Actually, measuring within school and within district gaps and using those measures to penalize schools and districts ends up selectively penalizing only those schools for whom the gaps can be measured – integrated schools.

So then, why is this new argument equally statistically and demographically bankrupt? Certainly, if those not taking the test were disproportionately of a certain race or of higher income, average scores would be biased – and likely biased downward for any data aggregation that would/should include these families. So then, of course it’s a problem, right?

Well, yes… and no.

What the edu-pundit-innumerati fail to realize is that there already exist larger shares of disproportionately higher income kids in nearly every state who are already opting out of these assessments by opting into schools that generally don’t give them. Are these kids somehow not an issue of public policy concern, merely because they attend private schools (or homeschooling)?

If parents in Scarsdale, NY or Millburn, NJ opting out of state assessments matters toward our understanding of gaps in educational opportunity across children of the state by income and race, then so too do the unmeasured outcomes of children opting out of the public education system as a whole.

Here are the numbers for children between the ages of 8 and 17 (those who might fall in tested grades) for New Jersey and New York.

In New Jersey, over 110,000 children between the ages of 8 and 17 attend private schools (just under 150,000 when summing enrollments for k-12 private schools).

[chart]

In New York, over 300,000 children attend private schools (just under 350,000 when summing enrollments for k-12 private schools).

[chart]

In each state, over 10% of children in this age range do not attend the public schools.

[chart]

In New Jersey, the average Total Family Income of those in private schools is about $160k, compared to about $110k for those attending public schools.

[chart]

In New York, the average Total Family Income of those in private schools is about $140k, compared to $87k for those attending public schools.

[chart]

In other words, these states, among others, have relatively large shares of kids outside the system entirely, and the average income of their families is much higher than the average income of those inside the system.

That is, there already exists substantial bias – due to omitted data – in our measurement of gaps in educational outcomes!
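A quick weighted-average sketch with the New Jersey figures shows the scale of that omitted-data bias. Rounding the private-school share to 10% (it is only approximately that), and using family income as a stand-in for the unmeasured characteristic:

```python
# Omitted-data bias: the public-school-only average misses the ~10% of
# children in private schools, whose families have much higher incomes.
# Figures are the approximate NJ numbers from the text; the private
# share is rounded to 10% for illustration.
public_mean_income = 110_000   # avg family income, public school students
private_mean_income = 160_000  # avg family income, private school students
private_share = 0.10           # approximate share outside public schools

all_children_mean = (1 - private_share) * public_mean_income \
    + private_share * private_mean_income

bias = all_children_mean - public_mean_income
print(round(all_children_mean))  # → 115000
print(round(bias))               # → 5000
```

Under these assumptions, the public-school-only average already understates the all-children average by about $5,000 – before a single affluent public school family opts out of a test.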

Should we try to mitigate any additional bias? Perhaps. But can we pretend that if we do – if we reduce opt-outs among affluent public school attendees – we’ve adequately measured outcome equity? Uh… no.

Here’s the breakout of those enrollments by primary affiliation of school, based on the most recent Private School Universe Survey from NCES.

[chart]

[chart]

So, is the National Catholic Education Association on board yet [w/CCSS perhaps, but the tests?]? Are they fully adopting/implementing annual testing of everyone?

How about the most (economically) elite schools in this mix, most of whom are members of the National Association of Independent Schools?

The reason why our School Funding Fairness report includes measures of “coverage” and income gaps by coverage is to make clear that even our measures of fiscal disparity across children attending public schools in any state suffer from the bias resulting from our inability to capture the resources available to the relatively large shares of children not in the public system at all, which, for 5 to 17 year olds, exceeds 20% in states like Delaware and Louisiana.

So, to those in a tizzy about opt out.

Chill.

Annual testing of everyone really isn’t annually testing everyone anyway, and as a result, really isn’t serving the public interest as well as you might think!

Innumerati: Blatantly, belligerently mathematically and statistically inept political hacks who really like to use numbers and statistical arguments to make their case. Almost always out of context and/or simply wrong.

Friday Graphs: Bad Teachers? or Bad Policy & Crappy Measures in New York?

A while back, I wrote this post explaining the problems of using measures of student achievement growth to try to sort out “where the bad teachers go.”

The gist of the post was to explain that when we have estimates of student achievement growth linked to teachers, and when those estimates show that average growth is lower in schools serving more low income children, or schools with more children with disabilities, we really can’t tell the extent to which these patterns indicate that weaker teachers are sorting into higher need settings, or that teachers are receiving lower growth ratings because they are in high need settings. The reformy line of argument is that it’s 100% the former. That bad teachers are in high poverty schools, and that it’s because of bad teachers that these schools underperform. Fire those bad teachers. Hire all of the average ones waiting in line.

Even the best measures of student growth, linked to teachers, addressing as thoroughly as possible the numerous contextual factors beyond teachers’ control, can’t totally get the job done – isolating only the teacher… well… classroom level… effect. And, as I’ve noted in previous posts, many if not most state and district adopted measures are far from the best… or even respectable attempts.
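As a stylized illustration of that point, here’s a small simulation – entirely my construction, not any state’s actual growth model. Every teacher has an identical true effect by design; an assumed, unmeasured contextual drag depresses observed growth in high-need schools, so naive “growth ratings” diverge anyway:

```python
import random
import statistics

random.seed(1)

TRUE_TEACHER_EFFECT = 0.0   # identical for every teacher, by construction
CONTEXT_PENALTY = -0.3      # assumed unmeasured drag in high-need settings

def observed_growth(high_need: bool) -> float:
    """Observed growth = teacher effect + school context + noise."""
    context = CONTEXT_PENALTY if high_need else 0.0
    return TRUE_TEACHER_EFFECT + context + random.gauss(0, 0.1)

low_need = [observed_growth(False) for _ in range(500)]
high_need = [observed_growth(True) for _ in range(500)]

# Despite identical true effects, average "ratings" differ by context alone.
print(round(statistics.mean(low_need), 2))   # ~ 0.0
print(round(statistics.mean(high_need), 2))  # ~ -0.3
```

A rating system that can’t separate the context term from the teacher term will reproduce exactly the patterns discussed below.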

I explain in this policy brief that in New Jersey, factors including student population characteristics, average resource levels available in schools, competitive wages of teachers (relative to surrounding districts) and other factors are significant predictors of differences in school average “growth” ratings. In New Jersey, schools with more resources, less needy students, and higher average scores to begin with get significantly higher growth ratings.

I also showed in this post that in both Massachusetts and New Jersey, teachers in schools with larger shares of female students are less likely to receive bad ratings (Massachusetts) or, conversely, are in schools receiving higher growth scores (New Jersey). The implication, accepting reformy dogma about what these measures mean, is that our best teachers are teaching the girls.

So then what about those New York teacher ratings I addressed in the previous post? We saw, for example, that teachers rated “Ineffective” on the growth measure tend to be in high poverty schools:

Slide6

Tend to be in schools with larger classes:

Slide5

And those really effective teachers tend to be in schools with lower poverty and smaller classes.

So, does that mean that the “great” teachers are just getting the cushy jobs? Or is the rating system simply labeling them as such?  While there may indeed be some sorting, especially in a state with one of the least equitable funding systems in the nation, it certainly seems likely that the estimates of teacher effect on student achievement growth… well… simply suck! They don’t measure what they purport to measure.

They measure, to a large extent, the conditions into which teachers are placed, and NOT the effect of teachers on student outcomes.

Combining the above factors into a logistic regression analysis to predict how a handful of conditions affect the likelihood that a teacher is rated either “ineffective” (you really suck) or “developing” (you kinda suck, and we’ll tell you you really suck next year), we get the following:

NY Ratings Bias

So, even when considered together (holding the others “constant”), teachers in schools with larger classes (at constant low income share and funding gap) have a greater likelihood of being rated “bad.” Teachers in schools with higher low income concentrations, even if class sizes and funding gaps are the same, are much more likely to be rated “bad.” But teachers in districts with smaller state aid gaps are less likely to be rated “bad.”

So, on the one hand, we can stick to the King’s grand plan….

  • Step 1 – Disproportionately label as “bad” those teachers in schools serving more low income kids, and doing so with fewer resources, including larger class sizes, and dump those lazy failing teachers out on the street…
  • Step 2 – Wait for that long line of “average” teachers to sign up to take their place… stepping into the very same working conditions of their predecessors, which likely led, at least in part, to those bad ratings….
  • Step 3 – Repeat

And the cycle continues, until a) those conditions are improved and b) the measures for rating teacher effect are also improved (if they even can be).

Alternatively, maybe the actual policy implication here is to a) reduce aid gaps and b) use that funding to improve class sizes?

UPDATE –

I figured I’d go check out that gender bias issue I found in NJ and MA. And wow – there it is again. I’ve rescaled the low income concentration and female concentration effects to relate odds changes (of being labeled bad) to a 10 percentage point shift in enrollment (e.g. from 50% to 60% low income, or female). Here are the updated model results:

NY Ratings Bias

So once again – is it that all of the “bad” teachers are teaching in schools with higher percentages of boys? Or is something else going on here? Are teachers really sorting this way? Are they being assigned by central office this way? Or is there something about a class with a larger share of boys that makes it harder to generate comparable gains on fill-in-the-bubble standardized tests? Why do the girls get all the good teachers? Or do they?
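The rescaling mentioned above is simple in principle. A minimal sketch – the coefficient value here is hypothetical, chosen only to show the arithmetic; the actual NY estimates are in the table:

```python
import math

# Hypothetical log-odds coefficient per 1 percentage point of enrollment share
b_per_point = 0.03

# Odds ratio for a 1-point shift vs. a 10-point shift (e.g., 50% -> 60%)
or_1pt = math.exp(b_per_point)
or_10pt = math.exp(b_per_point * 10)

print(round(or_1pt, 3))   # 1.03
print(round(or_10pt, 3))  # 1.35
```

Reporting odds changes per 10-point shift just makes the magnitudes easier to interpret; it changes nothing about the model itself.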

Relinquishing Efficiency: NOLA Data Snapshots

There’s always plenty of bluster about the post-Katrina NOLA miracle. I’ve done a few posts on the topic, but none recently.

See:

The NOLA model of “relinquishment” continues to be pitched as a handy-dandy reformy solution for dismantling the dysfunctional urban school district and achieving miraculous gains in overall student outcomes (like those reported by CREDO) – of course, at little or no increased expense. Indeed, this latter piece is merely implied by the complete and utter silence on the question of just how much money is being thrown at this alternative model in order to prove it “works.”

The purpose of this post is merely to put some of this NOLA bluster into context, using readily available data sources, including the NCES Common Core Public School Universe and NCES Fiscal Survey of Local Governments, along with CRDC/Ed Facts data released for states to conduct equity analyses to support their “teaching equity” plans.

First off, here are the pre- to post-Katrina enrollment patterns for district and charter schools identified as within the city boundaries of New Orleans:

Slide1

City enrollments remain far lower than they were pre-Katrina, and any comparisons of the present to that era, or even to the immediate post-Katrina era, when nearly all students remained displaced, are not useful. Most students are now in charter schools, meaning that establishing a “counterfactual” comparison of charter students against “non-charter” students, as in the typical charter pissing match studies, is, well, rather difficult if not implausible.

As one might expect, once you’ve got most kids in charter schools, the charters must somewhat mirror the population that had been in district schools, and that remains in the few non-charters as of the final year of these data.

Slide2

Really, no surprises here. Of course, we might find a different story if I had readily available data on children with disabilities, by the severity of those disabilities.

This next graph shows the per pupil current spending over time.

Slide3

Now, that spike in 2006 is NOT because all of a sudden NOLA schools spent a whole lot more, but rather because the denominator – pupils – nearly disappeared. Per pupil spending goes up when pupils decline, if spending does not decline commensurately.
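A quick sketch of that denominator effect, using hypothetical round numbers (not actual NOLA figures):

```python
# If total spending stays roughly flat while pupils vanish,
# per pupil spending spikes mechanically.
total_spending = 500_000_000   # assumed flat across the storm
pupils_before = 65_000         # assumed pre-Katrina enrollment
pupils_after = 26_000          # assumed immediate post-Katrina enrollment

print(round(total_spending / pupils_before))  # 7692
print(round(total_spending / pupils_after))   # 19231
```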

It’s a simple math thing. But, even after the system stabilized at its new level, the state of Louisiana has seen fit to boost spending for the Recovery School District to 55% higher than state average spending. Prior to Katrina, NOLA schools were merely at parity with state averages. That’s a substantial boost. And one I’m certainly not complaining about, given the needs of these children. But certainly any claims of NOLA miracles, if they do exist, must include conversation about the “massive infusion of funding,” in relative terms, associated with this “relinquishment” experiment.

This increase (relative to surroundings) is greater than the boost received by Newark, NJ at any point during school funding litigation in NJ.

And where has some of that money gone? Well, this graph shows transportation expenditures per pupil over time.

Slide4

While a bit volatile from year to year, the NOLA experiment seems to be leading to at least DOUBLE state average (non-rural) transportation spending per pupil – AND this is occurring in the most population dense part of the state, where one would expect average transportation costs to be lower. To put these figures into context, taking the margin of difference in transportation spending as about $600 per pupil in the most recent year, that figure is about 6% of the state average $10k per pupil operational expense (that is, consuming the first 6% of the 55% elevated RSD spending, leaving a non-transportation RSD margin of 49% – still a healthy boost).
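The arithmetic behind those margins, using the figures as stated above:

```python
state_avg_per_pupil = 10_000   # approximate state average operating expense
rsd_total_margin = 0.55        # RSD spending margin over state average
transport_margin = 600         # extra transportation spending per pupil

transport_share = transport_margin / state_avg_per_pupil
non_transport_margin = rsd_total_margin - transport_share

print(transport_share)                 # 0.06
print(round(non_transport_margin, 2))  # 0.49
```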

But what had been going on at ground level – within the “district” across schools – when there still existed district and charter schools? Here are some snapshots of total staffing expenditures per pupil by school organized first by low income concentration and then by special education.

Slide5

Slide6

Visually, it would certainly appear that the edge was being given to charter schools in terms of resources. In which case, any policy inferences based on assertions that charter schools yielded better outcomes should certainly consider the influence of the additional resources. To clarify, the following table shows the output of a regression comparing per pupil staffing expenditure across charter and “other” schools in New Orleans, for schools serving similar shares of low income children and children with disabilities, and serving similar grade range distributions.

Slide7

On average, the CRDC/Ed Facts data indicate that charter schools in New Orleans were spending $1,604 per pupil more than were “other” schools serving similar student populations. And that’s a hefty boost given that spending ranged from about $4,000 to $8,000 for most schools. That’s 40% of the $4,000 figure.

Again, any interpretation of differential effectiveness of charters versus other schools in New Orleans should consider the potential relevance of a 40% differential in staffing expenditure per pupil.
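That 40% figure depends on the baseline you choose. A back-of-envelope check (not the regression itself), using the numbers stated above:

```python
charter_premium = 1_604                       # regression-adjusted difference per pupil
spending_low, spending_high = 4_000, 8_000    # approximate range for most schools

# The premium runs ~40% of the low end of the range, ~20% of the high end.
print(round(charter_premium / spending_low, 2))   # 0.4
print(round(charter_premium / spending_high, 2))  # 0.2
```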

Setting aside the HUGE ACCOUNTABILITY concerns associated with this model (which no one should ever set aside), and significant concerns over the legal rights of children and taxpayers (again, which should never be set aside), there are some potential lessons for pundits and policymakers here. If there is even a success story to be told in NOLA (of which I remain unconvinced), that success isn’t free, and it isn’t cheap.

Many pundits over time have ridiculed the Kansas City desegregation plan of the 1990s as the most inefficient experiment in social engineering of all time. Now, there’s much misguided bluster – urban legend – in those characterizations, as I’ve written in the past. Perhaps one of my greatest fears about the NOLA experiment is that it will provide more fodder for the assertion that money doesn’t matter. Heck, they’ve thrown a lot of money at this so far. They’re just not talking about it. It’s being spent on exorbitant transportation costs, among other things.

Strangely, for now, all I hear is silence from the anti-spending, efficiency warriors of the ed policy world when it comes to NOLA.  Does that mean that money really matters (accepting the NOLA miracle characterization), or, alternatively, is NOLA proving (by not substantively improving outcomes with a 55% boost in funding) that the inefficiencies of a 100% charter/choice/unified enrollment system are equal to or greater than those of the urban school district of the past?

Data notes:

The original data sources for the above analysis are:

  1. enrollment data: http://nces.ed.gov/ccd/pubschuniv.asp
  2. fiscal data (PPCSTOT – or current operating expenditure per pupil) http://www.census.gov/govs/school/
  3. CRDC/Ed Facts School Site staffing expenditure data: http://www2.ed.gov/programs/titleiparta/equitable/laepd.xlsx

For current operating expenditure comparisons, the State of Louisiana reports different per pupil spending figures, combining RSD-operated schools and Type 5 charters [whereas NCES reports RSD-operated schools; students shift over time from one – RSD-operated – to the other – charters – both under RSD governance]. Both, as far as I can tell from the relevant notations, exclude short term emergency funds. And both are current spending (excluding capital investment) figures. State data are reported below. Notably, the margin of difference is smaller than in the operating expenditure figure above. But, interestingly, as more students shift to Type 5 charters, the margin of spending difference increases.

This is a trend worth watching over time. This margin, which is still substantial (and growing), might be consumed almost entirely by increased transportation expense, but may also continue to rise (or not?).

Note that these differences are unrelated to the school level CRDC/Ed Facts analyses above, which include independently reported staffing expenditure data on individual school sites, where charter schools have sufficient additional resources to substantially outspend (+40%) non-charters. These large differentials (huge for some schools) are likely a function of privately contributed resources which may not be showing up in either the State or NCES data.

Finally, there’s rarely need to speculate or make anecdotal claims about data being “wrong” or “different,” or whatever, when one can simply look up the relevant data and make the relevant comparison. Tables w/relevant URL citations can even be conveyed via twitter!

| District(s) | 2011-12 | 2012-13 | 2013-14 |
| --- | --- | --- | --- |
| Other Parish Schools | $10,543 | $10,368 | $10,611 |
| Orleans Parish School Board | $14,273 | $14,601 | $13,527 |
| Recovery School District (Operated & Type 5 Charters)* | $11,420 | $11,665 | $11,998 |
| RSD Margin over “Other” | 8.3% | 12.5% | 13.1% |

Source: https://www.louisianabelieves.com/resources/library/fiscal-data

NYSED Recommends “Teacher Effectiveness Gnomes” to Fix Persistent Inequities

I guess I knew, when ED released their “teacher equity” regs in late fall of 2014, that we were in for a whole lot of stupid.

You see, there was some good in those regulations and the data released to accompany them. There was discussion of teacher salary and qualifications parity, and some financial measures provided that would allow states to do cursory analyses, based on 2011-12 data, of the extent to which there existed objectionable inequities in either cumulative salary expenditures per child across schools, or average salary expenditures. The idea was that states would set out plans to evaluate these disparities, using data provided and using their own data sources. And then, states would provide plans of action for mitigating the disparities. This is where I knew it could get silly.

But state officials in New York have far surpassed my wildest expectations. Here’s their first cut at this issue: http://www.regents.nysed.gov/meetings/2015Meetings/April/415p12hed2.pdf

In this memo, NYSED officials identify the following inequities:

According to the USED published equity profile, the average teacher in a highest poverty quartile school in New York earns $66,138 a year, compared to $87,161 for the average teacher in the lowest poverty quartile schools. (These numbers are adjusted to account for regional differences in the cost of living.) Information in the New York profile also suggests that students in high poverty schools are nearly three times more likely to have a first-year teacher, 22 times more likely to have an unlicensed teacher, and 11 times more likely to have a teacher who is not highly qualified.

& you know what? They’re right. Here’s the full continuum of average salaries and low income concentrations across NY state schools, first with, and then without NYC included.

Slide1

Slide2

As I’ve pointed out over, and over and over again on this blog, NY State maintains one of the least equitable educational systems in the nation. See, for example:

  1. On how New York State crafted a low-ball estimate of what districts needed to achieve adequate outcomes and then still completely failed to fund it.
  2. On how New York State maintains one of the least equitable state school finance systems in the nation.
  3. On how New York State’s systemic, persistent underfunding of high need districts has led to significant increases of numbers of children attending school with excessively large class sizes.
  4. On how New York State officials crafted a completely bogus, racially and economically disparate school classification scheme in order to justify intervening in the very schools they have most deprived over time.

Ah, but I’m just blowin’ hot air again, about that funding stuff, and the fact that NY State continues to severely underfund the highest need districts in the state, like this:

Slide2

But I digress. Who needs all of this silly talk (and actual data) about funding disparities anyway? And what do funding disparities possibly have to do with teacher equity problems, or salary disparities like those identified above by NYSED using USED data?

Well: https://www.youtube.com/watch?feature=player_detailpage&v=wfgnNI9-ImY&list=PLuzsMod17tiHrlaBvDcm2us_k68uxZcSy#t=801

Of course, NYSED officials know better – much better – what’s behind those ugly salary and, ultimately, teacher qualification disparities plaguing NY State schools. The ED regs require that states first identify problems/disparities. Then, ROOT CAUSES, thus leading to logical policy interventions – Strategery at its finest!

PROBLEM –> ROOT CAUSE –> STRATEGERY

So what then are the root causes of the disparities identified above by NYSED?

Through the collaborative sharing of lessons learned through the STLE program and research, the Department has determined that the following five common talent management struggles contribute significantly to equitable access:

  1. Preparation
  2. Hiring and recruitment
  3. Professional development and growth
  4. Selective retention
  5. Extending the reach of top talent to the most high-need students

Although the Department believes the challenges described here are reflective of broad “root causes” for the statewide equity gaps, it is still important for each LEA to examine their unique equity issues and potential root causes. In talking with superintendents, principals, and teachers involved in STLE, the Department was able to see that equity gaps that appear similar across contexts may in fact stem from different root causes in various LEAs. For example, one district struggling with inequitable access for low-performing students may find that inequities stem from a pool of low quality applicants, whereas a second district may find that they have a large pool of high quality applicants but tend to lose top talent early in their careers to neighboring districts who offer more leadership opportunities for teachers.

Ah… okay… I thought equitable funding to actually pay equitable salaries might have had something to do with it. How silly am I? It’s about bad teacher preparation programs which somehow produce bad teachers who ask for lower salaries in high poverty districts? And high poverty districts selectively retaining only their bad teachers, intentionally, by just not paying well? It’s a conspiracy that can be fixed by clever talent development strategies. No money needed, except some chump change in competitive grants.

And thus, if we know that bad teacher prep and crappy local management of talent is the root cause, the solutions are really easy?

The Department believes the overall quality of teaching and learning can be raised through the implementation of comprehensive systems of talent management, including sound implementation of the teacher and principal evaluation system.

Key Component 1 (Educator Preparation): The Department will continue to support and monitor improvements to access and entry into the profession, such as the redesign of teacher and principal preparation programs through performance-based assessments, clinically grounded instruction, and innovative new educator certification pathways.

Key Component 2 (Educator Evaluation): With the foundation laid by Education Law §3012-c, the Department will continue to provide support and monitoring to LEAs as they implement enhanced teacher and principal evaluation systems that meaningfully differentiate the effectiveness of educators and inform employment decisions.

Key Component 3 (The TLE Continuum): The Department will provide resources and support to LEAs utilizing evaluation results in the design and implementation of robust career ladder…

All that’s missing from this brilliant plan are the teacher effectiveness gnomes.

So yeah… it all comes down to the state’s brilliant model for rating, ranking and dumping “bad” teachers to open the door to all the really good teachers who are currently waiting in line to work in schools that …

serve high concentrations of low income and minority students,

Slide6

have larger class sizes,

Slide5

and still (and moving forward) have the largest state aid shortfalls!

Slide4

What’s really great about all of this is that these teachers – all chomping at the bit to work in these schools for low pay – can have it all! Funding gaps and greater needs. Note that the majority of “ineffective” teachers (as so declared by the growth rating alone) are clustered in schools with high low income concentrations and big aid gaps. Interestingly, even those in districts with fewer low income children are also in districts with big aid gaps.

CRDC Ed Facts Data – NY State 2011-12

To summarize – the framework laid out by ED, was:

PROBLEM –> ROOT CAUSE –> STRATEGERY

The brilliant application of that framework by NYSED was:

Problem=Huge salary & teacher qualification disparities by school poverty

Root Cause=Bad teachers, Teacher Prep & Administration

Strategery=Talent Development (fire bad teachers)

Are you kidding me? Really? In my wildest dreams…

To clarify – if it wasn’t already sufficiently clear – I do not at all accept that the patterns above represent the actual distribution of teacher effectiveness. Rather, the crappy measures adopted by NYSED for rating teacher effect on growth systematically disadvantage those teachers serving needier students, in larger classes, and in schools with scarcer resources.

Yeah… I get it… NYSED and the Regents don’t pull the budget strings. The Gov has done that damage. But that doesn’t make the logic of the NYSED brief any less ridiculous!

Head… desk…

Angry Andy’s not so generous state aid deal: A look at the 2015-16 Aid Runs in NY

Not much time to write about this, but I finally got my hands on the state aid runs for NY state school districts, which were, in an unprecedented and utterly obnoxious move by the Gov, held hostage throughout the budget “negotiations” (if we can call it that).

Quick review – NY operates a state aid calculation formula built on the premise that each district, given its geographic location (labor costs) and pupil needs, requires a certain target level of funding to achieve desired outcomes.

Target = Base x Pupil Needs x Regional Cost

The state then determines what share of that target shall be paid by local districts, the rest to be allocated in state aid.

State Aid = Target – Local Contribution
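The two formulas above can be sketched directly. The function names and numbers below are mine, purely illustrative – not NYSED’s actual base cost, weights, or local share calculations:

```python
def funding_target(base: float, pupil_needs: float, regional_cost: float) -> float:
    """Target = Base x Pupil Needs x Regional Cost."""
    return base * pupil_needs * regional_cost

def state_aid(target: float, local_contribution: float) -> float:
    """State Aid = Target - Local Contribution (floored at zero)."""
    return max(target - local_contribution, 0.0)

# Hypothetical district: 30% need weight, 20% regional cost premium
target = funding_target(6_500, 1.30, 1.20)
print(round(target))                    # 10140
print(round(state_aid(target, 4_000)))  # 6140

# Lowering the base levels the target down for EVERY district, all else equal:
print(round(funding_target(6_300, 1.30, 1.20)))  # 9828
```

Note how a cut to the base ripples through multiplicatively, which is exactly why lowering it “levels down” targets across the board.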

A few really important points are in order before I move forward with the updated estimates. First, those targets are supposed to be aligned with costs of achieving desired outcomes. Higher outcomes cost more to achieve, with greater marginal cost effects where student needs are higher. As I’ve explained previously, the state has continued to increase those outcome targets, but has continued to lower the funding target. This is a formula for failure!

And, in 2015-16, they’ve done it again. The “base cost” figure which drives the formula has again been decreased, thus leveling down target funding across the board, all else equal.

Slide1

So, with this in mind, any/all funding gaps I discuss below should be considered only funding gaps with respect to what the state would like to pretend is its full funding obligation – a figure that, in reality, is low-balled and manipulated, substantially downplaying the true obligation with respect to current outcome goals. The actual full funding obligation, given increased standards over time, is likely much higher… much higher. There’s no excuse for lowering the target – and continuing, year after year, to push the date for hitting that target out further. None.

However, from the state perspective, this manipulative game of lowering the outcome target can make it appear that they are getting closer to hitting it. Separately, as I explained on another recent post, one can make the state aid shortfalls look less bad if one requires a higher local contribution, another game used in previous budget years.

Let’s start with the positive. Yes, the adopted state budget does, on average, increase per pupil state aid and does so in higher amounts in districts serving needier pupils:

Slide6

Not bad. We’ve got districts getting what would appear to be hundreds of dollars per pupil in increased state aid. But, remember, this is only a small dent in the funding gaps. Let’s first look at the funding gaps for 2015-16 for those districts Angry Andy called miserable failures who should be subjected to the death penalty.

Slide2

Here, we’ve got districts that, in the best case, are still being shorted around $1,500 per pupil in state aid. Every one of Angry Andy’s failing districts will continue to be substantially underfunded – against the state’s own low-ball estimates – for yet another year. All in the name of Angry Andy’s Awesome Austerity Experiment. Regarding a similar “experiment” in Kansas, a 3-judge panel noted that the state “is experimenting with our children which have no recourse from a failure of the experiment.”

And what about small city school districts, who recently had their case heard in Albany? Well, first off, some of them are among the Angry Andy failures.

Slide3

And generally, their state aid gaps remain large – really large. And again, these are gaps with respect to low-balled targets – and after jacking up the supposed local responsibility to fund those targets.

So, who’s to blame here? Well, obviously, it’s not the funding gaps – it’s those lazy teachers and the complicit administrators who give those teachers good ratings even when they can’t produce test score gains.

I close with an update of the 50 districts with the largest funding gaps going into 2015-16. And here they are:

Slide4

Slide5

For previous reports/lists, see:

  1. Statewide Policy Brief with NYC Supplement: BBaker.NYPolicyBrief_NYC
  2. 50 Biggest Funding Gaps Supplement: 50 Biggest Aid Gaps 2013-14_15_FINAL