Blog

Leaders and Laggards Lags!

A quick note on the Center for American Progress Leaders and Laggards report.

On pages 23 and 24, this report attempts to grade state school funding systems and their level of “innovation.” But the report pays no attention to a) whether these states actually perform well on any measures of outcomes, b) whether these states actually fund their schools well overall, or c) whether these states actually target any of that funding to where it’s needed most.

Quite simply, this report – at least the finance section – is complete garbage! One cannot possibly rate the “innovation” of a state school funding system without any regard for whether that system is sufficiently and equitably funded. You can’t stimulate innovation without investing in research and development or in the product itself! It really is that simple.

The best finance grades in the report are given to education funding laggards.

Yet, high-performing states that actually fund their systems well and target resources where needed most (Massachusetts and New Jersey) get lousy grades. This stuff is just plain silly!

====

To lighten the mood a bit, here’s Willy Wonka summarizing the Arizona school finance formula: http://www.youtube.com/watch?v=M5QGkOGZubQ

What do NJ Charter Schools Really Spend?

Getting back to the original point of my blog, this post is simply about introducing to the public discourse some actual data on NJ charter school spending. Back when I wrote my textbook on school finance, I found that DC charter schools were having to rely on private contributions to the tune of 14% of their annual operating expenses. One can obtain such information from IRS non-profit tax filings (IRS 990). I did a quick run of New Jersey Charter School IRS 990 filings for 2008, reflecting revenues and expenditures for 2007. I simply combined their tax filing information with their total expenditure information – which does include expenses for facilities.

What is most striking, but not surprising, is the degree of disparity among charter schools, driven substantially by differences in private fundraising. Also important to note is that many of these schools spend well over the assumed $11,000 to $12,000 per pupil constantly spun by the media these days. I’ve not yet aligned the performance data with these new financial data, as I need to return to my actual research agenda (this particular analysis is part of ongoing research).

Remember also that these schools presently serve few or no special education children, making $16,000 per pupil worth well over $18,000 (assuming 15% special ed students typically at double average cost).

You might say, hey, if the public only has to subsidize $11k to $12k and private contributors pick up the rest, it’s still a bargain for taxpayers, right? Perhaps – but the necessity to rely on $3k to $5k of private contributions for each charter child educated then seriously limits the potential expansion of charter schools.

NJ Charter IRS 990

Note, from my previous posts and work on private schools, that I have also shown that private independent day schools spend well above the average public expenditure. New Jersey private independent schools spent, in 2007, an average of $25k to $30k per pupil (day schools only), with some exceeding $30k, also based on IRS 990 data.

My previous research on staffing in charter schools (based on undergraduate college selectivity of teachers) has shown that charters in some states attempt to staff their schools in ways similar to elite private academies – the private independent schools.

There is at least anecdotal evidence that some New Jersey charter schools also wish to emulate elite private schools. For example, the Ethical Community Charter School was founded by individuals previously associated with the Ethical Culture Schools of New York City, including the Fieldston School, where I taught for 5 years. It is an absolutely amazing school which, by the way, spends well over $30,000 per child per year (even tuition is higher than that). I would argue that it will be quite difficult to emulate the ECFS schools of NYC on a mere $11k to $12k, and that substantial private fundraising will be required. But private fundraising shouldn’t be required.

Good schools cost money! Sometimes a lot of money. Good education is expensive, which is not to say that all expensive education is good. My point here is that we are not going to solve our “urban education” problems on the cheap ($11k to $12k per kid), or necessarily any cheaper than what we’re spending currently. Any attempt to do so is likely to cause more harm than good.

[for those hanging on to anecdotal information about private religious school tuition as their basis for assuming good schooling can be done dirt cheap – about $3,500 per kid – please read http://www.epicpolicy.org/files/PB-Baker-PvtFinance.pdf]

The Real NJ Graduation Scam?

Bob Bowdon, of Cartel fame, and E-3 make the claim that New Jersey’s poor urban districts are scamming the public and taxpayers by overstating graduation rates. About half of poor-district kids pass the HSPA test, but 85% graduate. Their brilliant solution to this problem, as I’ve noted previously, is to give kids the choice to attend charters – on the argument that charters are less likely to do such scamming? So, here are some fun numbers.

First, the percent proficient or higher on HSPA MATH Assessments by district factor group for 2008:

[Figure: Percent proficient or higher on HSPA math assessments, by DFG, 2008]

So, what we have here is that Charters (DFG R) actually had the lowest rate of kids proficient or higher on HSPA (matching my graph on previous posts, but lower here because only math is included). Yep, even lower than the poorest urban publics (DFG A). Yes, this is an average – among general ed test-takers – and averages conceal the highs… but they similarly conceal the lows.

Now, here are graduation rates for the schools by DFG:

[Figure: Graduation rates by DFG]

Wait one second. How can charters have a 97% graduation rate if only about half of the kids pass HSPA? Where’s the scam here? I thought you said that the differential between HSPA proficiency and graduation rates was supposed to be indicative of a scam? And that charters were the solution to the scam? But where is that differential bigger? Charters are lower on HSPA proficiency by a few points and are 12% higher on graduation rate? Now I’m really confused.

Okay – I’m not trying to pick on charter schools here. You guys are mostly working your butts off for a great cause, and quite honestly I don’t hear these completely absurd arguments coming from the charter leaders and teachers themselves. But the supposed “advocacy” out there on your behalf is deeply problematic. Quite honestly, if someone was out there advertising so poorly for my cause, I’d be a little concerned… or perhaps outraged.

Note to Non-Jersey readers about my casual use of Jersey terminology – DFG. In New Jersey, district factor groups or DFGs are a classification scheme that has been used for decades to characterize socio-economic features of public school districts. DFG A districts are generally poor urban districts, but many NJ poor urban districts are relatively small in total enrollment (a cluster of poor urban neighborhoods segregated from their more affluent neighbors). DFG I and J districts are affluent suburban districts. Charters are labeled “R.”

Teacher Evaluation with Value Added Measures

This month, the special issue of the journal Education Finance and Policy on value-added measurement of student outcomes was published. The table of contents is here:

http://www.mitpressjournals.org/toc/edfp/4/4

This is good stuff, authored by leading educational measurement and statistics researchers and economists. These articles provide some important cautionary tales regarding the application of value-added measures of student outcomes for teacher evaluation. Here is a policy brief with a more user friendly summary of some of the content of the special issue:

http://www.wcer.wisc.edu/publications/highlights/v19n3.pdf

Here’s a recent working paper by Jesse Rothstein, Princeton economist who also has an article in the special issue:

http://gsppi.berkeley.edu/faculty/jrothstein/published/rothstein_vam2.pdf

Here’s the concluding sentence of the abstract of Rothstein’s paper:

Results indicate that even the best feasible value added models may be substantially biased, with the magnitude of the bias depending on the amount of information available for use in classroom assignments.

On average, the articles in the special issue do show some promise for using value-added assessment in teacher evaluation, with a number of really important caveats and technical stipulations.

Yes, we need access to more student assessment data with linkages to specific teachers – including the range of teachers with whom middle and secondary students interact (it’s not as simple as linking a single teacher to a group of children). We need access to such data across multiple states and their assessment systems. Scaling properties of data and test noise play a major role in the precision with which one can isolate teacher or classroom effects. We have little or no idea, for example, of the extent to which analyses using North Carolina or Texas assessment data relate to New Jersey assessment data – the statistical properties of those data and their usefulness, or lack thereof, for estimating teacher or classroom effects (unless there are technical papers on NJ tests of which I am unaware).

So, these are the main reasons we need to tear down firewalls – to advance the art, science and statistics of value-added modeling and of school and teacher evaluation, and to uncover potential shortcomings where they exist.

Policymakers and pundits diving in head first on these issues need, quite simply, to chill out, perhaps read the special issue above and heed the advice earlier this year from the National Academy of Sciences and figure out how to do this right if we’re going to do it at all.

Diving in too quickly and doing it wrong will make it that much harder to do it right in the long run and will provide that much more ammunition for resistance.

Hawaii’s Funding Mess: My thoughts on why

It is indeed sad to see the state of public schooling in Hawaii. Teachers are furloughed and students are losing valuable classroom time. The state has chosen to use ARRA stimulus funds to fill budget gaps – as many states have done – but Hawaii has chosen to cut more than fill.

Arguably, Hawaii’s current education funding problems can be traced back to 2003 and a hard-nosed attempt at revenue-neutral education reforms – Fad-based reforms! Not fact-based ones. Off-the-Shelf School Finance solutions, as Doug Elmer and I describe in a recent article. (http://epx.sagepub.com/cgi/content/abstract/23/1/66)

Some historical context is provided here:

http://archives.starbulletin.com/2003/11/25/news/story2.html

Among other things, Hawaii’s leaders were misled in 2003 to believe that Hawaii already spent far more than necessary on its schools and that decentralized governance alone would solve their problems, driving more money to classrooms without ever having to add a dollar of new revenue.

The report by Bruce Cooper and William Ouchi concluded:

  • If Hawai’i were to reach classroom spending of 65 cents out of each education dollar, it would mean an additional $46,250 to spend on each classroom per year. This diversion of money to non-core uses is typical only of very large school districts.[1]
  • The results of our study bear on the consideration by the state of moving to a new system of management, Weighted Student Formula (WSF).

But this was an argument based on shoddy analysis and poorly documented summaries of state spending (actually, state and local total revenue) – comparisons which even the authors of the original report failed to understand. Yet their message stuck with Hawaii policymakers: no more money for schools, just structural (read: superficial) reform.

Oddly enough, the original Cooper/Ouchi report, which chastised Hawaii’s Board of Education for spending way too much to begin with and driving less than 65% to the classroom, never actually provided legitimate analyses supporting the report’s secondary conclusions – promote decentralized governance and implement a weighted student formula with the money you already have! Doug Elmer and I discuss these issues in this article: http://epx.sagepub.com/cgi/content/abstract/23/1/66

This whole series of events provided the governor and legislature in Hawaii the platform to continue starving the state’s education system while placing blame on the State Board of Education for not acting on their reforms, which in their view, would have solved everything. http://www.kpua.net/news.php?id=9232

Hawaii’s education system problems run much deeper than any superficial, off-the-shelf management guru strategy can solve.

Hawaii is among the few states where fewer than 80% of 6-to-16-year-old children attend the public school system (78.8%, according to the American Community Survey, 2005 to 2007). Yes, less than 80% of children in the age groups where most kids attend public schools are in Hawaii’s public schools. And yes, they are the lower-income kids compared to their peers in Hawaii private schools.

That said, Hawaii’s educational effort (the share of gross state product spent on public schools) is relatively average to above average among states. Further, cross-state comparisons of Hawaii’s educational spending provide mixed messages: depending on how it’s measured and how it’s adjusted for regional cost variation, Hawaii’s spending is relatively average to above average (looking at total state and local revenue) or below average (looking at current expenditures per pupil).

During the recent economic downturn, Hawaii’s total state revenue decline has been near the middle (upper middle) of the pack nationally – total state revenue losses from peak to June 2009 (p. 20 and 21), according to this Rockefeller Institute Report (best site for this stuff):

http://www.rockinst.org/pdf/government_finance/state_revenue_report/2009-10-15-SRR_77.pdf

This very recent WSJ article (http://online.wsj.com/article/SB125635093976805443.html) shows how Hawaii’s education funding cuts compare to those in states like California, Florida, Georgia and New Mexico – all of which have experienced much greater declines in total state revenue than Hawaii as of earlier this year – according to the Rockefeller Institute analyses linked above.

Even though Hawaii’s total state revenue is not declining as fast as these other states, Hawaii’s cuts to public schools have been comparable or even greater.

A few years back, Scott Thomas (now at Claremont Graduate School) and I were asked to provide analyses for, and guidance to, the Hawaii Department of Education regarding implementation of the decentralized weighted student funding plan that had been adopted as part of the comprehensive reforms of 2004. To a large extent, our attempts at modeling financial redistribution options across Hawaii’s schools under revenue-neutral assumptions proved to be an exercise in rearranging deck chairs on the Titanic. Our two reports can be found here:

Part I – includes executive summary and conceptual framing of analyses, along with comparisons to other state formulas

http://sites.google.com/site/schoolfinancepolicy/consulting-reports/Hawaii.Part1%262.2006.pdf?attredirects=0&d=1

Part II & III – includes specific analyses of teacher labor markets, distribution of teachers by qualifications across richer and poorer neighborhoods, locations & islands, and concludes with simulations of redistribution options

http://sites.google.com/site/schoolfinancepolicy/consulting-reports/Hawaii.Part3.2006.pdf?attredirects=0&d=1

On page 34 of the second report, Scott Thomas and I explain:

=======  Begin Excerpt

A recent New York Daily News (7/2/06) editorial opined:

“Rather than simply pumping more gas into this broken down car, it’s time to design a much smarter and more effective way to get from Point A to Point B. A reform idea called ‘weighted student funding’ does just that, making intelligent use of the resources we already devote to education. How? Unlike the current system—which funds school districts through an incredibly complicated calculus—weighted student funding ties the money to the student.” (Cooper)

Increasingly, pundits supporting this view of WSF use the analogy of students carrying with them a need-based backpack of funding. Hawai‘i’s BOE and Committee on Weights now recognizes that in a system already constrained by limited resources, targeting sufficient need-based weighting simply costs more, not less or the same amount of money. As noted in our original report, we do not envy the members of committee charged with redistributing limited resources. If, as our estimates suggest, some schools need 40% more than others on the basis of poverty alone (we believe this to be a low estimate), and if this is to be done with no new money added to the system, then others must necessarily give up 40% of their funding.

In other words, assume Johnny and Malaya both need backpacks and currently they both have $10, sufficient to buy an ordinary backpack at Target or Wal-Mart. But, Malaya, by virtue of combined economic disadvantage and limited English proficiency, needs a $20 backpack. Johnny may need only an $8 backpack—the cheapest available (but with less padded shoulder straps than Johnny is used to). Unfortunately, if we redistribute the necessary resources to Malaya, then Johnny is out of luck altogether. If we leave Johnny with enough for the $8 backpack, then Malaya is out of luck. It’s a lose/lose proposition. For both Johnny and Malaya to get the backpack (read education) they need through a WSF, we will likely have to find more money. We ourselves might view this issue differently if it was plainly obvious that Hawai‘i’s schools are flush with funds and simply squandering those funds on unnecessary, frivolous endeavors. We lack any evidence to support this conclusion.

======= End Excerpt
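The excerpt’s backpack example is just zero-sum arithmetic, and it can be sketched in a few lines of code (the dollar amounts are the illustrative ones from the excerpt, not actual funding figures):

```python
# Revenue-neutral redistribution: the pot of money is fixed.
total_funds = 10 + 10  # Johnny's $10 plus Malaya's $10

# What each child actually needs, per the excerpt
needs = {"Johnny": 8, "Malaya": 20}
total_need = sum(needs.values())

# Under revenue neutrality, meeting both needs is impossible:
shortfall = total_need - total_funds
print(shortfall)  # 8 -> the system is $8 short; weighting alone
                  # cannot fix this without new money.
```

However the fixed pot is split, one child comes up short, which is the lose/lose proposition the excerpt describes.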

While I’ve not followed Hawaii closely for the past few years, it would appear that this ship has now begun to sink – widening the gap between the fewer than 80% of children left in public schools (on the ship) in Hawaii and the 20% from first class who had access to life rafts.

I find it most disturbing that much of this mess may have been avoidable had it not been for purely political interests and self-absorbed snake-oil salesmen ready and willing to serve those interests with the simple message that money can’t fix schools.  Off-the-shelf reforms like WSF can!

The reality is that substantive education reform often costs money – sometimes a lot of money, and sometimes a lot more than the amount already being spent. It is not enough to assume that a system has plenty of money just because the total looks like a big number; more detailed analysis is required. You can’t starve a system into reform, especially if the reforms cost money. Unfounded assumptions and arguments that there’s plenty of money, and that money doesn’t matter and may never matter, are not only absurd but potentially very harmful. It would appear that Hawaii is now becoming a stark example of that harm.

You can rebuild the engine and transmission as many times and in as many ways as you want, but if you don’t eventually put gas in the car, it won’t run!

(my apologies for combining sinking ship metaphors, backpacks and cars that don’t run in a single blog post)

Replicating Robert Treat Academy

With little doubt, Robert Treat Academy in Newark is one of those charter schools that is doing well by common outcome measures and likely by even more important measures than state tests. What we know about are the tests. And even if one controls for a variety of factors about student populations, Treat’s test scores are pretty darn good.

Here’s a figure from a model I re-ran the other day (based on older work), using a variety of school, student population and community factors to control for expected differences in student outcomes. Schools above the line outperformed expectations; those below the line fell short of expectations. Charters are in red, and there are roughly equal numbers of traditional publics above and below the line, and of charters above and below it. Treat is one of those above the line.

[Figure: Treat Beat – actual vs. expected performance, charters in red]

So the argument goes: Treat is producing these test scores with much less money, and therefore we should be able to do the same, with similarly less money, across poor urban settings by emulating the Treat model.

I addressed in a previous post how charter schools receive less through the state aid formula than traditional public districts. Again, this should shift somewhat over time, but charters will remain relatively disadvantaged. Using Robert Treat’s IRS 990 for 2007 expenditures (instead of their NJDOE reporting of their expenditure of public charter funding only), Treat shows expenditures per pupil in 2007 around $12,600. I’m still not sure I’ve captured the full expenditure here, because Treat’s IRS 990s show unusually low levels of private contribution for a successful charter school.

That aside, is the Treat miracle replicable across Newark? Or is Treat different in substantive ways that can’t be spread throughout the system? Here are a few numbers that raise concern.

First, as I noted in a previous post, Robert Treat’s student body is only 3.8% special education, in a district with an average of 18.1%. This comes from NJDOE’s special education classification data; in the enrollment files, Treat reports 0%. At 100% additional average expenditure per special education pupil, matching district demographics would raise Treat’s expected spending to $14,868 (1.18 x 12,600 in 2007).

Second, while Robert Treat does show about 62.4% of students qualifying for free (130% poverty level) or reduced-price (185% poverty level) lunch, the free lunch share alone is about 42.9%. That is, Treat’s free-or-reduced share is boosted by children who are more well off among the less well off. Note that the model I used above relied on free & reduced shares, not free alone or the ratio between the two.

By contrast, Newark Public Schools in total has 82% free or reduced and 71% free lunch alone.

Treat also reports less than 1% limited English proficient students while Newark City schools report 8.7%.

It’s one thing for me to try to control for these differences in estimating who does and does not “beat” odds, but yet another to take a model that has been successful under certain circumstances and apply it widely under very different circumstances, at the same cost.

It’s all well and good to cite studies from other cities and states showing that charter schools on average aren’t “cream-skimming” (where most of those comparisons are based either on students’ initial performance or on free + reduced shares), but the reality in this case is that Treat Academy is producing its current level of outcomes, at its current price tag, with a substantively different student population – most notably the absence of children with disabilities. Again, they’re doing well; even in models I’ve run controlling for some of these things, they still stand out, and they should be applauded for their efforts and results.

But, given the demography of the entire student population of Newark in particular, replicating this model may prove difficult. Adding more schools that serve fewer of the poorest children and few or no children with disabilities may be significantly problematic for those schools which then serve the larger shares of both.

Charter Averages Worse than Originally Estimated

Note: The information below is not a comprehensive research study on the relative effectiveness of New Jersey charter schools. Rather, it is a quick summary of average proficiency rates for charters compared to other New Jersey schools by socio-economic strata. Unfortunately, New Jersey charter schools were not part of two major recent multi-state analyses of charter school effectiveness, which can be found here, along with reviews and critiques of those studies: http://www.epicpolicy.org/think-tank/reviews Those studies also found mixed results, with charters in some states slightly outperforming their public school counterparts, in other states performing comparably, and in others performing less well. I have put together this post merely to stimulate conversation about how NJ charter schools are doing and perhaps to encourage additional, more thorough research.

In my original post on NJ charter school performance, charter schools appeared to be performing somewhere between the levels of DFG A and DFG B traditional publics. Here’s one of the graphs to that effect.

[Figure: % proficient for all tested students, by DFG]

Note that the charter line – R – falls between the DFG A (poorest traditional publics) and DFG B lines. But this analysis includes all tested students. While I expected that children with disabilities were underrepresented in charter schools, I had no idea just how underrepresented until I took a look, here: https://schoolfinance101.com/wp-content/uploads/2009/11/charter-special-ed.jpg

For example, Robert Treat Academy has 3.8% and North Star Academy 7.8% children with disabilities, in a district that had 18.1% in 2007. And those rates are higher than at many charters, which actually serve 0%.

So, correcting for this problem by looking only at General Education students, the graph above becomes the graph below:

updated charter rel performance

In this graph, the Charter line maps almost precisely with that of the DFG A line. That is, the slightly higher performance in the first graph is almost entirely a function of the fact that NJ Charters simply don’t serve children with disabilities and don’t have them in their test taking pool. My apologies for this apparently glaring omission.

The biggest change to my analysis, however, is in the relative probability that a student attends a tested grade level where fewer than 40% of students are proficient or higher. Making the above correction leads to the finding that a child in a charter school is 35% more likely than a student in a DFG A traditional school to be in a tested grade level where fewer than 40% of general education students scored proficient or higher.

Here’s the logistic regression, weighted for number of test takers in grade level and on test (general education only), based on the 2008 report card data:

[Table: Logistic regression of low-performance grade level (<40% proficient or advanced)]

DFG A is the baseline comparison group. An odds ratio greater than 1.0 indicates a greater likelihood of being in a grade level with fewer than 40% proficient or advanced than in a traditional DFG A school. Only charters have a greater likelihood – and a much greater one, at 36% greater. Likelihoods vary dramatically across the different tests and subject areas. Apparently, 6th grade tests have cut scores aligned such that many more students do poorly on them. I don’t think it’s just that 6th graders get dumb for a year; newer tests take some tweaking. Note the dip in previous graphs. Note also that in affluent communities (GH through J), there is statistically no chance of being in a low-performing grade level.
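For readers less used to logistic regression output: an odds ratio is just the exponentiated coefficient, and an odds ratio of about 1.36 is what “36% greater” means. A quick sketch of the conversion (the coefficient below is a hypothetical value chosen to produce an odds ratio near 1.36, not the actual estimate from my model):

```python
import math

def odds_ratio(logit_coefficient):
    """Convert a logistic regression coefficient to an odds ratio."""
    return math.exp(logit_coefficient)

# Hypothetical coefficient on the charter dummy (DFG A is the
# omitted baseline category in the regression).
beta_charter = 0.307

print(round(odds_ratio(beta_charter), 2))  # 1.36 -> roughly 36%
# greater odds of being in a low-performing grade level than a
# DFG A school.
```

Note that “36% greater odds” is a statement about odds, not about the raw probability, which is why the regression output is reported this way.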

Here’s a link to the School Reports 2008 Data:

http://education.state.nj.us/rc/rc08/database/nj_rc08.xls

Please – take your own stab at this. I’ve been running these quickly. My Stata data are here.

New Update: Here’s my last shot at it for now. I’ve got the odds for charters down to about a 25% greater chance than DFG A schools of being in a grade level where fewer than 40% were proficient or higher. Unfortunately, with these data, poverty rates among test takers were only calculable at the district level (and for whole charter schools), not at the school level (one must use the whole-school enrollment data for that). Also, NJDOE continues its habit of not identifying the specific county locations of charters in its coding system. I have a bridge file somewhere, constructed by zip code, but only for charters through 2006. I may revisit. Anyway, here’s the logistic regression:

[Table: Updated logistic regression results]

Ah the perils of goofing around with data too quickly/on the fly. Fun though.

NJ School Funding Suburban Taxpayer Scam?

I hate wasting so much time countering completely absurd claims, like those that spill out on the E3 Cartel commercials. This is a short reply this time. At the end of one of the commercials, the spokesperson slips in the claim that not only are we wasting a ton of money on our low graduation rates in poor urban schools (I discuss this claim here: https://schoolfinance101.wordpress.com/2009/10/31/cartel-recap/), but this whole inefficient mess is a “suburban taxpayer scam.” Yep, suburbanites (like myself) are being dreadfully over-taxed and our hard earned money is being thrown down the rat-hole. We don’t get any of it back.

A simple question to answer here is whether the property tax effort in suburban communities (however we are supposed to define “suburban”) is that much greater than in “urban” communities. An appropriate way to measure this is to calculate the percent of income paid in property taxes.
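That measure is a simple ratio, sketched below with made-up numbers for two hypothetical households (not actual NJ data). The point is that a much larger tax bill can represent exactly the same effort:

```python
def tax_effort(property_tax_bill, household_income):
    """Share of household income paid in property taxes."""
    return property_tax_bill / household_income

# Hypothetical households, for illustration only:
urban = tax_effort(4_000, 50_000)       # 8% of income
suburban = tax_effort(12_000, 150_000)  # also 8% of income

# The suburban bill is three times larger, but the effort is identical.
print(urban, suburban)
```

Comparing raw tax bills, as the commercial implicitly does, confuses the size of the bill with the burden of paying it.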

Here’s a quick snapshot of tax effort in Essex County by income level and in Monmouth county by income level. These data are taken from http://www.nj.com/news/bythenumbers/, and the data are generally from 2005. Most “Abbott” funding to school districts had scaled up between 1998 and 2005.

[Figure: Essex County tax effort by income level]

Hmmm… no systematic pattern here. Yep, some pretty big differences, but no systematic pattern between poorer and wealthier communities.

[Figure: Monmouth County tax effort by income level]

As it turns out, tax effort in Monmouth declines systematically as homeowner income increases. Perhaps this is the “urban tax scam” not suburban one?

Yes, the property tax bill in an affluent suburban community is larger – because it is the tax bill on a more expensive home! (Should I really have to say that?) Yes, low-property-value, low-income communities receive higher rates of state subsidy through the state aid formula for schools. That’s generally how aid equalization formulas work. And yes, New Jersey’s aid is targeted to higher-need districts, above and beyond typical equalization (but only since 1998-2003).

Let’s get this straight. If the idea of the funding formula were to send back to communities and school districts exactly the amount their residents submitted to state coffers, then why the heck would we collect it to begin with? That would be a particularly foolish exercise, since it costs money to process the tax revenues and send them back. That’s how taxes work – whether collected at the municipal level, benefiting the people across the street whose house may have a lower taxable value and a proportionately smaller tax bill than yours, or at the state level. For those who don’t quite understand this, I recommend the Schoolhouse Rock tune about the Taxman. Pretty good stuff!

In a previous post, I also explain how local media in NJ has distorted comparisons of New Jersey property taxes with other states – https://schoolfinance101.wordpress.com/2009/10/03/should-nj-really-try-to-be-like-de-md-mo-ga-wa/

NJ Charters & Disability Rates

Here’s a quick snapshot of the percent of children classified as having disabilities in charter schools and in traditional public schools in Essex County. These figures add some context to the spending deficit figures in my previous post. Yes, charters receive a reduced operating aid subsidy, and charters are most disadvantaged financially by not receiving support for facilities – having to draw on operating funds for facility leases or to rely on substantial private support. But this piece – special education populations – cuts the other way. Traditional public school districts have about 14% to 18% children with disabilities, whose services typically cost about 90% to 110% above “average” expenditure (to provide typical – not necessarily adequate or great – special education services). For example, if 16% of children qualified as disabled and each required additional per-pupil expenditure of 100%, these students would add 16% to district operating costs – $1,920 on top of $12,000, for an average per-pupil cost of $13,920. That is, just to provide average/typical special education services, the per-pupil cost in a district with 16% special ed would be 16% above the per-pupil cost of a district with 0% special ed. In other words, if a district with 16% special ed spends $13,920 and another with 0% special ed spends $12,000, those spending figures are comparable – not vastly different.
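The arithmetic above reduces to one formula: comparable per-pupil cost = base cost x (1 + special ed share x additional-cost weight). A minimal sketch, using the illustrative dollar figures from this paragraph (not actual district data):

```python
def comparable_cost(base_per_pupil, sped_share, extra_weight=1.0):
    """Per-pupil cost adjusted for special education enrollment.

    sped_share   -- fraction of students classified special ed (e.g. 0.16)
    extra_weight -- additional cost per special ed pupil as a fraction of
                    average cost (1.0 means 100% above average).
    """
    return base_per_pupil * (1 + sped_share * extra_weight)

# The paragraph's example: 16% special ed at 100% additional cost.
district = comparable_cost(12_000, 0.16)  # about $13,920
charter = comparable_cost(12_000, 0.0)    # $12,000

# A district spending ~$13,920 with 16% special ed is spending the same
# per weighted pupil as a 0%-special-ed school spending $12,000.
print(round(district), round(charter))
```

The same adjustment is why raw per-pupil spending comparisons between charters and traditional districts can mislead.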

Here are the special ed rates among Essex county districts and charters:

https://schoolfinance101.com/wp-content/uploads/2009/11/charter-special-ed.jpg

Note: A knowledgeable reader has informed me that the “0” value for Greater Newark Charter is actually “missing data,” for that year and has assured me that Greater Newark Charter does indeed enroll children with disabilities. At some point, I may get around to updating these analyses. Other “0” values may also represent missing data. But, very low, actual reported rates likely do not.

[Figure: Charter school special education classification]

A Must Read – Mapping State Proficiency Standards

Call me crazy, but I’d have to say that one of my favorite publications of all time is a National Center for Education Statistics report mapping state standards onto NAEP, allowing comparison of where state proficiency benchmarks align with NAEP scores. I’ve likely provided links to that report more than a few times in my blog. Well, they’ve done it again. The new MAPPING STATE PROFICIENCY STANDARDS report is out, and you can find it here:

http://nces.ed.gov/nationsreportcard/pubs/studies/2010456.asp