Blog

6 Things I’m Still Waiting for in 2012 (and likely will be for some time!)

I start this new year with reflections on some unfinished business from 2011. Here are a few bits of information I anxiously await in 2012. Some are likely within reach. Others, well, not so much.

  1. A thoroughly documented, rigorously vetted study by Harvard economist Roland Fryer that actually identifies and breaks out, in sufficient detail and with appropriate rigor, the costs of delivering in whole and in part (and the costs of bringing to scale) no-excuses curricula/models/strategies and comprehensive wrap-around services.
  2. The long-promised rigorous New Jersey charter school evaluation – or, even better, improved student-level data in New Jersey such that researchers can actually conduct reasonable analyses of charter schooling, and of reforms/strategies more generally, across New Jersey public & charter schools.
  3. That long list of all of those other average- to below-average-paying professions – professions other than teaching – where compensation is entirely merit based, resting substantially on (noisy) multiple regression estimates of employee effectiveness determined by the behavior of children as young as 8 years old [generously assuming 3rd grade test scores to represent the lower end of the value-added grade range], AND where the top college graduates just can’t wait to sign up!
  4. That long list of highly successful market-based charter and/or independent private schools – schools not bound by the shackles of union-negotiated agreements – where teacher compensation is not strongly predicted by (or directly a function of) experience and/or academic credentials, AND where the top college graduates just can’t wait to sign up (or stick around)! (see also: https://schoolfinance101.wordpress.com/2010/10/09/the-research-question-that-wasn%E2%80%99t-asked/)
  5. Evidence that there really is enough money tied up in (wasted on) cheerleading and ceramics to be reallocated to provide sufficient class size reduction in core content areas and increased classroom teacher wages (toward improving teacher quality) to make substantive improvements to the quality of high poverty schools!
  6. Evidence that the differences in student outcomes between high-performing affluent suburban public school districts and lower-performing poor urban and inner-urban-fringe school districts are somehow explained by substantial differences in personnel policies, merit-based teacher compensation, teacher benefits and negotiated agreements, as opposed to substantive differences in family backgrounds and available resources.

For elaboration on a few of these issues, see my recent AP interview with Geoff Mulvihill: http://www.mycentraljersey.com/article/20120101/NJNEWS10/301010003

And so the new year of education policy research and blogging begins. A year in which I, myself, will be engaged in additional, more extensive analyses of the finances of charter schools – revenue-raising and expenditure patterns by location and by network affiliation. A year in which I also expect to be digging deeper into the distribution and effects of cuts in state aid and funding constraints on school and district resource allocation, and exploring across multiple states (and districts and schools within states) the causes and consequences of inequities and inadequacies in public education funding.

Taxpayer rights under New Jersey’s current Education Policy Agenda

In light of recent controversy over the role of state-appointed “emergency” managers in Michigan, I’ve been pondering the state of taxpayer rights under the current education policy agenda(s) in New Jersey. For example:

  • The state of New Jersey seems determined to maintain its control over Newark Public Schools, which, in effect, at least partially (if not almost entirely) negates the voice of local taxpayers in decisions over the operations of their schools.  http://www.nytimes.com/2011/12/12/education/newark-school-district-in-debate-over-state-control.html
  • The State of New Jersey continues to maintain a charter authorization law which permits the state department of education to grant a charter to a school to operate in any district, and to draw resources from that district, including resources derived from local property taxes. But local taxpayers have no authority over the distribution of local tax dollars to charter schools authorized by the state.

By contrast, in Georgia, the state constitution grants authority to establish and maintain public schools within their limits exclusively to county and area boards of education (http://www.sos.ga.gov/elections/GAConstitution.pdf, page 60).  So, when the Georgia legislature approved a charter law granting authority to a state entity to approve charters (and draw on local resources), county boards of education challenged that provision in court and won.

One reasonable summary can be found here: http://www.accessnorthga.com/detail.php?n=238715, see also: http://www.earlycountynews.com/news/2011-05-18/Front_Page/Court_ruling_leaves_charter_schools_in_limbo.html

  • The legislature continues to debate the adoption of a Tuition Tax Credit act, known as the Opportunity Scholarship Act. Tuition tax credits (or quasi-vouchers) create an indirect tax subsidy of private schooling – primarily religious private schooling, in practice and in likelihood, in New Jersey – by providing full tax credits to corporations that gift money to a state-approved entity (a voucher governing body). Thus, a hole of “X” is created in the state budget. That hole is paid for by the fact that the state no longer has to allocate state aid (≥ X) to local public districts whose students accept scholarships to attend private schools instead. Here’s the taxpayer twist. If the state were to adopt a direct subsidy program (a voucher), providing state tax dollars to religious institutions, citizen taxpayers might be able to bring a legal challenge to the use of their tax dollars on religious institutions. They might lose that challenge, as in the Cleveland voucher model, which the US Supreme Court determined to be religion neutral because vouchers were provided to parents, who were then able to choose religious or non-religious options, as well as to choose whether to take a voucher at all. So, even though nearly all private school alternatives in Cleveland were religious, the system, by its design, was determined neutral. NJ taxpayers might, for example, argue that the legislative choice to include an exclusively religious community among the eligible locations was not religion neutral (different from Cleveland). BUT THE KICKER WITH A TUITION TAX CREDIT PROGRAM – even if it would pass constitutional muster regarding the establishment clause – IS THAT TAXPAYERS DON’T EVEN HAVE STANDING TO CHALLENGE ITS CONSTITUTIONALITY IN COURT. NO TAXPAYER RIGHT AT ALL!
(And we have yet to figure out a party that would have standing to challenge such a model.) That’s right: under this indirect subsidy approach, NJ taxpayers likely would not have the right – the legal standing – to challenge NJOSA even if the legislature decided to operate the program exclusively for Lakewood (we’d have to see how that would play out).
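The fiscal mechanics described above are easier to see with numbers. Here is a minimal sketch; every dollar amount and share below is invented for arithmetic’s sake, not drawn from any actual NJOSA bill:

```python
# Hypothetical illustration of the tuition tax credit mechanism described above.
credit_donations = 10_000_000          # corporate gifts, fully credited against taxes
state_revenue_hole = credit_donations  # full credit: revenue falls dollar for dollar ("X")

scholarship = 8_000                    # per-student scholarship amount (hypothetical)
state_aid_per_pupil = 11_000           # aid the state no longer owes per departing student
students = state_revenue_hole // scholarship           # 1,250 scholarships funded

# If every recipient leaves a public school, foregone aid can exceed the hole (>= X):
aid_savings = students * state_aid_per_pupil
net_state_cost = state_revenue_hole - aid_savings      # negative: state comes out ahead

# But recipients already enrolled in private schools generate no aid savings at all,
# which is exactly why eligibility rules for already-enrolled students matter fiscally:
already_private_share = 0.6                            # hypothetical share of recipients
switchers = int(students * (1 - already_private_share))
net_cost_if_mixed = state_revenue_hole - switchers * state_aid_per_pupil
```

The point of the sketch is only the sign flip: the same credit that more than pays for itself when recipients leave public schools becomes a genuine budget hole when many of them were never enrolled there.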

Do we see a theme emerging here?

I tend to be somewhat ambivalent about deference to local control arguments. The more locally we allow our education systems to be operated and financed, the greater the likelihood of substantial inequities, especially given the economically and racially segregated structure of housing stock & neighborhoods (which did not occur by chance!). Clearly, there’s a time and place for state intervention, including state intervention in local tax policy. After all, as I’ve explained previously on this blog, local tax authority often exists only as a function of state policy (often in state constitutions). Unfortunately, what I’ve realized over the years is that state governments have refined their own art of taking policies intended to improve equity (greater state financing) and using those policies to reinforce inequities as great as those that might exist without state intervention.

In fact, in our school funding fairness report we found absolutely no relationship between the share of revenues coming from state as opposed to local sources and increased equity (Figure 15). This is somewhat disheartening, and has me really questioning the optimal governance arrangement for achieving the appropriate balance between liberty and equity (two concepts often in tension with one another in policy design).

For now, I’m stumped, but I stick by my basic assumption that an equitable distribution of sufficient levels of financial resources is a necessary underlying condition for achieving an education system that is both equitable and excellent (regardless of the balance of public-charter-private schooling in the mix). Further, I still believe that state courts (elected or appointed) have – and should use where necessary – the authority to interpret the equity and adequacy requirements of state constitutions pertaining specifically to education (and the financing of schools), but I struggle with the best methods for managing the aftermath of those decisions. Either representative majority rule or direct tyranny of the majority can, and does, lead to policies that can only be rectified by a (quasi-)independent judiciary. But I digress.

I am, at the very least, concerned at the apparent disregard for citizen/voter/taxpayer interests that seems to be emerging under New Jersey education policy.

Mapping the Potential Distribution of NJ Opportunity Scholarships

A while back, when the NJ Opportunity Scholarship Act was a hotter topic, I wrote a post explaining how, depending on which districts were included in NJOSA, how family income qualifications were set, and whether students already enrolled in private schools were eligible, the largest share of scholarships could actually end up going to Orthodox schools in Lakewood. After all, Lakewood is home to the largest private-schooled population in the state. Not only that, most families in Lakewood whose children attend the Orthodox schools actually have incomes below the 250% poverty threshold.

Even outside of Lakewood, in lower income urban communities throughout the state, private school alternatives are sparse, and primarily religious.

Here’s the lay of the land. These data are from the NCES Private School Universe Survey of 2007-08, in addition to U.S. Census Data (from http://www.ipums.org).

First, here are the locations (with circle sizes indicating enrollment size), of NJ private schools – religious and non-religious:

This map shows some concentrations of private schools in the tightly packed urban northern NJ cities, as well as a large cluster in Lakewood, and some relatively large private schools in Somerset and Morris counties. The blue numbers (on red-bounded areas) indicate the number of 5- to 17-year-olds enrolled in private K-12 schools in each Census Public Use Microdata Area (from http://www.ipums.org). For example, the Lakewood PUMA has 17,260 children enrolled in private schools, compared to more typical counts of 3,000 to 5,000 for other PUMAs statewide.

Take away the religious schools and here’s what you’ve got:

Much of the concentration of private schooling in more densely populated urban areas evaporates when religious schools are excluded. Nearly all private schools in and around Lakewood are eliminated. The remaining large private schools which are non-religious tend to be in more affluent areas such as Princeton, or in Morris and Somerset counties, and some in Bergen and suburban areas of Essex County.

Here’s the Newark area including all private schools:

Again, blue numbers are total private-school-enrolled students. Circles are schools, with larger schools being larger circles. The Newark area has about 2,200 children in private schools in one PUMA and 3,400 (covered by the key) in the other, and about 1,000 (red number) who are from families below the 250% income threshold for poverty in one PUMA and 2,200 in the other (partly covered by the key).

Now here it is excluding religious schools:

Clearly, without immediate market adjustment, few non-religious choices exist for potential Newark OSA recipients. More on this at the end of the post.

Here’s Lakewood, with religious schools:

Note that Lakewood has about 17,000 children in private schools in the PUMA. Over 10,000 of those children are from families below the 250% income threshold for poverty. That’s more than double the Newark amount (1,022 + 2,189) and more than in any other part of the state.

Here’s Lakewood’s private school market after excluding religious schools:

Not much left, eh?

In my previous post (https://schoolfinance101.wordpress.com/2010/05/19/njosa-the-lakewood-effect/), I explained the financial implications of including Lakewood in the mix for NJOSA while using the 250% income threshold and permitting access to scholarships for students already enrolled in private schools. My estimates showed that the Lakewood community could receive around $67 million in OSA scholarships, depending on the scholarship funding level/availability.

I do not know the current parameters of NJOSA.

In any case, even beyond Lakewood, the options for private school access are hardly religion neutral. Yes, the choice, in and of itself, is neutral of religion. Parents can take the scholarship and use it at a religious school or not (which is essentially all that matters for passing muster in court). But in most cases the choice set does not provide any balance of options. Of course, the other reason this program is a non-issue in court is that Opportunity Scholarship programs provide a tax credit to businesses to hand over contributions to an independent entity which manages the scholarships, avoiding any direct government subsidy for religious education (replacing it with a convoluted, indirect mechanism). Last April, the U.S. Supreme Court determined that taxpayers have no standing to challenge such mechanisms – essentially tossing the case on the technicality that taxpayers cannot show they are harmed by the granting of a tax credit to corporations, as opposed to the direct expenditure of their own tax dollars. So, to cut to the chase, there aren’t many, if any, legal options here – not only because the choice is assumed neutral even though the choices aren’t, but because no one has legal standing to challenge the convoluted financial subsidy structure of tuition tax credits.

Rather, these are substantive policy concerns.

Further, as I’ve shown in previous posts, the non-religious private schools in NJ tend to have per pupil expenses far above and beyond levels considered in NJOSA (https://schoolfinance101.wordpress.com/2010/03/23/would-8000-scholarships-help-sustain-nj-private-schools/), reducing the likelihood that these schools would open significant numbers of slots to scholarship recipients.

I’ve also pointed out that the oft stated policy objective of using NJOSA to help sustain NJ’s ailing private school sector is misguided: https://schoolfinance101.wordpress.com/2010/07/21/private-schools-public-education-policy-in-new-jersey/ In short, NJ’s private school sector isn’t necessarily ailing.

While it is possible that providing opportunity scholarships could lead to an expanded “lower-cost” non-religious stratum of private schools in NJ urban communities, it seems more than likely that individuals interested in testing these waters by starting a new school would instead opt to apply to establish a charter school and receive the higher per-pupil subsidy. Even the leading charter schools in the NY metropolitan area, including NJ, have learned that access to substantial external philanthropy is a necessity for providing a high-quality education, and start-up private schools receiving smaller subsidies would have much more ground to make up. And that’s not easy as a start-up.

As such, I’m skeptical that the choice set for OSA recipients would become significantly more neutral over time.  This is especially true where charters are increasingly able to cast themselves as nearly religious by focusing on culture/language and establishing in religiously homogeneous communities. Further, charters remain able to do so by authority of the State Department of Education only, and draw on local tax dollars regardless of local taxpayer preferences.

======

More on the diminishing rights of taxpayers in a future post. Note the interesting parallels between the lack of taxpayer standing to challenge tuition tax credits for religious schools, and the lack of taxpayer control over the redistribution of local property tax revenues to state-authorized charter schools (except in states where that authority is explicitly granted to local officials/governments, like Georgia). In the first, the state creates a convoluted mechanism which reduces state revenue, requiring the redistribution of existing taxpayer funds, but in such a way as to eliminate the right of taxpayers to challenge that redistribution (even though the mechanism is constructed in a way that channels tax-exempt funds primarily to religious organizations). In the second, the state grants authority for an independent entity to set up shop in your neighborhood, and to the extent that entity (quasi-religious or not) can attract customers (students), it can also access your local tax dollars to subsidize its operations, without your consent. So much more to explore here. I’m working on a handful of related law review articles with legal scholar Preston Green of Penn State.

Dobbie & Fryer’s NYC charter study provides no meaningful evidence about class size & per pupil spending

So, I’ve seen on more than a few occasions these last few weeks references to the recent Dobbie and Fryer article on NYC charter schools as the latest evidence that money doesn’t matter in schools – that costly stuff like class size, or overall measures of total per-pupil expenditure, is simply unimportant and can easily be replaced/substituted with no-cost alternatives like those employed in no-excuses charter schools (high expectations, tutoring, additional time, and wrap-around services). I’ll set aside the issue that many of these supposedly more effective alternatives do, in fact, have cost implications. Instead, I’ll focus my critique on whether this Dobbie/Fryer study provides any substantive evidence that money doesn’t matter – either broadly, or in the narrower context of looking specifically at NYC charter schools.

Now, in many cases, it’s really just the media spin on a study that gets out of hand – just the media and politically motivated tweeters who dig for the lede otherwise buried by the overly cautious researcher. Not so much in this case. Dobbie and Fryer actually make this bold statement… and make it several times and in several forms throughout their paper – as if they’re really on to something.

We find that traditionally collected input measures — class size, per pupil expenditure, the fraction of teachers with no certification, and the fraction of teachers with an advanced degree — are not correlated with school effectiveness.

http://www.nber.org/tmp/65800-w17632.pdf

Now, I would generally treat the work of such respected researchers with great caution here on my blog. Yes, my readers know well that I do go after shoddy think tank work with little reservation. But, when the work is from a respected source, like here, or here, I do tend to be more reserved and more cautious, often second guessing whether my critique is legit.

But I’ll be honest here: I find this Dobbie/Fryer piece infuriating on many levels, some of which are simply, entirely inexcusable (and, as noted below, this is the third in a row, so my patience is running thin). The basic structure of their study, as far as I can tell from the disturbingly sparse documentation in their working paper, is this: they conducted a survey of NYC charter schools to gather information on practices (the no-excuses stuff), expenditures, and class size. Then they evaluated the correlations between the individual factors (and an aggregate index of them), among both traditional and no-excuses measures, and alternative forms of their charter effect estimates.

Let’s be really clear here – simply testing the correlation between spending and an outcome measure – comparing higher and lower spending schools and their outcomes to see if the higher spending schools have higher effectiveness measures – WOULD TELL US LITTLE OR NOTHING, EVEN IF THE DATA WERE ACCURATE, PRECISE AND WELL DOCUMENTED. Which, by the way, they are not.

Here’s what Dobbie and Fryer give us for descriptive information on their resource measures:

FIGURE 1: D/F Descriptives

And here’s the evidence regarding the correlation between traditional resources and outcomes:

FIGURE 2: D/F Correlations (they include another table, #6 w/Lottery estimates)

So, why would it be problematic to look for a simple correlation between charter spending (“per pupil expenditure”) levels and school effectiveness measures?

First, NYC charter schools are an eclectic mix of very small to small schools (nothing medium or large, really) at various stages of development – adding grade levels from year to year, adding schools, and growing to scale over time. Some are there; others are working their way there. And economies of scale have a substantial effect on per-pupil spending. So too might other start-up costs, which may not translate into same-year effectiveness measures.

Here’s a link to my detailed analysis of NYC charter school spending and the complexities of even figuring out what they spend, comparing audited annual financial report data and IRS filings: http://nepc.colorado.edu/files/NEPC-NYCharter-Baker-Ferris.pdf (as opposed to saying, hey, what do you spend anyway?)

As it turns out, school size and grade range were the only two factors I (along with Richard Ferris) found to be reasonable predictors of NYC charter school per-pupil spending (note that the caption on this chart in the original report is wrong – the chart relates to predictors of total per-pupil spending, not facilities spending alone). At the very least, any respectable analysis of the relationship between spending and effectiveness must account for grade range/level and economies of scale. It should probably also account for student population characteristics (which may bias effectiveness estimates). But the sample sizes are also pretty darn small when trying to evaluate resource effects across NYC charter schools of similar grade level/range. That alone will leave you finding nothing.
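The scale problem can be illustrated with simulated data. In the sketch below, every number is invented, chosen only to mimic the small-school pattern described above: per-pupil spending is driven mostly by fixed costs spread over enrollment, while (by assumption) effectiveness responds only to discretionary spending. The raw spending–effectiveness correlation is badly attenuated, and residualizing on the scale driver recovers it:

```python
import random
import statistics

random.seed(0)

def pearson(x, y):
    """Sample Pearson correlation."""
    mx, my = statistics.mean(x), statistics.mean(y)
    c = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return c / ((len(x) - 1) * statistics.stdev(x) * statistics.stdev(y))

def resid(y, x):
    """Residuals from a simple OLS regression of y on x."""
    mx, my = statistics.mean(x), statistics.mean(y)
    b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return [c - (my + b * (a - mx)) for a, c in zip(x, y)]

n = 200
enroll = [random.uniform(100, 600) for _ in range(n)]          # school size
scale_cost = [2_000_000 / e for e in enroll]                   # fixed costs per pupil
discretionary = [random.gauss(4000, 1000) for _ in range(n)]   # instructional spending
spend = [s + d for s, d in zip(scale_cost, discretionary)]     # total per-pupil spending

# Effectiveness responds (by construction) only to discretionary spending.
effect = [0.0001 * d + random.gauss(0, 0.05) for d in discretionary]

# Raw correlation of total spending with effectiveness: heavily attenuated,
# because scale-driven cost variation swamps the discretionary signal.
r_raw = pearson(spend, effect)

# Residualize both variables on 1/enrollment (the scale driver), then correlate.
inv = [1 / e for e in enroll]
r_partial = pearson(resid(spend, inv), resid(effect, inv))
```

None of this says what the true relationship in NYC charters is – only that a raw correlation in data with this structure is uninformative, which is precisely the objection to treating the table’s correlations as evidence.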

FIGURE 3:  B/F Regression of factors influencing NYC charter spending

Further, NYC charter schools have different access to facilities. Some are provided NYC public school facilities (through co-location), while others are not. Having a facility provided can save a NYC charter school over $2,500 per pupil per year (to be put toward other things). Dobbie and Fryer provide no documentation regarding whether these differences are accounted for in their mythical per-pupil expenditure figure.

It turns out that because of the various structures, grade ranges and developmental stages of NYC charters, it’s hard even to discern a relationship between per-pupil spending and class size, even after trying to account for the facilities cost differential (typically, you’d see a pattern in this type of graph, with class size declining as per-pupil spending increases).

FIGURE 4: B/F $ and Class Size

Some more detail in NYC KIPP spending here: https://schoolfinance101.com/wp-content/uploads/2011/10/slide81.jpg

The reality is that the wacky and large expenditure variations that exist across NYC charter schools don’t seem to be correlated with much of anything individually – but they are correlated with school size and grade range (r-squared between .5 and .6 for those).

Capturing an accurate and precise representation of NYC charter school spending is messy. Not even trying is embarrassing and inexcusable. 

Even worse, and most frustrating about this particular paper by Dobbie and Fryer, is the absurd lack of documentation – or any real descriptives – on the measures they used. Instead, Dobbie and Fryer present a very limited form of descriptive on per-pupil spending (above). We have no idea what Dobbie and Fryer believe the actual range of per-pupil spending across their sample of schools to be. Rather, we have only a measure of how far above the mean the high-expenditure charters are (I don’t mind standardized measures, but I like to see what I’m dealing with first!). This information is presumably drawn from their survey – with no definition whatsoever of what is even meant by “per pupil expenditures” [which is not always a simple question]. Did the costs of wrap-around services in Harlem Children’s Zone count? Dobbie and Fryer’s earlier back-of-the-napkin estimates of HCZ wrap-around costs (see below) fall well short of the revenue we identified for HCZ in our report by actually looking at their financial statements.

Even if Dobbie and Fryer did find, in appropriately documented analyses, using more accurate/precise and appropriate spending measures, that spending was not correlated with charter effectiveness estimates in NYC, this would be a very limited finding.

The finding is even more limited in light of the fact that the supposedly resource-neutral strategies used in their “no excuses” schools aren’t resource neutral at all. Rather, the cost implications of these resource-intensive strategies are simply not carefully explored (similar to the unsatisfying lack of real cost analysis in Fryer’s recent Houston Apollo program study – again, no documentation at all!).

Dobbie and Fryer’s NYC charter study adds nothing to the larger debate on the importance of class size, or financial resources toward improving school quality and/or student outcomes. A much richer, more rigorous literature on this topic already exists, and I will provide a thorough review of that literature at a future point in time.

Tip – surveys of interested parties are not how to get information on finances. Audited financial statements are a better starting point, and two forms of such data are available for nearly all NYC charter schools. Further, where specific programs/services are involved, a thorough resource cost analysis (the ingredients method) is warranted. This is School Finance (or Econ of Ed) 101.
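For readers unfamiliar with the ingredients method, here is a minimal sketch of the idea: enumerate each resource an intervention uses, price it, and sum – rather than asking participants what they think they spend. The resource names, quantities, and prices below are entirely hypothetical:

```python
# A minimal ingredients-method sketch. All line items are hypothetical.
ingredients = [
    # (resource, quantity per school, unit price in dollars)
    ("tutors (FTE)",              6, 55_000),
    ("extended-day teacher pay", 30,  6_000),
    ("data/assessment system",    1, 40_000),
    ("program coordinator",       1, 80_000),
]

# Total cost is just the priced-out sum of ingredients.
total_cost = sum(qty * price for _, qty, price in ingredients)

enrollment = 500                      # hypothetical school enrollment
per_pupil_cost = total_cost / enrollment
```

The contrast with a survey is the point: each line item is independently priceable and auditable against financial statements, whereas a survey respondent’s “per pupil expenditure” may or may not include any of them.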

Other examples of sloppy, poorly documented cost/benefit inferences from recent Dobbie and Fryer papers:

Here’s a segment identified as cost-benefit analysis from Dobbie and Fryer’s paper on Harlem Children’s Zone:

 The total per-pupil costs of the HCZ public charter schools can be calculated with relative ease. The New York Department of Education provided every charter school, including the Promise Academy, $12,443 per pupil in 2008-2009. HCZ estimates that they added an additional $4,657 per-pupil for in school costs and approximately $2,172 per pupil for after-school and “wrap-around” programs. This implies that HCZ spends $19,272 per pupil. To put this in perspective, the median school district in New York State spent $16,171 per pupil in 2006, and the district at the 95th percentile cutpoint spent $33,521 per pupil (Zhou and Johnson, 2008).

http://www.economics.harvard.edu/files/faculty/21_HCZ_Nov2009_NBERwkgpaper.pdf

This paper on Harlem Children’s Zone makes no attempt to validate the $4,657 figure and provides no documentation from financial reports to reconcile it. In our NEPC report we discuss the range of likely expenditures for HCZ, based on mining actual IRS filings and audited financial reports; $4,657 would fall below our low estimates (though for two years earlier). Further, it is absurd to compare HCZ spending to New York State median spending without any consideration for variations in regional costs. It is far more reasonable to compare the relevant spending components to those of similar schools within NYC serving similar student populations. Their statement about perspective puts absolutely nothing into perspective – or at least not into any relevant perspective.

Here’s all of the information provided in the Apollo 20 no excuses Houston public schools study:

The experiment’s cost of roughly $2,042 per student – 22 percent of the average per pupil expenditure and similar to the costs of “No Excuses” charters – could seem daunting to a cash strapped district, but taking the treatment effects at face value, this implies a return on that investment of over 20 percent.

http://www.hisd.org/HISDConnectEnglish/Images/Apollo/apollo20whitepaper.pdf

The $2,042 figure is not documented at all. This is where a resource cost analysis would be appropriate: identifying the various resources that go into providing these services and the input prices of those resources, and determining the total costs. Further, no source is cited or documented anywhere in the paper showing that no-excuses charters spend about the same. Where? When? And $2,042 per pupil in Texas is one thing – it’s something entirely different in NY. This stuff isn’t trivial, and such omissions are shameful and inexcusable.
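Even the sparse numbers the white paper does report can be interrogated a bit. A quick back-of-the-envelope sketch; the regional cost index values below are hypothetical placeholders, not figures from any actual index:

```python
experiment_cost = 2042   # reported per-student cost of the Apollo 20 experiment
stated_share = 0.22      # "22 percent of the average per pupil expenditure"

# Back out the average per-pupil expenditure the 22% figure implies
# (a figure the paper itself never states or documents).
implied_avg_ppe = experiment_cost / stated_share

# A dollar in Houston is not a dollar in New York. Deflating by a regional
# cost index (hypothetical values) shows how much the comparison can move:
cost_index = {"Houston": 0.95, "New York City": 1.35}
real_cost = {city: experiment_cost / idx for city, idx in cost_index.items()}
```

Nothing here validates or invalidates the $2,042 figure; it only shows how little the reported numbers pin down without documentation of what they cover and where.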

The Comparability Distraction & the Real Funding Equity Issue

Yesterday, the US Department of Education released a new report addressing how districts qualified for Title I funds (higher-poverty districts) often allocate resources inequitably across their schools, arguing that requirements for receiving Title I funds should be strengthened.

The report is here: http://www2.ed.gov/rschstat/eval/title-i/school-level-expenditures/school-level-expenditures.pdf

Related resources here: http://www2.ed.gov/about/offices/list/opepd/ppss/reports.html#comparability-state-local-expenditures

It is certainly problematic that many public school districts have far-from-predictable, far-from-logical and far-from-equitable formulas for distributing resources across their schools. This is a problem that should be addressed, and improving comparability provisions for receipt of Title I funding is an appropriate step in this regard.

However, it is critically important to understand that improving within district comparability of resources across schools is only a very small piece of a much larger equity puzzle. It’s a drop in the bucket. Perhaps an important drop, but not one that will even come close to resolving the major equity issues that plague public education systems today.

I have written on this topic previously both on this blog and in peer reviewed publications:

  • Baker, B. D., & Welner, K. G. (2010). Premature celebrations: The persistence of interdistrict funding disparities. Education Policy Analysis Archives, 18(9). Retrieved [date] from http://epaa.asu.edu/ojs/article/view/718
  • Baker, B. D. (2009). Within-district resource allocation and the marginal costs of providing equal educational opportunity: Evidence from Texas and Ohio. Education Policy Analysis Archives, 17(3). Retrieved [date] from http://epaa.asu.edu/epaa/v17n3/
  • Baker, B. D. Re-arranging deck chairs in Dallas: Contextual constraints on within-district resource allocation in large urban Texas school districts. Forthcoming in Journal of Education Finance.

Among other things, I have pointed out on this blog that one reason why focusing on within-district disparities between “rich and poor” schools is misguided is that most of the disparities in wealth among families and children occur across district lines rather than within district boundaries. (2nd major point in post)

The new U.S. Dept. of Ed. report reinforces this overemphasis on within-district disparity, ignoring between-district disparity entirely. In part, it is perhaps politically more convenient to point the blame at local school district officials, rather than states, for not doing their part to improve equity across schools. Local school officials make good targets; it’s harder to pick on states & state legislatures.

Here’s one way in which the USDOE report casts the disparities:

The report compares the number of Title I (higher poverty) schools that have lower per pupil spending than non-Title I schools in the same district.  This becomes fodder for the news headlines. And I would argue, fuels public distraction from the bigger inequities.

Now, I have a multitude of methodological quibbles with this analysis. First, it compares only the average spending of Title I and non-Title I schools within districts, without accounting for other factors that frequently serve as strong predictors of school site spending across schools within districts (primarily, concentrations of children with disabilities, and district choices to locate specific programs in specific schools). Poverty is one factor – and a very important one at that – but it’s also important to look across the full range of poverty concentration across schools in a district, rather than just splitting schools into Title I and non-Title I. The Deck Chairs in Dallas article above provides examples of the steps one should take to evaluate equity in spending across schools within districts. So too does this article: http://epaa.asu.edu/ojs/article/view/5
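To make the point concrete, here is a minimal sketch – entirely fabricated data and hypothetical variable names – of the kind of school-level regression I mean: predicting per pupil spending from the full continuum of poverty concentration while controlling for special education share, rather than a crude Title I / non-Title I split.

```python
import numpy as np

# Hypothetical school-level data for one district (all values illustrative)
rng = np.random.default_rng(0)
n = 40                                    # schools in the district
pct_frl = rng.uniform(0.10, 0.95, n)      # % free/reduced lunch, as a continuum
pct_swd = rng.uniform(0.05, 0.25, n)      # % students with disabilities
# Simulated spending: modestly progressive on poverty, strongly driven by
# special education concentration (the confounder a crude comparison misses)
spend = 9000 + 1500 * pct_frl + 12000 * pct_swd + rng.normal(0, 400, n)

# OLS via least squares: spend ~ intercept + pct_frl + pct_swd
X = np.column_stack([np.ones(n), pct_frl, pct_swd])
coef, *_ = np.linalg.lstsq(X, spend, rcond=None)
print(f"poverty slope: {coef[1]:.0f} $/pupil per 100% FRL")
print(f"sped slope:    {coef[2]:.0f} $/pupil per 100% SWD")
```

In a sketch like this, a district could look “regressive” in a raw Title I vs. non-Title I comparison simply because its special education programs sit in lower poverty schools – which is exactly why the conditional slopes, not the raw averages, are the relevant quantities.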

But, let’s take a look at the more important issue that is missed entirely in the myopic focus on within district disparities and “blame the local districts” approach to school funding equity.

First stop, Philadelphia. This first graph shows the box plot of elementary school spending per pupil from the data set used in the USDOE report (nice new data to play with!). Philadelphia city elementary schools simply have far lower spending than elementary schools in surrounding districts (in Pennsylvania). THIS IS THE MAJOR EQUITY CONCERN!  Here’s how these funding differences play out along a continuum of all schools in the metro area (within PA) with respect to students qualified for free or reduced price lunch:

Philadelphia schools are in red. Indeed, the pattern of spending per pupil with respect to % free or reduced price lunch is not what I would want or expect to see across schools within Philadelphia. It actually appears somewhat regressive. That is, higher poverty schools within Philadelphia have marginally lower spending per pupil than lower poverty ones, though there may be other factors at play (such as special education population distributions) which complicate the interpretation of this relationship. But we also see that:

  1. the majority of Philadelphia elementary schools have near or over 80% free or reduced price lunch
  2. the majority of schools in this picture that are over 80% free or reduced price lunch are Philadelphia schools
  3. Philadelphia schools have systematically fewer per pupil resources than those of surrounding districts
  4. the majority of other schools in the metro area have fewer than 40% free or reduced price lunch
  5. these much lower poverty schools IN OTHER DISTRICTS have higher average spending.

These are the districts with which Philadelphia must compete to recruit and retain a sufficient quantity of high quality teachers. And it’s clearly a losing battle.

Focusing only on the disparities inside Philadelphia – bringing the comparability hammer down on the district – does little to resolve the bigger funding equity issues, which are a function of neglect by the Commonwealth of Pennsylvania, not the city of Philadelphia.

Not all metro areas look this bad. In many cases, central cities are on average or slightly above average for their metro areas. But arguably, not “enough” above average that they have wide latitude to reshuffle their resources aggressively to their higher poverty schools. Note that if Philadelphia did strive to create a strong progressive distribution of resources toward higher poverty schools, all other schools in the district would be left with next to nothing – at least relative to their surroundings. This is the very “deck chairs” issue I discuss in my paper on Dallas (well, actually on Texas as a whole).

It also turns out that many smaller cities, and very poor inner urban fringe areas (with particularly weak tax base) are often as disadvantaged or much more disadvantaged than the urban core. Places we don’t always hear about. Here’s one of my favorite small city examples, Utica, NY:

Utica City elementary schools (1 in the box plot) have much lower average per pupil spending than elementary schools in surrounding districts. Here’s the scatterplot with respect to % free or reduced price lunch:

Like Philadelphia, there appear to be inequities in resources across Utica City elementary schools. But again, most Utica City elementary schools have over 80% free or reduced price lunch and spend less per pupil than most elementary schools in surrounding districts, many of which are not wealthy districts by any stretch of the imagination. They’re just not as poor as Utica itself. Here’s a little more backdrop on the position of Utica among NY State school districts.

While it is important and relevant to consider ways to tighten regulations on Title I districts to require that they allocate resources equitably across schools within their boundaries, we cannot and should not let the emphasis on Title I and comparability distract us from the bigger equity issues – the harder equity issues to resolve. While it’s politically convenient to blame local bureaucrats (those overpaid fat cats in large city school district central offices), we must also maintain pressure on states to do the right thing and ensure that these districts have the resources they need in order to distribute them equitably.

see also: http://www.schoolfundingfairness.org/

Dealing with the Devil? Policy Research in a Partisan World

This note is in response to James O’Keefe’s attempt to discredit me on his Project Veritas web site (though I think his point was intended to be larger than this). I was lucky (?) enough to be part of one of his investigative set-ups earlier this fall. I wrote and held on to this post and all related e-mails.

His scheme was uncovered in this Huffington Post piece to which he refers in his most recent report:

http://www.huffingtonpost.com/mobileweb/2011/10/17/james-okeefe-economic-policy-institute_n_1015845.html

The story:

Back in September, I was contacted by the fictional Peter Harmon, who characterized himself as working for the Ohio Education Association but never made it absolutely clear that he was working for the state teachers’ union of Ohio. In my case, unlike the EPI case, Harmon didn’t (that I recall) indicate being a hedge fund guy or being backed by one, but rather that he had “funders.” He dropped me a phone message and an email which were pretty innocuous, so I agreed to talk by phone. That’s where I pick up in this string of e-mails:

===================================

EMAIL #2 – PHONE CALL SET UP

From: peter.harmon@ohioedassoc.org

Sent: Monday, September 19, 2011 10:14 PM

To: bruce.baker@gse.rutgers.edu;

gse.rutgers.edu/bruce_baker@ohioedassoc.org

Subject: Meeting

Dr. Baker,

Thank you for getting back to me.  We are eager to talk with you about this project. Would 3pm tomorrow work alright for you?

Sincerely,

Peter Harmon

614-468-3941

===================================

Then there was the strange phone call (which I’m quite sure in retrospect was recorded) where first, “Peter Harmon” wanted me to do a study showing that the collective bargaining legislation in Ohio would hurt children, to which I suggested that a) evaluating collective bargaining legislation is outside the realm of my expertise and b) that even if I agreed that it might, I’d have no clear, defensible way to analyze and argue that point.

From there I suggested things that I can and often do analyze and argue, in each case pointing out that the ability to make such an argument is contingent upon data to support that argument. For example, evaluating the competitiveness of teacher wages over time, or evaluating the distribution of state aid cuts. These are two issues on which I have already actually evaluated Ohio data. I pointed out that there are three basic types of products we might be talking about: a) critiques of policy reports or arguments by others (for a few thousand dollars), b) policy briefs/research brief reports (typically about ten thousand dollars), or c) a full-scale research report (thirty to fifty thousand dollars, with clarification that projects of this magnitude would have to go through RU and/or be done over the summer).  I attempted repeatedly to shift his focus to answerable questions and topics within my expertise, and to topics or issues where I felt I could be helpful to him, on the assumption that he was advocating for the state teachers’ union.

It got strange when Peter Harmon laid down his requirement that if they were going to fund a study, they didn’t want it coming out finding the opposite of what they wanted. I did explain that if he had a topic he was interested in, that I would be willing to explore the data to see if the data actually support his position on the issue and that I would do so before agreeing to write a report for him. The phone call ended with no clear agreement on anything, including no agreement on even what the topic of interest was.  In fact, my main point was repeatedly that he needed to figure out what the heck he even wanted to study, though I tried to keep it friendly and supportive. No reason to argue on a first phone call.

It was a strange and disturbing conversation, but I played along until I could get off the phone with the guy. Note that the playing along in a conversation like this also involves trying to figure out what the heck is up with the caller – whether he/she has a particular axe to grind – or other issues that would make any working relationship, well, not work out.

Sadly, as twisted as this phone call was, I’ve had similarly twisted conversations with real representatives of legitimate organizations. However, with most legitimate organizations, you can later identify the less sleazy contact person. My approach has generally been to humor them while on the phone… perhaps probe as to see how twisted they really are… and when the phone conversation ends….let it pass. Move on.

Then came the follow up:

===================================

EMAIL #3 – HARMON FOLLOW-UP

From: peter.harmon@ohioedassoc.org [mailto:peter.harmon@ohioedassoc.org]

Sent: Friday, September 23, 2011 10:01 AM

To: bruce.baker@gse.rutgers.edu; gse.rutgers.edu/bruce_baker@ohioedassoc.org

Subject: Next Meeting

Dr. Baker,

I have good news, my colleagues are very interested in moving forward.

We are confident we can cover the expense of this potential study.

We have a few ideas we would like to run by you for this project.

When would be a good time to call you next?

Regards,

Peter Harmon

614-468-3941

===================================

So now, Harmon is basically suggesting that he can generate the $30 to $50k figure which I had given him for a bigger study – a figure I had basically given him to encourage him to think about doing something else, like contracting a few short policy briefs or critiques. But he still has no idea what he supposedly wants me to write about. Quite honestly, that’s really strange. So my response is simple – essentially, get your act together and don’t bother me again until you do. In other words: here are a few examples of the work I do and am proud of; figure out your damn question and let me know when you do.

===================================

EMAIL #4 – BAKER REPLY

From: Bruce Baker [bruce.baker@gse.rutgers.edu]

Sent: Friday, September 23, 2011 10:06 AM

To: ‘peter.harmon@ohioedassoc.org’

Subject: RE: Next Meeting

Rather busy for next week or so. Would prefer if you could at least send an outline of potential topics & research questions of interest, so I can mull them over.

For examples of reviews/critiques of policy reports, see:

http://nepc.colorado.edu/thinktank/review-middle-class

http://nepc.colorado.edu/thinktank/review-spend-smart

For an example of a policy brief/research report, see:

http://nepc.colorado.edu/publication/NYC-charter-disparities

http://nepc.colorado.edu/publication/private-schooling-US

Thanks.

Bruce Baker

===================================

Here’s Harmon’s attempt at figuring out his question:

===================================

EMAIL #5 – HARMON REPLY

Dr. Baker,

Thanks for getting back to us.

Once of the topics we want to pursue is research regarding spending.

Specifically and increase in spending having a good effect on children. If you need to limit the scope of your research to a specific county, district or other local geographic area. that’s OK.

I will take a closer look at the examples you sent on your last email to get a better idea of what you would like from our end.  But,I hope this more specific goal better illustrates what we are looking for.

Let me know when would be good time to call, so I can clarify whatever questions you have about this.

Peter Harmon

614-468-3941

===================================

So, Peter Harmon wants me to explain, or more strangely to show that increasing spending is good for children. Okay. Anyone even modestly informed would know that’s an odd way to frame the question or issue. But clearly, given my body of work, I have argued on many occasions in writing and in court that having more funding available to schools can improve school quality, which is something I would certainly argue is good for children. Would I somehow use data on a specific district or county to do this? No…. uh… not sure? I’d probably start with an extensive review of what we already know from existing research on money and school quality.

At this point, I’m ready to drop the whole discussion, but receive an e-mail notice of a new Economic Policy Institute paper on public employee wages in Ohio. So, to save Mr. Harmon money paying for a new study on this topic, I a) send him a link to that study, and b) explain that I’m already working on a paper related to his issues of concern.

=================================== 

 EMAIL #6 – BAKER REPLY

From: Bruce Baker [bruce.baker@gse.rutgers.edu]

Sent: Thursday, October 06, 2011 10:44 AM

To: ‘peter.harmon@ohioedassoc.org’

Subject: FYI

From one of my Rutgers colleagues:

Click to access Briefing_Paper_329.pdf

Working on some related projects myself, which may be of use to you in near future. Will be back in touch as schedule frees up.

Bruce

===================================

And so it ended. And as I suspected by this point, it appears that this whole thing was a sham… and an attempt at a sting. Interestingly, this appears to be when Harmon moved on to go after EPI.

Quite honestly, O’Keefe’s concept for the investigation isn’t entirely unreasonable except that he and his colleagues didn’t seem to fully understand the fundamental difference between research projects per se, and policy analyses – between writing summaries and opinions based on data that already exist and research that’s already been done – versus exploring uncharted territory – where the data do not yet exist and where the answers cannot yet be known.

At this point, I think a few clarifications are in order about doing policy research, or more specifically writing policy briefs in a highly political context.

First, why would I ever vet the data on an issue before signing on to do work for someone? Well, this is actually common, or should be in certain cases. For example, let’s say the funder wants me to show that “teachers in Ohio are underpaid.” I don’t know that to be true. I’m not going to take his money to study an issue where he has a foregone conclusion and a political interest in that conclusion but where the data simply don’t support it. It is relatively straightforward for me to check whether the data support the conclusion before I agree to write anything about it. This is an easy one to check. There is a standard set of databases to use, including statewide personnel data, census data and Bureau of Labor Statistics data, and there are standard, credible methods for comparing teacher wages. If the argument holds up under the most conservative analysis (the analysis most deferential to the “other side” of the argument), then it’s worth discussing how to present it or whether to move forward.

A different type of example which I’ve learned by experience is that it’s always worth taking a look at the data before engaging as an expert witness on a school funding related case. I often get asked to serve as an expert witness to testify about inequities or inadequacies of funding under state school finance systems. Sometimes, attorneys have already decided what their argument is based only on the complaints of their clients. It would be utterly foolish of me to sign on to represent those clients and accept payment from them without first checking the data to see if they actually have a case.

Then there’s the issue of doing work for partisan clients to begin with. That’s a different question than doing work for sleazy clients. Sometimes a legitimate organization has a sleazy contact person, but further checking reveals that the organization as a whole is credible – and not sleazy. But back to the point…

Quite honestly, the toughest kind of policy analysis to do is for partisan clients – clients with an axe to grind or a strong interest in viewing an issue in one particular way. That is usually the case in litigation and increasingly the case when it comes to writing policy briefs on contentious topics. What this means is that the analyses have to be “bullet-proof.” There are a few key elements to making an analysis “bullet proof.”

First, the analysis must be conservative in its estimates, and one must avoid at all costs overstating any claims favored by the client. In fact, the analysis needs to be deferential, perhaps even excessively so, to the opposing view.

Second, the analysis must use standard, credible methods that are well known, well understood and well documented by others. Examples in my field would include comparable wage analysis, or wage models which typically include a clearly defined set of variables.

Third, the analysis must rely on publicly accessible data, with preference for “official” data sources, such as state and federal government agencies. This is because the analyses should be easy for any reader to replicate by reading through my methods and downloading or requesting the data.
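As an illustration only – fabricated numbers and a hypothetical, simplified variable set – a “standard wage model” of the sort I mean regresses log wages on education, experience, and an occupation indicator, with the coefficient on the teacher dummy giving the conditional wage gap. Anyone with the public data could replicate something like this:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
educ = rng.integers(16, 21, n)           # years of education (16-20)
exper = rng.uniform(0, 30, n)            # years of experience
teacher = rng.integers(0, 2, n)          # 1 = teacher, 0 = comparison occupation
# Simulated log weekly wage with a built-in teacher gap (illustrative, not real data)
log_wage = 4.0 + 0.08 * educ + 0.02 * exper - 0.12 * teacher + rng.normal(0, 0.1, n)

# OLS: log_wage ~ intercept + educ + exper + teacher
X = np.column_stack([np.ones(n), educ, exper, teacher])
coef, *_ = np.linalg.lstsq(X, log_wage, rcond=None)
gap_pct = (np.exp(coef[3]) - 1) * 100    # teacher dummy converted to a % wage gap
print(f"conditional teacher wage gap: {gap_pct:.1f}%")
```

The “bullet-proof” discipline is less about the model itself than about the choices around it: use the most conservative specification, stick to well-documented methods like this one, and draw only on data the other side can download and check.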

So here are my final thoughts on this issue…

If this kind of stuff causes anyone to place greater scrutiny on my work or that of any others writing policy briefs on contentious topics, that’s fine. It’s not only fine, but desirable. I am fully confident that my work stands on its own. Unlike some, I don’t simply take a large commission to offer my opinion without ever having looked at any data. For example, Eric Hanushek of Stanford University took $50,000 from the State of Colorado to testify that more money wouldn’t help kids and that Colorado’s school funding system is just fine, without ever having looked at any data on Colorado’s school funding system. See:

http://www.edlawcenter.org/news/archives/school-funding/what-hanushek-shows-up-again.html?searched=hanushek&advsearch=oneword&highlight=ajaxSearch_highlight+ajaxSearch_highlight1

By contrast, I did indeed accept a payment of $15,000 for writing a nearly 100 page report filled with data and detailed analyses of Colorado’s school funding system raising serious questions about the equity and adequacy of that system (available on request). In fact, I had already come to the conclusions about the problems with Colorado’s school funding system long before I was engaged by the attorneys for the plaintiff districts (as one will find in many of my blog posts referring to Colorado).

My rule #1 is always to check the data first and to base my opinions on the data. So I welcome the scrutiny on my work and I especially welcome it directly. If you have a criticism of my work, write to me. The more scrutiny on my work the better.

=========

Note #1: for an example of the types of policy briefs and/or analyses to which I am referring here, see:  NY Aid Policy Brief_Fall2011_DRAFT6

In my view, this is a solid, rigorous and very defensible analysis. It is a policy brief. It uses numerous sources of publicly available data. And, it was written on behalf of an organization which has self-interested concerns with the NY school finance formula.

Note #2: Indeed there were some poor word choices on my part in the phone conversation. “Play with data” is how I tend to refer to digging in and vetting the data to see what’s there. This blog is dedicated to what I would refer to as playing with data.  Looking stuff up. Downloading large data files (IPUMS, NCES). Running statistical models. My friends and colleagues, as well as my students know full well that I take great joy in working with data and that I consider it play.  But I’ll admit that it sure doesn’t sound too good when taken out of that context.

Note #3: A few people have asked about the portion of the conversation where I suggest that if I find results that do not support the funders’ views, I will not charge them for the work. Some have suggested that this is an example of burying an undesirable result, which would in my view be unethical. So, what’s the point of not charging them? Actually, it’s so that the result won’t get buried. If I do a bunch of preliminary data analyses only to find that the data do not support a funder’s claims/preferences, I’d rather not write up the report for the funder and charge him/her, because they then own the report and its findings, and have the control to bury it if they so choose. Now, I typically don’t permit gag-order type clauses in my consulting contracts anyway, but, it’s much easier just to avoid the eventual pissing match over the findings and any pressure to recast them, which I will not do.  If I keep the results of my preliminary work for myself, then I have complete latitude to do with them as I see fit, regardless of the funder’s preferences. It’s my out clause. My freedom to convey the findings of any/all work I do.

I’ve come to this approach having had my results buried in the past on at least two occasions, one in particular where the funder clearly did not want the results published under their name due in part to pending litigation in which they were a defendant. Much to my dismay, the project coordinators (agency that subcontracted me) capitulated to the funder. I was, and remain to this day, deeply offended by the project coordinator’s choice under pressure by the funder, to edit the report and exclude vital content. Yeah… I got paid for the work. But the work got buried, even though the work was highly relevant. I’m unwilling to go down that road again.

License to Experiment on Low Income & Minority Children?

John Mooney at NJ Spotlight provided a reasonable overview of the NJDOE waiver proposal to “reward” successful schools and sanction and/or takeover “failing” ones.

The NJDOE waiver proposal includes explanation of a new classification system for identifying which schools should be subject to state intervention, ultimately to be managed by regional offices throughout the state. This new targeted intervention system classifies districts in need of intervention as “priority” districts, with specific emphasis on “focus” districts. Mooney explains:

In all, 177 schools — known as Focus Schools — fell into this category, largely defined as the bottom 10 percent in terms of the achievement gaps between the highest- and lowest-performing student groups over three years.

http://www.njspotlight.com/stories/11/1117/0003/

The new system also has a reward program:

The same list also includes the schools that the state designates as Reward Schools, based on both their overall achievement and their progress. Reward Schools with high poverty concentrations will also be rewarded with cash: $100,000 each.

http://www.njspotlight.com/stories/11/1117/0003/

But, some significant questions persist as to whether the state is over-reaching its authority to intervene in the “focus” and priority schools. Here are a few comments from a related article:

“Consistent with state law, they can go in and direct districts to take particular actions,” said David Sciarra, director of the Education Law Center that has spearheaded the Abbott litigation. “All of that, they clearly have the authority to do.

“But nothing that I am aware of allows them to close existing schools,” he said. “And they have no power to withhold funds. That’s even outside the scope of the federal guidelines. ”

Paul Tractenberg, a Rutgers Law School professor and noted expert on education law, said he also questioned whether the application’s reform plans ran counter to the state’s current school-monitoring system, the Quality Single Accountability Continuum (QSAC).

“As a constitutional matter, it is pretty clear the commissioner has whatever power he needs to ensure a thorough and efficient education,” he said. “But that’s different than saying if there is a legislation out there, he can just ignore it.”

In terms of significant alterations such as reassigning staff or directing changes in collective bargaining, Tractenberg said, “there are all kinds of big-time issues about their legal authority to do that.”

http://www.njspotlight.com/stories/11/1117/2359/

Of course, a related twist here is just which schools are involved. NJDOE, like other state agencies, has adopted a set of performance metrics most likely to single out schools serving the largest shares of low income and minority students for dramatic interventions – for school closure, or for major staffing disruptions (strategies with little track record of success).

Here’s the breakdown of which schools will be subject to closure, staff replacement or other intervention, versus those who will be left alone and those eligible for a check for $100,000.


When considering racial composition, poverty and geographic location (metro area) simultaneously as predictors of school classification:

  • A school that is approaching 100% free lunch is nearly 30 times more likely to be classified as a focus school (as opposed to all other categories including priority) than a school that is 0% free lunch.
  • A school that is approaching 100% free lunch is nearly 60 times more likely to be either a priority or focus school (compared to all other options) than a school that is 0% free lunch.

While the typical FOCUS school is 26% black, 39% Hispanic and 51% free lunch, the typical reward school is 7.2% black, 11.3% Hispanic and 10.3% free lunch.
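To make the “X times more likely” statements above concrete: in a logistic regression of focus-school status on free lunch share (scaled 0 to 1), the odds ratio for moving from 0% to 100% free lunch is simply exp of the logit coefficient. A toy computation (the coefficient here is illustrative, chosen only to show the arithmetic behind a roughly 30-fold odds ratio):

```python
import math

# Hypothetical logit coefficient on free lunch share (0-1 scale); illustrative only
b_free_lunch = 3.4

# Going from 0% to 100% free lunch multiplies the odds of focus classification
# by exp(b) -- this is the "nearly 30 times more likely" scale reported above
odds_ratio = math.exp(b_free_lunch)
print(f"odds ratio, 0% -> 100% free lunch: {odds_ratio:.1f}")
```

Note that this is an odds multiplier, not a probability multiplier, but at this magnitude the substantive point is the same: the classification system is overwhelmingly sorting schools by student poverty.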

[note: several NJ schools had missing data in the 2009-10 NCES Common Core of Data which were merged with the NJDOE schools list http://www.njspotlight.com/assets/11/1116/2300. Total school enrollment data were most commonly missing, and where possible were replaced with the sum of racial subgroup data for calculating racial composition. Complete data were matched and available for 160 of the 177(9?) focus schools and 120 of the 138(?) reward schools. Thus, I am sufficiently confident that the above patterns will hold as remaining missing data are added.]

NJDOE will likely argue that they are intervening in these schools because poor and minority kids are the ones getting the worst education, which may in part be true. But causal attribution to the teachers and administrators in these schools and districts stands on really shaky ground – especially on the statistical basis provided by NJDOE.  The accountability framework chosen is merely identifying schools by the extent of the disadvantage of the students served and not by any legitimate measures of the quality of education being provided.

Further, and perhaps most disturbing, is that this policy framework, like those proposed and used elsewhere is, in effect,  (self-granted) license for NJDOE to experiment on these children with unproven “reform” strategies which are as likely to do harm as to do good (that is, likely to do more harm than even simply maintaining the status quo).  Helen Ladd’s recent presidential address at the Association for Public Policy Analysis and Management provides exceptional insights in this regard!

Why we need those 15,000+ local governments?

Neal McCluskey at the Cato Institute makes a good point about our casual, imprecise use of the term “democracy” in the post linked here. I did not delve into this in my previous post, and more or less allowed the imprecise terminology to slip past. Clearly there are huge differences between simple majority rule through direct democracy and our constitutional republic with separation of powers, and I certainly favor the latter.

My original point was that Bowdon completely misrepresents not just a single judicial decision in Georgia, but the notion of the “will of the people” as expressed through our form of government, especially in Georgia and especially in this case. By Bowdon’s strange logic, the will of the people in Georgia is only expressed through the legislation adopted by elected state officials – the state legislature. Local elected officials apparently don’t count – and in Bowdon’s view, the choice of these local elected officials to challenge the constitutionality of state legislative action is somehow an attack on the will of the people. Further, the judicial mediation of this dispute – by an elected judiciary – is an extension of that attack on the will of the people?

Really, the big question which goes back to Mike Petrilli’s post is determining the right balance between centralized versus local control, as carried out by our elected officials at each level. Certainly the process of electing our officials at either the local, state or federal level can become corrupted over time. Local elections can be corrupted (or at least become less expressive of the “will of the people”) by imbalanced influence (the will of some preferred more than others) on those elections and so too can state and federal elections. It would seem that Petrilli’s core argument is that local elections are necessarily most corrupt and most imbalanced because, as he sees it, local elections are entirely controlled, essentially owned by teachers’ unions, whereas state and federal elections clearly remain more pure? less influenced by imbalance of money/power? So, essentially, Mike’s argument is that we must negate the policy decision making power of the most corrupted level of the system, which in his view, are local elected officials. I find that a really hard argument to swallow.

Alternatively, one can argue in favor of centralization, as I used to (and still do on some occasions): that the higher levels of government should – by representing larger and more diverse constituencies and by having greater access to resources (including bigger budgets) – be able to accumulate better technical capacity to make more informed policy decisions. That is, to develop, design, and adopt policies better grounded in technical analysis of what works. I’ve become increasingly cynical on this point of late, and quite honestly, I’m generally unwilling to see the overall power distribution shift more heavily from local to state, and especially to federal, policy decision making.

I still feel strongly that due to economic inequities in tax base and other measures of the collective fiscal capacity of communities to provide schools – many of which were induced by policies of housing segregation and discrimination – states must play a strong role in revenue redistribution in order to ensure that children, regardless of where they live, have access to equitable and adequate schooling. This is perhaps where my perspectives begin to diverge most dramatically from McCluskey’s preferred policy solutions (though we’ve not debated/discussed the particulars).

I still feel that state agencies can (in their better days) provide technical support to local schools and districts that are struggling. But I fear that state agencies (departments of education) have become increasingly politicized and, instead of providing technical support, are now invariably promoting political agendas (perhaps I’m just waking up to something that’s been occurring all along?) – in many cases forcing ill-conceived, politically motivated “reforms” on struggling districts and schools rather than ensuring access to sufficient resources. See my previous post on pundits vs. practitioners.

So, at this stage in my life and career, I’m not willing to accept the idea of eliminating entirely the role of local elected officials (or even unbalancing these roles further), as Mike Petrilli might wish. Nor do I accept that a reason for eliminating local elected officials from the mix is that local elections are the most corrupted by money and uneven influence (of unions?). This seems merely an argument of convenience from the Petrillian standpoint: right now, he just happens to agree more with the policies of states – and with the potential to influence federal policy in order to control states – than with the current push-back from locals. That’s a rather common perspective from inside the beltway (physically or mentally). It’s logistically easier for an organization like the Fordham Institute (which casts itself as providing research/technical guidance?) to have disproportionate impact on policy through a single locus of control – the federal gov’t – than through 15,000 local governments (that takes a lot of leg work). And that’s precisely why we need those 15,000+ local governments!

Logic and Facts, not Democracy, be Damned!

Thanks to good ol’ Mike Petrilli, much of this week’s education policy debate has centered on the relevance of local school boards and the age-old tug-of-war between state and local authority over the operation and financing of local public school districts. Much of the debate has been framed in terms of “democracy,” and much of it has been rather fun and interesting to watch. That is, until Mike and the crew at Fordham decided to let Bob Bowdon (of Cartel fame) join the conversation and inject his usual bizarre understanding of the world as we know it.

This time, picking up where Petrilli had left off, Bowdon opined about how teachers unions and their advocates repeatedly cry for respect for democracy while consistently thwarting democratic efforts through legal action. The layers of absurdity in Bowdon’s logic are truly astounding, and perhaps best illustrated by walking through one of the examples he chooses.

Here’s how Bob Bowdon explains the Georgia charter school governance and finance decision of May 2011:

When the elected legislature in Georgia authorized the state’s chartering of schools, the Georgia Association of Educators union wasn’t so happy with the voice of the people. They later filed a brief in support of a lawsuit to strike down the law — and that suit prevailed. Democracy be damned.

http://www.educationgadfly.net/flypaper/2011/11/who-has-a-problem-with-democracy/

So, according to Bob Bowdon, the way this rather ambiguously referenced case played out was that the Georgia legislature, acting entirely on the will of the good Georgians who elected them, passed a law establishing a statewide commission to oversee the operation of charter schools and the distribution of funding to them. The state teachers union got pissed simply because they don’t like charter schools. The union filed a brief with a sympathetic liberal activist court, which then, under no authority at all – merely being responsive to the gripes of the teachers union – struck down the charter law. A major blow against democracy. Democracy be damned!

Okay. Let’s take a closer look at what actually happened.  One reasonable summary can be found here: http://www.accessnorthga.com/detail.php?n=238715, see also: http://www.earlycountynews.com/news/2011-05-18/Front_Page/Court_ruling_leaves_charter_schools_in_limbo.html

First, let’s acknowledge that Georgia, like other states, has a) elected state officials – the legislature – who pass laws, such as the charter school law they had passed, which would allow a state commission to redirect county funding (county and area district tax revenues) to charter schools established within their boundaries [by way of reducing state aid in an equal amount], b) county and area boards of education charged with establishing and maintaining public schools within their limits, and c) a state constitution which outlines these responsibilities (http://www.sos.ga.gov/elections/GAConstitution.pdf, bottom of page 60). That’s kind of how stuff works in U.S. states.

The county board of education in Gwinnett County, GA was not thrilled when informed it would be required to transfer significant funds to charter schools established under the legislatively granted authority of the state commission. The Gwinnett County board (joined by many others to follow) argued in court that the legislature had violated the constitution by granting this state commission the authority to redistribute county tax revenues – and more specifically – to establish and maintain schools (that would draw on those tax revenues). So, one level of elected officials – county officials – charged that another level of elected officials – the state legislature – had interfered with their explicitly stated constitutional authority. And the court mediated this dispute (uh… ‘cuz that’s what courts do), finding in favor of the elected officials whose authority to establish and maintain schools was clearly articulated in the constitution?

How in the hell is that a case of “democracy be damned?” How is this a case of a union thwarting the “voice of the people”? Quite honestly, these are among the most bizarre, warped distortions of reality I’ve seen in a damn long time.

That makes about as much sense as the rest of the arguments in the Cartel movie, or in the graphs at the end of this post!


Note: Another fun twist here is that apparently, in Georgia, judges are elected (http://www.georgiaencyclopedia.org/nge/Article.jsp?id=h-2841). Democracy be damned I tell you! How can these elected officials overturn the will of the people as expressed by the elected legislature, when challenged in court by elected county officials?

The Wrong Thinking about Measuring Costs & Efficiency in Higher Education (& how to fix it!)

There is a movement afoot to reduce the measurement of the value of public institutions of higher education to a simple ratio of the revenue brought in by full time faculty members divided by the salaries and benefits of those faculty members. That is, does each faculty member “pay” for him or herself, on an annual cash flow basis?[1]

Even some of the finest major public colleges and universities have recently succumbed to reporting such information, arguably, in an effort to appease politically motivated critics.[2] This seemingly simple ratio of the “net cost” of faculty salaries and benefits is presumed representative of the relative efficiency of higher education institutions and/or entire public systems of higher education.

This is a dreadfully oversimplified, if not simply wrongheaded, approach to measuring the cost of providing public higher education. It is also the wrong approach to characterizing the efficiency of production of higher education institutions or higher education systems, largely because it entirely ignores the question of what higher education institutions produce. More importantly, measuring institutional performance and efficiency in this way does little or nothing to inform policymakers or institutional leaders on how to get more bang for the buck from higher education. That is, how to generate greater economic benefit to the state or society as a whole, by achieving more efficient production of an educated citizenry.

Arguably, the greatest economic (setting aside cultural and social) value-added of public higher education systems is achieved when those systems can efficiently transform high school graduates into college graduates, with all of the economic and societal benefits bestowed on them (at least in relative terms). This is especially true for high school graduates from low-income backgrounds, including first-generation college students. Accepting an economic emphasis, public higher education institutions can and should substantially improve the economic outlook and lifelong earnings of students who otherwise have the least likelihood of college degree completion. Therein lies public higher education’s role in providing value added to the economy and to society as a whole.

As such, what we must begin to better understand is how colleges and universities can improve the efficiency with which they produce undergraduate (and graduate) degrees across a variety of fields, and for students of varied backgrounds. Further, we must establish metrics of cost and efficiency that promote the right incentives for faculty and institutions of higher education to improve degree production, especially for those students previously least likely to complete their undergraduate education in a timely and efficient manner. The current policy rhetoric and proposed metrics do little or nothing to advance these policy objectives.

Flawed Reasoning and Bad Incentives of the Net-Value Approach

Under the politically popular model of faculty “net value,” the basic underlying assumption is that higher education faculty are worth as much as the sum of a) the grant funding they bring to the institution and b) the student credit hours they produce, thus generating tuition revenue. It is then assumed that if the state-subsidized portion of the faculty member’s salary is greater than the sum of the other two values, that faculty member is inefficient (or not worth it). Therefore, the incentives for any faculty member formally evaluated, or even informally characterized, by this model are to track down enough external grant and contract funding to pay his or her own salary in full, and/or to teach enough sections of large classes and recruit enough students into those classes to cover salary and benefits. The same incentives apply to all faculty. But both are counterproductive.
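To make the arithmetic concrete, here is a minimal sketch of the net-value calculation described above. All names, figures, and the revenue-attribution rule are hypothetical – real institutions allocate tuition and state subsidy in far messier ways – but the logic of the metric really is this thin:

```python
# A minimal sketch of the "net value" cash-flow metric: revenue
# attributed to one faculty member minus his or her compensation.
# All figures below are invented for illustration.

def net_value(grant_revenue, credit_hours, tuition_per_credit_hour,
              salary_and_benefits):
    """Annual cash-flow 'net value' of a single faculty member."""
    revenue = grant_revenue + credit_hours * tuition_per_credit_hour
    return revenue - salary_and_benefits

# A hypothetical faculty member teaching two 60-seat sections of a
# 3-credit course (360 student credit hours), with a modest grant:
value = net_value(grant_revenue=20_000,
                  credit_hours=2 * 60 * 3,
                  tuition_per_credit_hour=300,
                  salary_and_benefits=110_000)
print(value)  # 18000 -- "worth it" by this metric, whatever the students learned
```

Note that nothing in the calculation asks whether any of those 360 credit hours moved a student closer to a degree, which is precisely the problem.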

If the mission of public higher education is to produce an educated citizenry that contributes to the economy and society as a whole, as well as to be a direct engine of economic development through research and scholarly productivity, then having all faculty focus their efforts on chasing external funding to cover their costs, reducing or eliminating teaching from their responsibilities, is counterproductive. Second, producing credit hours and generating tuition may also operate at odds with helping college students progress most efficiently toward degree completion. Maximizing course enrollments generates tuition and credit hours, but may actually slow progress toward completion as more students get lost in the shuffle. It also reduces the incentive to provide lower-enrollment, higher-level courses that may improve completion rates.

The net-value metric is at best neutral to whether institutions try to move students forward toward completion, or allow them to flounder, repeat numerous (large enrollment) courses and never quite reach the end goal. That just doesn’t make sense, on many levels.

Finally, using this net-value metric forces the same incentive structure onto all faculty members uniformly, encouraging each to act as an autonomous agent choosing one or the other approach to covering his or her margin.

Understanding the Role of Student Behaviors

How might we better think about productivity and efficiency in higher education? Again, consider that a primary goal is the efficient production of degreed or credentialed graduates. That is, taking high school completers and moving them efficiently through their coursework to degree completion, at which point they are likely, at the very least, to be higher wage earners than they otherwise might have been – and, in an even better light, more likely to contribute significantly to the economy and society as a whole.

Higher education institutions consist of a maze of pathways, often navigated naively (or at least irregularly) by college students trying to find their way toward that light at the end of the tunnel. Evaluating the relative efficiency of higher education institutions requires that we better understand these student behaviors – student course-taking patterns – and figure out a) which behaviors are more (and less) associated with successful degree completion and b) whether institutional constraints or supports make any difference. It is naïve, if not completely ignorant, to try to evaluate the productivity or efficiency of higher education systems and their economic contributions (or financial drain) without considering these student behaviors and how to influence them.

First, understanding student pathways helps us see who is more likely to complete a degree in a timely manner. Further, for those critics of higher education who believe that too many students are pursuing (or at least completing) “useless” degrees in “unproductive” fields, it is important to understand how and why students migrate across degree programs through their course selection behavior.

For example, let’s say we believe society needs more electrical engineers than economists – a reasonable assertion indeed! (Note the old adage that majoring in EE [electrical engineering] refers to “eventual economics.”) Evaluation of course-taking behaviors may reveal that many EE majors become economics majors (without really wanting to) after performing poorly in specific lower-level engineering courses, for a variety of reasons. It may be that these students would still have been great engineers and would have flourished in their higher-level courses. But perhaps course delivery approaches (large lectures), lack of supports, or other institutional barriers are partly at fault. Identifying these barriers and shifting institutional policies may lead to increased production of electrical engineering completers (and, most importantly, a decrease in future economists).

Linking Student Behaviors to their Cost & Efficiency Implications

Building on an understanding of student pathways, we should shift our focus toward the way groups of faculty members, and the sequences of courses (and degree programs) they provide, lead to differences in the likelihood of degree completion, in time to completion, and in the total costs of degree completion. This is another area where higher education cost research has gone awry in the past. One cannot calculate the differences in the costs of producing an economics versus an engineering major simply by looking at the costs of operating those departments. Departments are top-down organizational units of universities, but students pursuing a degree in any one field take courses across many units. Instead, we can estimate the cost per credit hour for any student taking any course in the university, then estimate the cumulative costs of common student pathways, and identify the higher and lower average and total cost pathways toward any one degree.
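The accounting shift described above – from departmental budgets to per-course unit costs accumulated along a student’s actual sequence – can be sketched in a few lines. Course names, credit counts, and unit costs here are all hypothetical:

```python
# Sketch of pathway costing: attach a (hypothetical) cost per credit
# hour to each course, then accumulate costs along the sequence of
# courses a student actually takes -- including repeats.

cost_per_credit = {  # $ per credit hour, by course (invented values)
    "calc_1": 250, "calc_2": 250, "circuits_1": 420, "econ_101": 180,
}
credits = {"calc_1": 4, "calc_2": 4, "circuits_1": 3, "econ_101": 3}

def pathway_cost(courses):
    """Total instructional cost of one student's course sequence,
    counting a repeated course each time it is taken."""
    return sum(cost_per_credit[c] * credits[c] for c in courses)

# The same nominal pathway costs more when a course must be repeated:
direct = pathway_cost(["calc_1", "calc_2", "circuits_1"])
repeat = pathway_cost(["calc_1", "calc_2", "calc_2", "circuits_1"])
print(direct, repeat)  # 3260 4260
```

The point of the toy example is that the unit of analysis is the student’s pathway, not the department’s budget line – which is what lets us compare higher- and lower-cost routes to the same degree.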

Taking this approach, we might find, for example, that offering smaller class sizes (thus higher unit costs) in specific lower-tier courses decreases the likelihood of repeating those courses and/or increases the likelihood of successfully completing subsequent courses, leading to an overall more efficient pathway to degree completion. But under the current model of evaluating the net cash value of faculty, the incentive works in the opposite direction, encouraging filling seats over completing degrees and programs.

We might find that offering additional supports in lower-level engineering calculus courses for students from disadvantaged backgrounds (who attended high schools with weaker math and physical science programs) leads to a greater likelihood of timely degree completion in electrical engineering – and, further, that doing so significantly decreases the average cost to degree completion by decreasing course repeats. Again, the current net-value approach creates the opposite incentive, favoring course repeats that beef up credit hour production in high-enrollment, lower-level classes.

In reality, the unit costs of any single course, or net value of the faculty member delivering that course, matter far less than how that course more broadly influences the cost of degree completion overall.

Institutional and Public Policy Implications

For progress to be made in the current policy conversations around higher education costs and efficiency, we must improve our metrics and must link new metrics to a much deeper understanding of just how higher education systems work, the role of individual student behaviors and the complexity of the delivery systems and institutional structures designed to serve those students.

We must also be cognizant of the fact that higher education systems are not, as often characterized in policy rhetoric, uniformly stagnant structures of ancient origin with a single woefully inefficient, exorbitantly costly, and arcane governance and program delivery structure. Arguably, many of the elite institutions that best fit this caricature (elite private liberal arts colleges), while sustaining themselves with very high tuition, also achieve very high degree completion rates, albeit for the most advantaged high school graduates.

By contrast, in recent decades we have seen a dramatic proliferation of alternative delivery mechanisms, including the rapid expansion of online and for-profit higher education institutions. Many of these alternative delivery institutions have begun to disproportionately serve high school graduates with the least likelihood of timely (6 years or less) degree completion, and have done so at substantial public expense through access to federal student loans. Evaluated on a net-value-of-faculty basis, these institutions likely look quite good. They must, in order to achieve their desired financial bottom line. Yet that financial bottom line (and in some cases stock value) comes at taxpayer expense: high rates of loan default, and the societal and economic costs of dismal rates of completion of meaningful degrees or credentials.

Getting higher education cost and efficiency measures right is critically important for informing the policy debate and for informing institutional practices. Getting these measures right means the difference between incentivizing non-productive course credit and financial debt accumulation versus incentivizing timely degree completion. When one group of students completes their degrees in a timely fashion, institutions have more resources available for the next wave. Finally, getting these measures right means the difference between a) having each and every faculty member in public institutions of higher education operate autonomously and inefficiently out of self-interest, often to the disadvantage of their students, or b) having faculty working collectively with colleagues and their institutions to improve degree production for the benefit of students, and the broader economy.