At the Intersection of Money & Reform Part III: On Cost Functions & the Increased Costs of Higher Outcomes

In my 2012 report Does Money Matter in Education?, I addressed the education production function literature, which seeks to establish a direct link between the resources spent on schools and districts and the outcomes achieved by students. Production function studies examine how variation in resources across schools and settings is associated with variation in outcomes across those settings, and whether changes in resources lead to changes in the level or distribution of outcomes.

I have written previously on this blog about the usefulness of education cost functions.

The Education Cost Function

The education cost function is the conceptual flip side of the education production function. Like production function research, cost function research seeks to identify the link between spending variation and outcome variation, cross-sectionally and longitudinally. The goal of the education cost function is to discern the levels of spending associated with efficiently producing specific outcome levels (the “cost” per se) across varied geographic contexts and schools serving varied student populations. Most published studies applying cost function methodology use multiple years of district-level data, within a specific state context, and focus on the relationship between cross-district (over time) variations in spending and outcome levels, considering student characteristics, contextual characteristics such as economies of scale, and labor cost variation. Districts are the unit of analysis because they are the governing unit charged with producing outcomes, raising and receiving the revenues, and allocating the financial and human resources for doing so. Some cost function studies evaluate whether varied expenditures are associated with varied levels of outcomes, all else being equal, while other cost function studies evaluate whether varied expenditures are associated with varied growth in outcomes.

The existing body of cost function research has produced the following (in some cases obvious) findings:

  1. The per-pupil costs of achieving higher-outcome goals tend to be higher, across the board, than the costs of achieving lower-outcome goals, all else being equal.[1]
  2. The per-pupil costs of achieving any given level of outcomes are particularly sensitive to student population characteristics. In particular, as concentrated poverty increases, the costs of achieving any given level of outcomes increase significantly.[2]
  3. The per-pupil costs of achieving any given level of outcomes are sensitive to district structural characteristics, most notably, economies of scale.[3]

Researchers have found cost functions of particular value for evaluating the different costs of achieving specific outcome goals across settings and children. In a review of cost analysis methods in education, Downes (2004) explains: “Given the econometric advances of the last decade, the cost-function approach is the most likely to give accurate estimates of the within-state variation in the spending needed to attain the state’s chosen standard, if the data are available and of a high quality” (p. 9).[4]

Addressing the critics

This body of literature also has its detractors, including, most notably, Robert Costrell, Eric Hanushek and Susanna Loeb (CHL), who, in a 2008 article, assert that cost functions are invalid for estimating costs associated with specific outcome levels. They assert that one cannot possibly identify the efficient spending level associated with achieving any desired outcome level by evaluating the spending behavior of existing schools and districts, whose spending is largely inefficient (because, as discussed above, district expenditures are largely tied up in labor agreements that, according to these authors, are in no way linked to the production of student outcomes). If all schools and districts suffer such inefficiencies, then one cannot possibly discern underlying minimum costs by studying those institutions. However, CHL’s argument rests on the assumption that desired outcomes could be achieved while spending substantially less and entirely differently than any existing school or district spends, all else being equal. Evidence to this effect is sparse to nonexistent.[5]

Authors of cost function research assert, however, that the goal of cost modeling is more modest than exact predictions of minimum cost, and that much can be learned by better understanding the distribution of spending and outcomes across existing schools and districts, and the varied efficiency with which existing schools and districts achieve current outcomes.[6] That is, the goal of the cost model is to identify, among existing “outcome producing units” (districts or schools), the more (and less) efficient spending levels associated with given outcomes, where those more efficient spending levels associated with any given outcome provide a real-world approximation, approaching the minimum costs of achieving those outcomes.

CHL’s empirical critique of education cost function research centers on a falsification test applying findings from a California study by Jennifer Imazeki (2008).[7] The critique was published in a non-peer-reviewed special issue of the Peabody Journal of Education, based on testimony provided in the state of Missouri and funded by the conservative Missouri-based Show-Me Institute.[8] CHL assert that if, as it would appear conceptually, the cost function is merely the flip side of the production function, then the magnitude of the spending-to-outcomes relationship should be identical between the two. But in Imazeki’s attempt to reconcile cost and production functions using California data, the spending levels implied for any given outcome differed dramatically depending on which function was used. CHL use this finding to assert the failure of cost functions as a method and, more generally, the uncertainty of the spending-to-outcomes relationship.

Duncombe and Yinger (2011), however, explain the fallacy of this falsification test, in a non-peer-reviewed special issue of the same journal.[9] They explain that while the cost and production functions are loosely flip sides of the same equation, they are not exactly such. Production models are estimated using some outcome measure as the dependent variable—that which is predicted by the equation. In an education production function studying the effect of spending on outcomes, the dependent variable is predicted as a function of (a) a measure of relevant per-pupil spending; (b) characteristics of the student population served; and (c) contextual factors that might affect the value of the dollar toward achieving outcomes (economies of scale, regional wage variation).

Outcomes = f(Spending, Students, Context)

The cost model starts out similarly, switching the position of the spending and outcomes measures, and predicting spending levels as a function of outcomes, students and context factors.

Spending = f(Outcomes, Students, Context)

If it were this simple, then one would expect the statistical relationship between outcomes and spending to be the same from one equation to the next. But there’s an additional piece to the cost function that adds important precision to the estimation of the input-to-outcome relationship. The equation above is merely a spending function, whereas the cost function attempts to distill “cost” from spending by addressing the share of spending that may be “inefficient.” That is:

Cost = Spending – Inefficiency, or Spending = Cost + Inefficiency

That is, some of the variation in spending does not lead to variation in the outcome measure. While we don’t know exactly what the inefficiency is (which dollars are being spent in ways that don’t improve outcomes), Duncombe and Yinger suggest that we do know some indirect predictors of the likelihood that school districts spend more than would be needed to minimally achieve current outcomes, and that one can include in the cost model district characteristics that explain a portion of that inefficient spending. This can be done when the spending measure is the dependent variable, as in the cost function, but not when spending is an independent variable, as in the production function.[10]

Spending = f(Outcomes, Students, Context, Inefficiency Factors)

When inefficiency factors are accounted for in the spending function, the relationship between outcomes and spending more accurately represents a relationship between outcomes and costs. This relationship would be expected to be different from the relationship between spending and outcomes (without addressing inefficiency) in a typical production function.
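This asymmetry can be made concrete with a small simulation. The sketch below is purely illustrative (the dollar figures, the “slack” proxy, and all variable names are invented assumptions, not estimates from any real district data): when spending is the dependent variable, an inefficiency proxy can be included as a regressor and the underlying cost per outcome point is recovered; when spending is flipped to the right-hand side, the same data imply a noticeably different spending-to-outcomes relationship.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000  # simulated districts; every number here is illustrative

# Each district's efficient "cost" is $120 per outcome point plus a
# student-need premium. Observed spending adds an inefficiency
# component tied to an observable proxy (here called "slack", a
# hypothetical stand-in for Duncombe and Yinger's inefficiency factors).
outcomes = rng.normal(50, 10, n)                 # outcome index
need = rng.normal(0, 1, n)                       # student need index
slack = np.clip(rng.normal(0, 1, n), 0, None)    # inefficiency proxy
cost = 8000 + 120 * outcomes + 900 * need
spending = cost + 600 * slack + rng.normal(0, 300, n)

def ols(y, *xs):
    """Ordinary least squares with an intercept; returns coefficients."""
    X = np.column_stack([np.ones(len(y)), *xs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Cost function: Spending = f(Outcomes, Students, Inefficiency factors).
# Because spending is the dependent variable, the inefficiency proxy
# can enter as a regressor, and the $-per-outcome-point slope is recovered.
b_cost = ols(spending, outcomes, need, slack)

# Production-style regression: Outcomes = f(Spending, Students).
# Spending is now a regressor, so its inefficient share cannot be
# partialed out, and the implied spending-to-outcomes slope differs.
b_prod = ols(outcomes, spending, need)

print(f"cost function, $ per outcome point:     {b_cost[1]:.0f}")      # close to 120
print(f"production function, implied $/point:   {1 / b_prod[1]:.0f}")  # noticeably higher
```

Under these assumptions, the cost-function regression recovers roughly the true $120 per outcome point, while the production-style regression implies a larger figure, mirroring the divergence Duncombe and Yinger describe.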

In Summary

In summary, while education cost function research is not designed to test specifically whether and to what extent money matters, the sizeable body of cost function literature does suggest that achieving higher educational outcomes, all else being equal, costs more than achieving lower educational outcomes. Further, achieving common educational outcome goals in settings with concentrated child poverty, children for whom English is a second language and children with disabilities costs more than achieving those same outcome goals with less needy student populations. Cost models provide some insights into how much more money is required in different settings and with different children to achieve measured outcome goals. Such estimates are of particular interest at a time when more and more states are migrating toward common standards frameworks and common assessments while still providing their schools and districts with vastly different resources. Cost modeling may provide insights into just how much more funding may be required for all children to have equal opportunity to achieve these common outcome goals.

Notes

[1]W. Duncombe and J. Yinger, “Financing Higher Student Performance Standards: The Case of New York State,” Economics of Education Review 19, no. 4 (2000): 363-386; A. Reschovsky and J. Imazeki, “Achieving Educational Adequacy through School Finance Reform,” Journal of Education Finance (2001): 373-396;
J. Imazeki and A. Reschovsky, “Is No Child Left Behind an Un (or Under) Funded Federal Mandate? Evidence from Texas,” National Tax Journal (2004): 571-588; J. Imazeki and A. Reschovsky, “Does No Child Left Behind Place a Fiscal Burden on States? Evidence from Texas,” Education Finance and Policy 1, no. 2 (2006): 217-246; and J. Imazeki and A. Reschovsky, “Assessing the Use of Econometric Analysis in Estimating the Costs of Meeting State Education Accountability Standards: Lessons from Texas,” Peabody Journal of Education 80, no. 3 (2005): 96-125.

[2]T. A. Downes and T. F. Pogue, “Adjusting School Aid Formulas for the Higher Cost of Educating Disadvantaged Students,” National Tax Journal (1994): 89-110; W. Duncombe and J. Yinger, “School Finance Reform: Aid Formulas and Equity Objectives,” National Tax Journal (1998): 239-262; W. Duncombe and J. Yinger, “Why Is It So Hard to Help Central City Schools?,” Journal of Policy Analysis and Management 16, no. 1 (1997): 85-113; and W. Duncombe and J. Yinger, “How Much More Does a Disadvantaged Student Cost?,” Economics of Education Review 24, no. 5 (2005): 513-532.

[3]For a discussion, see B. D. Baker, “The Emerging Shape of Educational Adequacy: From Theoretical Assumptions to Empirical Evidence,” Journal of Education Finance (2005): 259-287. See also M. Andrews, W. Duncombe and J. Yinger, “Revisiting Economies of Size in American Education: Are We Any Closer to a Consensus?,” Economics of Education Review 21, no. 3 (2002): 245-262; W. Duncombe, J. Miner and J. Ruggiero, “Potential Cost Savings from School District Consolidation: A Case Study of New York,” Economics of Education Review 14, no. 3 (1995): 265-284; J. Imazeki and A. Reschovsky, “Financing Adequate Education in Rural Settings,” Journal of Education Finance (2003): 137-156; and T. J. Gronberg, D. W. Jansen and L. L. Taylor, “The Impact of Facilities on the Cost of Education,” National Tax Journal 64, no. 1 (2011): 193-218.

[4]T. Downes, What Is Adequate? Operationalizing the Concept of Adequacy for New York State (2004), http://www.albany.edu/edfin/Downes%20EFRC%20Symp%2004%20Single.pdf.

[5]For a recent discussion, see B. Baker and K. G. Welner, “Evidence and Rigor: Scrutinizing the Rhetorical Embrace of Evidence-Based Decision Making,” Educational Researcher 41, no. 3 (2012): 98-101. See also B. D. Baker, Revisiting the Age-Old Question: Does Money Matter in Education? (Albert Shanker Institute, 2012).

[6]See, for example, B. D. Baker, “Exploring the Sensitivity of Education Costs to Racial Composition of Schools and Race-Neutral Alternative Measures: A Cost Function Application to Missouri,” Peabody Journal of Education 86, no. 1 (2011): 58-83.

[7]Completed and released in 2006, eventually published as J. Imazeki, “Assessing the Costs of Adequacy in California Public Schools: A Cost Function Approach,” Education 3, no. 1 (2008): 90-108.

[8]See the acknowledgements at http://files.eric.ed.gov/fulltext/ED508961.pdf. Final published version: R. Costrell, E. Hanushek and S. Loeb, “What Do Cost Functions Tell Us about the Cost of an Adequate Education?,” Peabody Journal of Education 83, no. 2 (2008): 198-223.

[9]W. Duncombe and J. Yinger, “Are Education Cost Functions Ready for Prime Time? An Examination of Their Validity and Reliability,” Peabody Journal of Education 86, no. 1 (2011): 28-57. See also W. Duncombe and J. M. Yinger, “A Comment on School District Level Production Functions Estimated Using Spending Data” (Maxwell School of Public Affairs, Syracuse University, 2007); and W. Duncombe and J. Yinger, “Making Do: State Constraints and Local Responses in California’s Education Finance System,” International Tax and Public Finance 18, no. 3 (2011): 337-368. For an alternative approach, see T. J. Gronberg, D. W. Jansen and L. L. Taylor, “The Adequacy of Educational Cost Functions: Lessons from Texas,” Peabody Journal of Education 86, no. 1 (2011): 3-27.

[10]W. Duncombe and J. Yinger, “Are Education Cost Functions Ready for Prime Time? An Examination of Their Validity and Reliability,” Peabody Journal of Education 86, no. 1 (2011): 28-57. See also W. Duncombe and J. M. Yinger, “A Comment on School District Level Production Functions Estimated Using Spending Data” (Maxwell School of Public Affairs, Syracuse University, 2007). For an alternative approach, see T. J. Gronberg, D. W. Jansen and L. L. Taylor, “The Adequacy of Educational Cost Functions: Lessons from Texas,” Peabody Journal of Education 86, no. 1 (2011): 3-27.

Pondering Chartering: Getting the incentives right for the good of the whole!

I had a fun chat with EduShyster the other day about my recent report on charter school business practices. During that conversation I articulated some of my major concerns about how we are currently approaching “chartering” as public policy and, for that matter, how academic researchers study chartering as public policy. Here are a few points that I think are key takeaways from my recent ramblings.

First, I discuss the fact that there are “better” and “worse” actors in the present system. But a major problem is that there’s little pressure for anyone to do anything about the “worse” actors (or “bad apples,” as EduShyster called them). I explained:

It’s to the benefit of the good guys to have the bad guys there because it makes them look better. When you’re KIPP, you look that much better when White Hat does something awful.

Further, because we (including policy researchers) are obsessed with what I refer to as “pissing match” studies of whether charter schools on average “outperform” matched district schools, or the schools attended by “lotteried out” kids, it’s in the interest of charter operators to gain every edge they can over the “competition” (or the “comparison” group, or “counterfactual”). In other words, it’s NOT in their interest to support strengthening the “competition.” I explained:

It’s just like the way that they continually argue for boosting their own subsidy, even if they know full well it’s at the expense of the district.

The problem is that there’s no incentive under the current policy structure for them to want the district schools to do better. And there’s every incentive for them not to. That’s what’s wrong with this system. Even when they’re good folks and trying to do a good thing, there’s still that undercurrent.

It’s time for all of us to rethink how we frame this conversation to get the incentives right!


Pondering Chartering: What do we know about administrative and instructional spending?

In a recent report, Gary Miron and I discuss some of the differences in resource allocation practices between charter operators and district schools. Among other things, we discuss the apparently high administrative expenses of charter operators. But in that same report, we explain that some of these higher administrative expenses, and, as a result, lower instructional expenses, result from bad policy structures that constrain resource allocation and/or induce seemingly illogical behaviors.

Some have pointed out to me that this assertion of higher administrative and lower instructional expense by charter operators runs counter to claims made by Dale Russakoff in her book The Prize. My doctoral student Mark Weber has already thoroughly rebutted Russakoff’s anecdotal claims. Put bluntly: those claims were supported only by anecdote and run contrary to the larger body of data in New Jersey (see Mark’s post) and the larger literature on the topic. The summary below addresses additional literature on this topic.

[To be clear… and this is a topic for another post, or perhaps Matt Barnum will do a piece on this… there is little if any evidence that administrative expense shares alone are an indicator of “inefficiency,” where inefficiency is defined as a reduction in outcomes produced for the same aggregate dollar input.]

In a related recent post, I explore whether “chartering” can tell us much, if anything, about whether and how money (and the resources that cost money) is associated with measured student outcomes.

Below is a section of a separate, forthcoming paper (coauthored with Mark Weber), in which we evaluate school site staffing expenditure differences between district, non-profit and for-profit charter operators.

Charter School Administrative/Instruction Expense

A handful of studies over time have addressed questions similar to those we address herein, asking more specifically about the differences in administrative overhead expenditures of charter schools. Two studies of Michigan charter schools, which operate fiscally independently of local public districts, have found them to have particularly high administrative expenses and low direct instructional expenses. Arsen and Ni (2012) found that “Controlling for factors that could affect resource allocation patterns between school types, we find that charter schools on average spend $774 more per pupil per year on administration and $1141 less on instruction than traditional public schools.” (p. 1) Further, they found “charter schools managed by EMOs spend significantly more on administration than self-managed charters (about $312 per pupil). This higher spending occurs in administrative functions traditionally performed at both the district central office and school building levels.” (p. 13)

Izraeli and Murphy (2012) found that district schools in Michigan tended to spend more on instruction per student than did charter schools, with the gap growing from about 5 percent to nearly 35 percent over the period studied (1995-96 to 2005-06) (p. 265). Further, they found the gap in instructional spending to be greater than that in general spending: the overall funding gap between district and charter schools was approximately $230 per pupil, while the spending gap for basic programs was $562 and for total instruction $910. The authors note that, “much like a profit-maximizing firm, charter schools generate a surplus of revenue over expenditure.” (Izraeli & Murphy, 2012, p. 265)

Bifulco and Reback (2014) explore the complex relationship between fiscally dependent charter schools and their host districts in upstate New York cities. Particularly relevant to our investigation is Bifulco and Reback’s finding that having fiscally dependent charter schools separately affiliated with outside management companies and governance structures can create excess, redundant costs (p. 86).

Others have explored teacher compensation in relation to instructional expense in charter schools. In a recent comprehensive review of charter school research, Epple, Romano and Zimmer (2015) summarize that “On the whole, teachers in charter schools are less experienced, are less credentialed, are less white, and have fewer advanced degrees. They are paid less, their jobs are less secure, and they turnover with higher frequency.” (Epple, Romano & Zimmer, 2015) Similarly, in a report on the spending behavior of Texas charter schools, Taylor and colleagues (2011) explain that much of the difference between instructional and non-instructional expense across differing types of charter and district schools is tied to differences in teacher compensation. The authors explain that “open-enrollment charter schools paid lower salaries, on average, than did traditional public school districts. Average teacher pay was 12% lower for teachers in open-enrollment charter schools than for teachers in traditional public school districts of comparable size, and adjusted for differences in local wage levels, average teacher pay was 24% lower. Average teacher salaries were lower not only because open-enrollment charter schools hired less experienced teachers, on average, but also because open-enrollment charter schools paid a smaller premium for additional years of teacher experience.” (p. ix)

Research by Gronberg, Taylor and Jansen (2012) also points to the revenue enhancement activities of some charter management companies, most notably KIPP schools. The authors find that some KIPP schools in Texas had nearly doubled their per pupil public subsidy through private philanthropy. Baker and Ferris (2011) and Baker, Libby and Wiley (2012, 2015) find similarly that some Charter Management Organizations have significant potential for revenue enhancement. Baker, Libby and Wiley (2012) explain “We find that in New York City, KIPP, Achievement First and Uncommon Schools charter schools spend substantially more ($2,000 to $4,300 per pupil) than similar district schools. Given that the average spending per pupil was around $12,000 to $14,000 citywide, a nearly $4,000 difference in spending amounts to an increase of some 30%.” But, while some New York City based CMOs raised substantial private funding, others did not, and charter schools operating in other locations in Ohio and Texas had much less access to philanthropy.

Relative Efficiency & Underlying Differences

Of particular interest herein are studies of the relative effectiveness or efficiency of charter schools operated by for-profit management companies, including operators of online schools. Rigorous, peer-reviewed literature on these schools remains limited, and much of it is dated, evaluating charter expansion from the late 1990s through the mid-2000s. King (2007) evaluated the effectiveness of Arizona charter schools, where significant numbers of for-profit firms operate, finding, based on data from 2003-2004, that “there is some evidence that for-profit charter schools are achieving higher test scores, however, given the insignificant findings for many of the for-profit specifications, a definite conclusion cannot be reached based on this one study alone.” (King, 2007, p. 744) However, in a broader, more recent and more empirically rigorous analysis of Arizona charter schools as a whole, Chingos and West (2015) found that “the performance of charter schools in Arizona in improving student achievement varies widely, and more so than that of traditional public schools. On average, charter schools at every grade level have been modestly less effective than traditional public schools in raising student achievement in some subjects.” (p. 120S)

Studies of Michigan charter schools, another state we identify as having significant shares of children enrolled in for-profit schools, have also yielded mixed findings over time regarding effectiveness and relative efficiency. Bettinger (2005) found that during the early years of Michigan charter schools, “test scores of charter school students do not improve, and may actually decline, relative to those of public school students.” (p. 133) Hill and Welsch (2009) found “no evidence of a change in efficiency when a charter school is run by a for-profit company (versus a not-for-profit company).” (p. 147) They explain further: “The results of this paper find no evidence that schools managed by for-profit companies deliver education services less efficiently than schools run by not-for-profit companies; this matches recent results found by Sass (2006).” (p. 164) That is, the shift from nonprofit to for-profit management status caused no systematic harm to measured student outcomes. Sass, in an early study of Florida charter schools by their management status, had also found no significant performance differences between schools managed by nonprofit and for-profit providers, but had found that for-profit providers serve substantively fewer children with disabilities. (p. 91)

Perhaps the strongest evidence of charter school efficiency advantages comes from the work of Gronberg, Taylor and Jansen (2012) on Texas charter schools. The authors find that, generally, Texas “charter schools are able to produce educational outcomes at lower cost than traditional public schools—probably because they face fewer regulations—but are not systematically more efficient relative to their frontier than are traditional public schools.” (p. 302) In other words, while the overall cost of charter schools is lower for comparable output, the variations in relative efficiency among Texas charter schools are substantial. Efficiency is neither uniformly nor consistently achieved. As explained above, evidence from related work by these authors reveals that the lower overall expenses are largely a function of lower salaries and inexperienced staff (Taylor et al., 2011). Thus, maintaining efficiency may require ongoing reliance on inexperienced staff.

Frequently cited studies touting the relative effectiveness of charter schools operated by major Charter Management Organizations, including Lake et al. (2010) and Dobbie and Fryer (2011), have typically measured the resources available in these schools poorly or not at all – schools which Baker, Libby and Wiley (2012, 2015) and Gronberg, Taylor and Jansen (2012) identify as often spending substantially more than nearby district schools. Baker, Libby and Wiley (2015) and others (Preston et al., 2012) explain that most charter schools, and large CMO charter schools in particular, operate under a human-resource-intensive model similar to that of traditional district schools. Specifically, well-endowed CMOs allocate their additional resources to competitive wages (higher than expected for relatively inexperienced teachers), small classes, and longer days and years (Baker, Libby and Wiley, 2012).

Other charter school operators have attempted to reduce direct per-pupil instructional costs substantially through online and hybrid learning. This approach perhaps provides the greatest opportunity to maximize profit margins, as it presents the greatest opportunity to cut staffing costs. But as Epple, Romano and Zimmer (2015) explain regarding student outcomes, “online ‘cyber’ schools appear to be a failed innovation, delivering markedly poorer achievement outcomes than TPSs.” (p. 55)

Pulling it All Together

To summarize, based on limited analyses of the resource allocation behaviors of charter schools, we have evidence that charter schools generally tend to divert more money from the classroom to administration. Classroom expenditures are reduced in part, if not mainly, by lowering total teacher salary expenses through reliance on relatively inexperienced teachers and high turnover rates. EMO-operated charter schools tend to have even greater administrative expense, and charter schools operating within districts may create redundant administrative expenses. That said, there is limited evidence that charter schools generally, or those operated by EMOs and CMOs in particular, are less efficient as a result of increased administrative expense, and there is some evidence of efficiency advantages for charters over district schools (in Texas) due to reduced staffing expenditures. Generally, we have little evidence of systematic differences between nonprofit- and for-profit-operated charter schools, but we do have some evidence that high-profile nonprofit providers engage in substantial revenue enhancement. Finally, we have increasingly clear evidence that online and cyber charter schools lag in performance outcomes, as well as evidence that charter schools in states including Ohio and Arizona perform particularly poorly.

References

Andrews, M., Duncombe, W., & Yinger, J. (2002). Revisiting economies of size in American education: are we any closer to a consensus?. Economics of Education Review, 21(3), 245-262.

Arsen, D. D., & Ni, Y. (2012). Is administration leaner in charter schools? Resource allocation in charter and traditional public schools. Education Policy Analysis Archives, 20(31).

Baker, B.D. & Bathon, J. (2012). Financing Online Education and Virtual Schooling: A Guide for Policymakers and Advocates. Boulder, CO: National Education Policy Center. Retrieved 7/14/15 from http://nepc.colorado.edu/publication/financing-online-education

Baker, B. D., & Elmer, D. R. (2009). The politics of off-the-shelf school finance reform. Educational Policy, 23(1), 66-105.

Baker, B. D., & Ferris, R. (2011). Adding up the Spending: Fiscal Disparities and Philanthropy among New York City Charter Schools. National Education Policy Center.

Baker, B. D., Libby, K., & Wiley, K. (2015). Charter School Expansion and Within-District Equity: Confluence or Conflict? Education Finance and Policy.

Baker, B. D., Libby, K., & Wiley, K. (2012). Spending by the Major Charter Management Organizations: Comparing Charter School and Local Public District Financial Resources in New York, Ohio, and Texas. National Education Policy Center.

Bettinger, E. P. (2005). The effect of charter schools on charter students and public schools. Economics of Education Review, 24(2), 133-147.

Bifulco, R., & Reback, R. (2014). Fiscal Impacts of Charter Schools: Lessons from New York. Education Finance & Policy, 9(1), 86-107.

Bitterman, A., Gray, L., and Goldring, R. (2013). Characteristics of Public and Private Elementary and Secondary Schools in the United States: Results From the 2011–12 Schools and Staffing Survey (NCES 2013–312). U.S. Department of Education. Washington, DC: National Center for Education Statistics. Retrieved 7/14/15 from https://nces.ed.gov/pubs2013/2013312.pdf

Bulkley, K. E., & Burch, P. (2011). The changing nature of private engagement in public education: For-profit and nonprofit organizations and educational reform. Peabody Journal of Education, 86(3), 236-251.

Center for Research on Education Outcomes (CREDO) (2013, June). National Charter School Study. Palo Alto: CREDO, Stanford University. Retrieved July 10, 2013, from http://credo.stanford.edu/research-reports.html

Chingos, M. M., & West, M. R. (2015). The Uneven Performance of Arizona’s Charter Schools. Educational Evaluation and Policy Analysis, 37(1 suppl), 120S-134S.

Dobbie, W., & Fryer Jr, R. G. (2011). Getting beneath the veil of effective schools: Evidence from New York City (No. w17632). National Bureau of Economic Research.

Duncombe, W., & Yinger, J. (2008). Measurement of cost differentials. Handbook of research in education finance and policy, 238-256.

Education Trust-Midwest (2015) Accountability for All: The need for real charter school authorizer accountability in Michigan. http://www.crainsdetroit.com/assets/PDF/CD98381219.PDF

Epple, D., Romano, R., & Zimmer, R. (2015). Charter Schools: A Survey of Research on Their Characteristics and Effectiveness (No. w21256). National Bureau of Economic Research.

Gronberg, T. J., Jansen, D. W., & Taylor, L. L. (2012). The relative efficiency of charter schools: A cost frontier approach. Economics of Education Review, 31(2), 302-317.

Hill, C. D., & Welsch, D. M. (2009). For‐profit versus not‐for‐profit charter schools: an examination of Michigan student test scores. Education Economics, 17(2), 147-166.

Izraeli, O., & Murphy, K. (2012). An Analysis of Michigan Charter Schools: Enrollment, Revenues, and Expenditures. Journal of Education Finance, 37(3), 234-266.

Kena, G., Musu-Gillette, L., Robinson, J., Wang, X., Rathbun, A., Zhang, J., Wilkinson-Flicker, S., Barmer, A., and Dunlop Velez, E. (2015). The Condition of Education 2015 (NCES 2015-144); p.85. U.S. Department of Education, National Center for Education Statistics. Washington, DC. Retrieved 7/14/15 from http://nces.ed.gov/pubs2015/2015144.pdf

King, K. A. (2007). Charter Schools in Arizona: Does Being a For-Profit Institution Make a Difference? Journal of Economic Issues, 729-746.

Lake, R., Dusseault, B., Bowen, M., Demeritt, A., & Hill, P. (2010). The National Study of Charter Management Organization (CMO) Effectiveness. Report on Interim Findings. Center on Reinventing Public Education.

Maul, A., & McClelland, A. (2013). Review of National Charter School Study 2013. Boulder, CO: National Education Policy Center. Retrieved September 2, 2014.

Maul, A. (2013). Review of “Charter School Performance in Michigan.” Boulder, CO: National Education Policy Center. Retrieved July 10, 2013.

Miron, G., & Gulosino, C. (2013). Profiles of for-profit and nonprofit education management organizations: Fourteenth Edition—2011-2012. Boulder, CO: National Education Policy Center.

Molnar, A., Huerta, L., Rice, J. K., Shafer, S. R., Barbour, M. K., Miron, G., … & Horvitz, B. (2014). Virtual Schools in the US 2014: Politics, Performance, Policy, and Research Evidence.

Morley, J. (2006). For-profit and nonprofit charter schools: An agency costs approach. The Yale Law Journal, 1782-1821.

Preston, C., Goldring, E., Berends, M., & Cannata, M. (2012). School innovation in district context: Comparing traditional public schools and charter schools. Economics of Education Review, 31, 318–330.

Richards, C. E. (1996). Risky Business: Private Management of Public Schools. Economic Policy Institute, 1660 L Street, NW, Suite 1200, Washington, DC 20036.

Sass, T. R. (2006). Charter schools and student achievement in Florida. Education Finance and Policy, 1(1), 91-122.

Taylor, L.L., and Fowler, W.J., Jr. (2006). A Comparable Wage Approach to Geographic Cost Adjustment (NCES 2006-321). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

Taylor, L.L., Alford, B.L., Rollins, K.G., Brown, D.B., Stillisano, J.R., & Waxman, H.C. (2011). Evaluation of Texas Charter Schools 2009-2010 (Revised Draft). Texas Education Research Center, Texas A&M University, College Station.

Zimmer, R., Gill, B., Booker, K., Lavertu, S., & Witte, J. (2012). Examining charter student achievement effects across seven states. Economics of Education Review, 31(2), 213-224.

 

Picture Post Week: Subprime Chartering

A short while back, I explained how, in our fervor to rapidly expand charter schooling and shrink the role of large urban school districts in serving their resident school-aged populations, we’ve created some particularly ludicrous scenarios: for example, charter school operators use public tax dollars to buy land and facilities that were originally purchased with other public dollars… and at the end of it all, the assets are in private hands! Even more ludicrous, the second purchase incurs numerous fees and administrative expenses, and the debt associated with it likely carries a relatively high interest rate because – well – revenue bonds repaid by charter school lease payments are risky. Or so the rating agencies say.

So how much of this debt is accumulating? And when does it come due? Who is issuing this debt? Are we looking at a charter school subprime bubble? Here are some snapshots:

Slide1

Most revenue bond debt incurred on behalf of charter schools is either unrated, or BBB- or BB+ rated. The unrated debt is saddled, on average, with coupon rates around 6.9% in recent years, marginally higher than rates attached to BBB- or BB+ bonds.
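To put that rate spread in concrete terms, here is a back-of-the-envelope sketch. All figures besides the ~6.9% coupon rate are assumptions for illustration – the issue size and the comparison investment-grade rate are not drawn from the data above:

```python
# Back-of-the-envelope illustration with hypothetical figures: annual coupon
# payments on an assumed $10 million charter facility revenue bond at the
# ~6.9% rate reported above for unrated issues, vs. an assumed lower
# investment-grade rate.
def annual_coupon(principal: float, rate: float) -> float:
    """Annual interest owed on the outstanding principal at a given coupon rate."""
    return principal * rate

principal = 10_000_000                       # hypothetical issue size
unrated = annual_coupon(principal, 0.069)    # unrated coupon per the post
rated = annual_coupon(principal, 0.055)      # assumed investment-grade coupon
extra_per_year = unrated - rated             # added annual debt service from the spread
```

Even a spread of a point or two, compounded over a 30-year facility bond, diverts a nontrivial share of per-pupil revenue from classrooms to debt service.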

Slide2

Slide3

Slide4

The Pima County Industrial Development Authority in Arizona has been particularly active in recent years! I’m still trying to figure this one out.

So, are we at risk of a subprime chartering collapse?

What will happen to all of this debt if some of the bigger charter chains go belly up? What if they can’t make their (at times exorbitant) lease payments?

Have we let the charter industry get “too big to fail?” [certainly by comparison, this is a tiny bubble, but it’s really just getting started]

And when and how will that bail out occur? [and who will own those facilities when the dust settles?]

And just remember who’s running charter schools in the states where the debt is accumulating the fastest!

 

 

Picture Post Week: Increased Standards & Student Needs, But Shrinking Resources!

As I explained in a post a while back:

In short, the “cost” of education rises as a function of at least 3 major factors:

  1. Changes in the incoming student populations over time
  2. Changes in the desired outcomes for those students, including more rigorous core content area goals or increased breadth of outcome goals
  3. Changes in the competitive wage of the desired quality of school personnel

And the interaction of all three of these! For example, changing student populations may make teaching more difficult (a working condition), meaning that a higher wage might be required simply to offset this change. Increasing the complexity of outcome goals might require a more skilled teaching workforce, in turn requiring higher wages.

So how well have we been addressing the increased costs associated with both our increasingly needy student populations, and our desire for higher outcome standards?

Slide1

Slide2

Not so well, I guess!

Picture Post Week: Who’s granting all of those education degrees?

This post is an update to a series of earlier posts in which I summarized the production of education degrees over time. As policymakers continue their critiques of the supposed decline in the quality of teacher preparation, as if teacher and leader preparation has been static since the 1950s, it’s worth again looking at trends of the last 20+ years to see just what has changed.  The following graphs summarize undergraduate and graduate degree production classified by a) undergraduate institution selectivity as reported in Barron’s guides and b) institutional classifications from the 1994 Carnegie Classification system, which was more hierarchical (read: elitist) than later versions.

Slide1

Slide2

Slide4

Slide6

Slide7

Slide8

Slide9

Slide10

 

Related Research

Baker, B.D., Orr, M.T., & Young, M.D. (2007). Academic Drift, Institutional Production and Professional Distribution of Graduate Degrees in Educational Administration. Educational Administration Quarterly, 43(3), 279-318.

Baker, B.D., & Fuller, E. The Declining Academic Quality of School Principals and Why It May Matter. Baker.Fuller.PrincipalQuality.Mo.Wi_Jan7

Baker, B.D., Wolf-Wendel, L.E., & Twombly, S.B. (2007). Exploring the Faculty Pipeline in Educational Administration: Evidence from the Survey of Earned Doctorates 1990 to 2000. Educational Administration Quarterly, 43(2), 189-220.

Wolf-Wendel, L., Baker, B.D., Twombly, S., Tollefson, N., & Mahlios, M. (2006). Who’s Teaching the Teachers? Evidence from the National Survey of Postsecondary Faculty and Survey of Earned Doctorates. American Journal of Education, 112(2), 273-300.

1994 Carnegie Classifications

  • Research Universities I: These institutions offer a full range of baccalaureate programs, are committed to graduate education through the doctorate, and give high priority to research. They award 50 or more doctoral degrees each year. In addition, they receive annually $40 million or more in federal support.
  • Research Universities II: These institutions offer a full range of baccalaureate programs, are committed to graduate education through the doctorate, and give high priority to research. They award 50 or more doctoral degrees each year. In addition, they receive annually between $15.5 million and $40 million in federal support.
  • Doctoral Universities I: These institutions offer a full range of baccalaureate programs and are committed to graduate education through the doctorate. They award at least 40 doctoral degrees annually in five or more disciplines.
  • Doctoral Universities II: These institutions offer a full range of baccalaureate programs and are committed to graduate education through the doctorate. They award annually at least ten doctoral degrees—in three or more disciplines—or 20 or more doctoral degrees in one or more disciplines.
  • Master’s (Comprehensive) Universities and Colleges I: These institutions offer a full range of baccalaureate programs and are committed to graduate education through the master’s degree. They award 40 or more master’s degrees annually in three or more disciplines. [Includes typical regional, within-state public normal schools/teachers colleges]
  • Master’s (Comprehensive) Universities and Colleges II: These institutions offer a full range of baccalaureate programs and are committed to graduate education through the master’s degree. They award 20 or more master’s degrees annually in one or more disciplines.
  • Baccalaureate (Liberal Arts) Colleges I: These institutions are primarily undergraduate colleges with major emphasis on baccalaureate degree programs. They award 40 percent or more of their baccalaureate degrees in liberal arts fields and are restrictive in admissions.
  • Baccalaureate Colleges II: These institutions are primarily undergraduate colleges with major emphasis on baccalaureate degree programs. They award less than 40 percent of their baccalaureate degrees in liberal arts fields or are less restrictive in admissions. [Includes many cash-strapped, relatively non-selective, smaller private liberal arts colleges]

Picture Post Week: Follow up on who’s running America’s charter schools

This post is a follow up on my previous post where I discussed which charter school operators are actually leading the nation in charter school enrollments. Here are some slides breaking out charter school enrollments by operator/manager for a handful of states. These slides are made possible by my meticulous graduate student Mark Weber, who spent hours aligning operator classifications and school links first presented by Gary Miron and colleagues, and merging those classifications to the 2011-12 National Center for Education Statistics Common Core of Data and Civil Rights Data Collection.

The data are likely imperfect in many ways. For one, it’s not always easy to figure out exactly who’s managing which school. In addition, charter school enrollments have continued to expand rapidly since this time. But we have little reason to believe, for example, that the distribution of operators within the charter sector has shifted dramatically. Bottom line – we should have better, officially gathered (USDOE/NCES, SEAs) data on such things. For now, we don’t.

In later posts I will, time permitting, spend a bit more time discussing some of the operators I’ve highlighted in red (w/yellow font) on these slides. My previous post includes some links, and to some readers these names will be familiar.

Slide1

Slide3

Slide4

Slide5

Slide6

Slide7

Slide8

Slide9

Slide10

From NJ Ed Policy Forum: On Average, Are Children in Newark Doing Better?

Bruce D. Baker & Mark Weber, Rutgers University, Graduate School of Education

November 16, 2015

Baker.Weber.NewarkBetterOff.NJEPF.11_15_15

In this research note, we estimate a series of models using publicly available school level data to address the following question:

Q: Did students in Newark (combined district and charter) make gains on statewide averages (non-Newark) on state assessments, controlling for demographics?

Specifically, we evaluate changes in mean scale scores on state assessments (NJASK) for language arts and math grades 6 to 8.

Newark Reforms Since 2009

Schools in the city of Newark have undergone a series of disruptive reforms since 2009, including substantial increases in the numbers of children served in charter schools, adoption of a unified enrollment system, ratification of a performance based teacher contract, and school closures, reconstitutions and reorganization.[i] Some of these reforms were instituted following the much publicized gift of Facebook founder Mark Zuckerberg, chronicled in Dale Russakoff’s The Prize.[ii]

A commonly asked question in the aftermath of these disruptions is whether students in Newark on the whole are better off than they were before these reforms. That is, were the disruptions and resulting political turmoil worth it? Some have chosen to speculate, based largely on anecdotal evidence, that children in Newark must be better off today than before these disruptive reforms.

Chris Cerf, former NJ Commissioner of Education and current State Superintendent of Schools for the Newark Public Schools, asserts that the past few years have brought significant positive changes for Newark’s schools:

“Whether the measure is graduation rates, improved instructional quality, last year’s improvement in the lowest-performing schools targeted for special intervention, a nation-leading new collective-bargaining agreement, the addition of many new high-quality public schools, increased parental choice, or a material increase in the proportion of effective teachers, the arrow is pointed decidedly up in Newark.

“To be sure, as is always the case, the evidence of improvement is textured and in some respects uneven. The many positive indicators and trend lines, however, paint a picture of hope and progress that is completely at odds with the pessimism that has made its way into the standard storyline.”[iii]

Tom Moran, Editorial Page Editor of the Star-Ledger and a consistent supporter of the Newark reforms, writes: “The growth of charters has not damaged the kids in the traditional system. In fact, they’ve made modest improvements.”[iv] In a post on his Facebook page, Mark Zuckerberg, whose $100 million gift was the catalyst for the NPS reforms, writes: “No effort like this is ever going to be without challenges, mistakes and honest differences among people with good intentions. We welcome a full analysis and debate of lessons learned. But it is important that we not overlook the positive results.”[v] The chief-of-staff for Cory Booker, former mayor of Newark and current U.S. senator who was instrumental in securing Zuckerberg’s donation, states: “Newark students are quite simply better off now than they were five years ago.”[vi]

In these conversations, “better off” is often reduced to whether or not, on average, across district and charter schools, student test scores for children in Newark have improved. That is, are students achieving more than they otherwise would have, had there been no such disruptions? It remains far too soon to measure longer term outcomes, including graduation rates, college attendance or economic outcomes.

While we are unable to compare against what might have been in the absence of reforms, we can at least evaluate whether children in Newark have made progress when compared to statewide averages, controlling for student population characteristics.

Data

To make these comparisons we use a school level data set including measures from 2009 to 2014, most of which are publicly available – downloadable from the New Jersey Department of Education web site:

  • Mean scale scores by subject and year[vii]
  • Shares of low income (% free lunch) children and ELL[viii]
  • City of school location[ix]

Web-based data do not include school level shares of children classified as having disabilities. We have obtained those data via request. Because of the volatility of year to year school level measures of disability populations, we have used three year averages in our analyses (for this measure only).

Among the most significant changes over time in the city of Newark has been the expansion of numbers of children served in the city’s charter schools, and adoption of a unified, citywide enrollment system for assigning children to charters.

Figure 1 shows the shares of valid scale scores for charter schools (as a percent of citywide valid scale scores) on state assessments. For grades 6 and 7, the share of valid test scores (for data included in our analyses) in Newark that are for children attending charter schools rises from around 18% to 25% or more from 2009 to 2014. The share of valid test scores for 8th grade assessments also rises, but somewhat less.

Figure 1

Slide1

Statewide, the share of valid test scores coming from children attending charter schools is much lower, rising to just over 2%.

 

Figure 2

Slide2

Models

Assessing whether or not mean scores in Newark as a whole are rising faster or slower than mean scores for schools statewide other than Newark is relatively straightforward statistically (whether meaningful or not is another question entirely). To address this question, we estimate the following model:

ScaleScore_st = b0 + b1(Year_t) + b2(Newark_s) + b3(Year_t × Newark_s) + b4(LowIncome_st) + b5(ELL_st) + b6(Disability_s) + e_st

Where the scale score for school “s” at time “t” is the dependent variable, and where we run separate models for each scale score. For each school statewide, we include measures of the share of children who qualify for free lunch under the National School Lunch Program, the share of children who are limited in their English language proficiency, and the share of children classified as having disabilities. As such, we are comparing changes in the relative position of Newark children to non-Newark children of similar demographics.

Scale scores for any subject and grade level tend to drift over time. As such, we include a set of “year” dummy variables which pick up that drift (more later). We also include a “Newark” dummy variable assigned to every district and charter school in Newark. The coefficient on this dummy variable will tell us whether the average score for children in schools in Newark is different from the average scale score for children not in Newark, controlling for demographics.

Of particular interest here is the interaction term “Year x Newark.” The coefficients on this term will tell us whether, for each year, the scores of children in Newark have closed the gap (relative to the baseline year of 2009) when compared with children not in Newark, controlling for demographics.

Grade level differences are accounted for by modeling each subject by grade level assessment separately. We focus on grades 6 through 8 (rather than 3 through 5) to capture cumulative effects of schooling.

All models are weighted for school counts of valid test takers. Models are estimated with robust standard errors to account for the fact that repeated measures on schools over time are not independent.[x]
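For concreteness, a model of this form could be fit roughly as follows. This is a hypothetical sketch in Python with statsmodels (the original analysis appears to have been run in Stata, per note [x]), and every column name (`scale_score`, `newark`, `n_valid_tests`, `school_id`, and so on) is an illustrative assumption, not the authors' actual variable naming:

```python
# Hypothetical sketch of the school-level model described above: scale scores
# regressed on year dummies, a Newark indicator, Year x Newark interactions,
# and demographic controls, weighted by counts of valid test takers, with
# cluster-robust standard errors by school. Column names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

def fit_newark_model(df: pd.DataFrame):
    # C(year) * newark expands to year dummies (2009 baseline), the Newark
    # main effect, and the Year x Newark interaction terms of interest.
    model = smf.wls(
        "scale_score ~ C(year) * newark"
        " + pct_free_lunch + pct_ell + pct_disability",
        data=df,
        weights=df["n_valid_tests"],  # weight by valid test takers per school
    )
    # Cluster by school: repeated observations on a school are not independent.
    return model.fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})
```

One such model would be estimated separately for each subject-by-grade assessment (e.g., Math 6, LAL 8), mirroring the note's approach of modeling each outcome on its own.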

Findings

Table 1 shows the results of the regression models. Starting with demographic effects, the models show us that as we move from a school with 0% to 100% special education students, the average 6th grade language arts scores drop by over 50 points. More realistically, as we move from about 10% to about 20% children with disabilities, average scale scores drop by about 5 points. Similarly, as we move from 0% low income to 100% low income, mean scale scores in 6th grade language arts are about 42 points lower. Demographic disparities in math tend to be greater than in language arts on these assessments.

Table 1

Slide3

Turning to the coefficients of interest: First we see that the “year” dummy variables have many significant coefficients. These represent average score differences from the baseline year of 2009. For example, for 6th grade math, we see that in 2010, average scores were 1.42 points higher than in 2009. In 2011, they were 6.31 points higher than in 2009. The margins grow through 2014. These coefficients represent a strong upward drift in Math 6 scores from 2009 to 2014. It is that drift against which changes in Newark scores must be measured. Math scores in general show stronger upward drift than language arts within the New Jersey data.

The “Newark” coefficients show that, on average, after accounting for demographics, scale scores for Newark are a few points higher than statewide, but the difference is statistically significant only for Language Arts in 8th grade.

Now for the interaction term: it tells us whether Newark schools on the whole (charters and district combined, where ¼ of scores are contributed by charters at the end of the period) are gaining on, losing ground to, or staying in the same relative position as other schools statewide. Since Newark schools on average are ahead of schools statewide (controlling for demographics), the question is whether they open up that lead, hold it, or lose it.

A quick summary of the coefficients in Table 1, focusing on the end of the period, tells us that by 2013 and 2014, Newark schools had gained no ground on schools statewide – period. Statistically, in terms of these measured outcomes, children in Newark in the aggregate are no better off, relative to peers statewide, than they were in 2009. They are also no worse off.

The figures that follow illustrate the changes in Newark scores compared to statewide scores for each assessment over time. To generate these figures we use the coefficients in Table 1 to calculate predicted values of the scale scores for Newark and non-Newark schools, holding other measures constant. For example, we set all demographics to 0, such that the projected scale scores represent the scale scores of Newark and non-Newark schools at 0% low income, 0% ELL and 0% special education. This is why the average scale scores appear high. But all that’s relevant here is the relative position of Newark and non-Newark scores. We could have shifted everything downward by setting all demographics to 100% (100% low income, 100% ELL and 100% disability). That would simply have lowered the level of the lines on all graphs while keeping their trends and relative positions the same.
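The predicted-value construction just described can be sketched as follows. The coefficient values here are made up purely for illustration (the actual estimates are in Table 1); with all demographic shares set to 0, the demographic terms drop out and the prediction reduces to the intercept plus the relevant dummies:

```python
# Sketch of the predicted-value construction behind the figures, using
# hypothetical coefficient values. With demographics set to 0%, a school's
# predicted score is: intercept + year effect + Newark effect + interaction.
def predicted_score(coefs: dict, year: int, newark: bool) -> float:
    yhat = coefs["intercept"]
    yhat += coefs["year"].get(year, 0.0)  # year dummy (0 for baseline 2009)
    if newark:
        yhat += coefs["newark"]
        yhat += coefs["year_x_newark"].get(year, 0.0)
    return yhat

# Hypothetical coefficients, loosely in the spirit of Table 1:
coefs = {
    "intercept": 210.0,
    "newark": 3.0,                              # Newark main effect
    "year": {2010: 1.4, 2011: 6.3, 2014: 9.0},  # statewide drift vs. 2009
    "year_x_newark": {2011: 4.0, 2014: 0.0},    # gap opened in 2011, gone by 2014
}
gap_2011 = predicted_score(coefs, 2011, True) - predicted_score(coefs, 2011, False)
gap_2014 = predicted_score(coefs, 2014, True) - predicted_score(coefs, 2014, False)
```

With these illustrative numbers, the Newark/non-Newark gap widens in 2011 (main effect plus interaction) and falls back to the baseline Newark effect by 2014, which is the pattern the figures below display.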

Each figure shows that scores of Newark children did jump in 2011, in some cases opening up a statistically significant gap with non-Newark children. In some cases the gap widened as much, if not more, because of a drop in the scores of non-Newark children. 8th grade language arts is the only assessment addressed here that does not show the 2011 change.

But the apparent change in 2011 immediately disappears: by the end of the period – by 2014 – Newark students’ performance no longer differs from that of students statewide, and Newark students have made no net gains relative to students statewide. On most assessments, from 2011 to 2014, Newark scores flat-line or even drop.

Figure 3

Slide4

Figure 4

Slide5

Figure 5

Slide6

Figure 6

Slide7

Figure 7

Slide8

Figure 8

Slide9

Conclusions

In a recent interview, Russakoff stated that she did not believe Newark’s students are better off today than they were five years ago: “…it feels like a wash.”[xi] The analysis herein, while admittedly narrow in scope and short in time frame, finds that Russakoff is correct. Average state assessment scores in grades 6, 7 and 8 are essentially right where they were – relative to non-Newark students – in 2009.

Follow-up analyses are certainly warranted, but limited by changes in state outcome measures.

NOTES

[i] Weber, M. (2015) Empirical Critique of One Newark: First year update. New Jersey Education Policy Forum. https://njedpolicy.wordpress.com/2015/03/12/empirical-critique-of-one-newark-first-year-update/

[ii] Russakoff, Dale (2015) The Prize: Who’s in charge of America’s schools? New York: Houghton Mifflin Harcourt

[iii] http://www.njspotlight.com/stories/14/12/01/op-ed-dispelling-five-falsehoods-about-newark-s-school-system/

[iv]http://www.nj.com/opinion/index.ssf/2015/09/newark_moran.html

[v] https://www.facebook.com/zuck/posts/10102461248065941?fref=nf

[vi] http://www.nj.com/essex/index.ssf/2015/11/newark_power_brokers_back_zuckerbergs_school_donat.html

[vii] http://www.nj.gov/education/schools/achievement/prior.htm

[viii] http://www.nj.gov/education/data/enr/

[ix] http://education.state.nj.us/directory/

[x] http://www.ats.ucla.edu/stat/stata/webbooks/reg/chapter4/statareg4.htm

[xi] http://www.nj.com/opinion/index.ssf/2015/09/author_dale_russakoff_discusses_new_book_on_newark.html

Pondering Chartering: False Markets & Liberty as Substitute for Equity?

Today’s musing:

It is important to acknowledge that charter school market shares are not, in recent years, expanding exclusively or even primarily because of market demand and personal/family preferences for charter schools. Traditional district public schools are being closed, leaving neighborhoods without options other than charters; district schools (including entire districts) are being reconstituted and handed over to charter operators; and district schools are increasingly deprived of resources, experiencing burgeoning class sizes and reductions in program offerings that send more families scrambling for their “least bad” nearest alternative. [i] These are conscious decisions of policymakers overseeing the system that includes district and charter schools. They are not market forces, and should never be confused as such. These systems are being centrally managed without regard for equity and adequacy goals or the protection of student, family, taxpayer and employee rights, but instead on the false hope that liberty of choice is a substitute for all of the above (including, apparently, loss of individual liberties). [ii]

NOTES

[i] See, for example:

Mezzacappa, Dale (2015, Oct. 1) Hite Plan: More charter conversions, closings, turnarounds, and new schools. Philadelphia Public School Notebook. http://thenotebook.org/blog/159023/hite-plan-more-renaissance-charters-closings-turnarounds-new-schools

Weber, Mark (2015) Empirical Critique of “One Newark”: First Year Update. New Jersey Education Policy Forum. https://njedpolicy.files.wordpress.com/2015/03/weber-testimony.pdf

Weber, Mark (2015, Jun. 5) Camden’s “Transformation” Schools: Racial & Experience Disparity in Staff Consequences. https://njedpolicy.files.wordpress.com/2015/06/weber_camdentransformationsfinal.pdf

[ii]   Green, P.C.; & Baker, B.D.; & Oluwole, J. (2015, forthcoming). The Legal Status of Charter Schools in State Statutory Law- University of Massachusetts Law Review.

Green, P.C., Baker, B. D., & Oluwole, J.O. (2013). Having it both ways: How charter schools try to obtain funding of public schools and the autonomy of private schools. Emory Law Journal, 63, 303-337.

Mead, J.F. (2015). The Right to an Education or the Right to Shop for Schooling: Examining Voucher Programs in Relation to State Constitutional Guarantees, 42 Fordham Urban Law Journal 703.

Civil Rights Suspended: An Analysis of New York City Charter School Discipline Policies (2015). Advocates for Children of New York. http://www.advocatesforchildren.org/sites/default/files/library/civil_rights_suspended.pdf?pt=1

 

Pondering Chartering: Balancing Portfolios

The blogging has been quiet for a while. This is partly because I feel like most issues that arise have already been dealt with somewhere on this blog, and partly because I’ve been involved in several simultaneous long-term projects. These projects intersect with many topics I’ve addressed previously on this blog. At times, this blog serves as a palette for testing/sharing ideas. So… in this, and a rapid-fire sequence of follow up posts, I will share some excerpts of forthcoming, and early-stage in progress, work.

[from work in progress]

An Opportunity for Scalable Innovation

Since its origins in the early 1990s, the charter school sector has grown to over 6,500 schools serving over 2.25 million children in 2013.[i] In some states, the share of children now attending charter schools exceeds 10% (Arizona, Colorado), and in select major cities that share exceeds one-third (District of Columbia, Detroit, New Orleans).[ii] Modern-day charter schools were conceived by union leader Albert Shanker in the 1980s as providing opportunities for creative, independent educators to collaborate and test new ideas with lessened policy constraints. [iii] To the extent these innovations were successful, they might inform practices in traditional district schools. Over the next few decades, states adopted statutes providing opportunities for individuals and organizations, including traditional districts, to create these newly chartered schools. In some cases, statutes allowed for the creation of charters governed and financed by existing districts, and in other cases, for the creation of charters independent of district governance, while operating within the boundaries of and in competition with local public districts.

While charter schooling was conceived as a way to spur innovation – try new things – evaluate them – and inform the larger system, studies of the structure and practices of charter schooling find the sector as a whole not to be particularly “innovative.” [iv] Analyses by charter advocates at the American Enterprise Institute find that the dominant form of specialized charter school is the “no excuses” model – a model which combines traditional curriculum and direct instruction with strict disciplinary policies and school uniforms, in some cases providing extended school days and years.[v] Further, charter schools raising substantial additional revenue through private giving tend to use that funding to a) provide smaller classes, and b) pay teachers higher salaries for working longer days and years.[vi] For those spending less, total costs are held down, when necessary, through employing relatively inexperienced, low wage staff and maintaining high staff turnover rates.[vii] In other words, the most common innovations are not especially innovative or informative for systemic reform.

Emergence of Private Managers

The early charter movement coincided with the emergence of private management firms interested in public schooling. Two private for-profit companies tried their hand at providing school management services for public districts in the 1990s – Edison Schools, Inc. and Education Alternatives, Inc.[viii] Education Alternatives, Inc. (EAI), a publicly traded for-profit company, failed financially while holding an operating contract for 9 (then 11) schools within Baltimore City Public Schools, soon after signing a contract with Hartford (Connecticut) Public Schools. The company failed prior to taking full responsibility for schools in Hartford. Edison Schools expanded cautiously in the wake of EAI’s failure, operating a school in Wichita, Kansas, in 1995 and 25 schools nationally by the end of 1996.[ix] Edison also faced financial troubles as a publicly traded stock, eventually buying back its company stock in 2003 and reverting to privately held status.[x]

As charter schools expanded, including online and hybrid schooling options, Edison Schools and other upstart for-profit companies shifted their growth strategy to the charter sector, where they could control employment contracts, increasing financial flexibility and profit potential. Coinciding with these developments, many now high-profile nonprofit charter management firms got their start as operators of single charter schools, including the Knowledge Is Power Program (KIPP), with middle schools in Houston and New York City; Uncommon Schools, which grew from North Star Academy in Newark, NJ; and Achievement First, which grew from Amistad Academy in New Haven, CT. Presently the charter school landscape consists of a mix of schools operated by major nonprofit charter management organizations; schools operated by for-profit managers; schools operated by other education management organizations, described by Miron and colleagues as nonprofit in formal status but engaging in contractual arrangements more similar to those of for-profit organizations; and schools that remain independently operated (“mom-and-pop”).[xi]

From Portfolios to Parasites?

As early as the mid-1990s, authors including Paul Hill, James Guthrie and Lawrence Pierce (1997) advocated that entire school districts be reorganized into collections of privately managed contract schools.[xii] This contract school proposal emerged despite the abject failure of Education Alternatives, Inc. in Baltimore and Hartford, and it provided a framework for renewed attempts at large-scale private management, including the contracting out of management for several Philadelphia public schools in the early 2000s. Philadelphia’s experiment in private contracting yielded mixed results, at best.[xiii] Notably, Hill and colleagues’ contract school model depended on a centralized authority to manage the contracts and maintain accountability, a precursor to what is now commonly referred to as a “portfolio” model. In the portfolio model, a centralized authority oversees a system of publicly financed schools, both traditional district-operated and independent charter-operated, wherein either type of school might be privately managed.[xiv] The goal, as phrased by former New York City Schools Chancellor Joel Klein, is to replace school systems with “systems of great schools.”[xv]

A very different reality of charter school governance, however, has emerged under state charter school laws – one that presents at least equal likelihood that charters established within districts operate primarily in competition, not cooperation, with their host, serving a finite set of students and drawing from a finite pool of resources. One might characterize this as a parasitic rather than portfolio model – one in which the condition of the host is of little concern to any single charter operator. Such a model emerges because, under most state charter laws, locally elected officials – boards of education – have limited control over charter school expansion within their boundaries, or over the resources that must be dedicated to charter schools. Thus, there is no single, centralized authority managing the portfolio – the distributions of enrollments and resources – or protecting any one part of the system against irreparable damage (be it the parasites or the host).

Figure 1 displays a system in which a set of District Operated Schools (DOS), District Charter Schools (DCS) and Independent Charter Schools (ICS) serve a geographic space previously governed and operated entirely by locally elected officials. In many states, independent charter schools may only be authorized by a government or government-appointed entity – a single authorizer. Nonetheless, these schools are not required to be responsive to local elected governance (unless required under state charter law) and have little or no incentive to be concerned with the financial condition of their host. In other states, additional entities may authorize charter schools to compete for students and resources in the same geographic space, further dispersing authority for schools serving any given area. Among other issues, dispersed authorization provides the opportunity for potential charter operators, including those with failing track records, to “shop” for authorizers who will more readily permit their expansion and more likely turn a blind eye to academic failure and/or financial mismanagement.[xvi]

Figure 1


Proponents of the dispersed governance model in Figure 1 assert that competition both for governance/accountability and for management/operations of schools provides greater opportunity for rapid expansion and innovation. However, some of the more dispersed multiple authorizer[xvii] governance models have been plagued by weak accountability, financial malfeasance and persistently low performing charter operators, coupled with rapid, unfettered, under-regulated growth.[xviii]

Nonetheless, charter advocacy organizations including Bellwether Education Partners (BEP) continue to argue for more rapid growth and increased market share for charter schooling. BEP offers a facile extrapolation (along unconstrained linear trajectories) to claim that, at current rates, the charter sector will grow to serve 20 to 40% of all U.S. students by 2035. For charter expansion advocates like BEP, however, even this rate is too slow, constrained by having too few authorizers, caps on new charters in some states, and the unwillingness of districts as authorizers to approve new charter applications.[xix] For charter advocacy organizations, tight centralized regulation and slow or limited growth of charters is a non-starter, with the optimal balance somewhere between approximately 40% (Washington, DC) and 100% (New Orleans) of children served by independent charter schools.[xx]
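The weakness of the “unconstrained linear trajectory” approach can be illustrated with a toy calculation. All of the numbers below – the starting share, the annual gain, and the cap – are hypothetical assumptions for illustration, not BEP’s actual figures:

```python
# Toy illustration of extrapolating charter market share along an
# unconstrained linear trajectory versus one bounded by a ceiling
# (statutory caps, authorizer capacity). All numbers are hypothetical.

def linear_projection(share_now, annual_gain, years):
    """Extend the current trend in market share along a straight line."""
    return share_now + annual_gain * years

def capped_projection(share_now, annual_gain, years, cap):
    """Same trend, but bounded by a policy or capacity ceiling."""
    return min(cap, linear_projection(share_now, annual_gain, years))

share_2015 = 0.06   # hypothetical: 6% of students in charters today
annual_gain = 0.01  # hypothetical: +1 percentage point per year

# An unconstrained line marches upward indefinitely...
print(round(linear_projection(share_2015, annual_gain, 20), 2))        # 0.26
# ...while caps and authorizer limits flatten the trajectory well short.
print(round(capped_projection(share_2015, annual_gain, 20, 0.15), 2))  # 0.15
```

The point of the sketch is simply that any straight-line projection embeds the assumption that no binding constraint is ever reached – precisely the assumption advocates then argue should be legislated away.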

Fiscal Stress, Inefficiency & Charter Expansion?

Increased attention is being paid to the fiscal and enrollment effects of charter schooling on host districts. These concerns come at a time when municipal fiscal stress and the potential for large-scale municipal and school district bankruptcies are in the media spotlight.[xxi] Many high-profile cases of municipal fiscal stress are in cities where the charter sector is thriving.[xxii] Some charter advocates have gone so far as to assert that school district bankruptcy presents a “huge opportunity” to absolve the taxpaying public of existing debts and financial obligations and start fresh under new management, reallocating those funds to classrooms.[xxiii] Of course, this strategy ignores the complexities of municipal bankruptcy proceedings, the contractual, social and moral obligations for the stewardship of publicly owned capital (and other) assets, and responsibilities to current and retired employees.

Advocates for charter expansion typically assert that charter expansion causes no financial harm to host districts. The logic goes: if charter schools a) serve typical students drawn from the host district’s population, and b) receive the same or less in public subsidy per pupil to educate those children, then the per-pupil resources left behind for children served in district schools either remain the same or increase. Thus, charter expansion causes no harm (and in fact yields benefits) to children remaining in district schools. But the premise that charter schools are uniformly under-subsidized is grossly oversimplified and inaccurate in many charter operating contexts.[xxiv] In addition, numerous studies find that charter schools serve fewer students with costly special needs, leaving proportionately more of these children in district schools. Perhaps most importantly, the assumption that revenue reductions and enrollment shifts cause no measurable harm to host districts ignores the structure of operating costs and the dynamics of cost and expenditure reduction.

Moody’s Investor Service recently opined that “charter schools pose greatest credit challenge to school districts in economically weak urban areas.” [xxv] Specifically, Moody’s identifies the following four areas posing potential concerns for host urban districts with growing independent charter sectors:

  1. Weak demographics and district financial stress, which detract from the ability to deliver competitive services and can prompt students to move to charter schools
  2. Weak capacity to adjust operations in response to charter growth, which reduces management’s ability to redirect spending and institute program changes to better compete with charter schools
  3. State policy frameworks that support charter school growth through relatively liberal approval processes for new charters, generous funding of charters, and few limits on charter growth
  4. Lack of integration with a healthier local government that can insulate a school system from credit stress

District officials in Nashville, TN recently contracted consultants to evaluate the impact of charter expansion on their district. The consultants’ report noted specifically:

  • They will continue to cause the transfer of state and local per student funds without reducing operational costs.
  • They will continue to increase direct and indirect costs.
  • They will continue to negatively impact deferred maintenance at leased buildings.
  • They may have an offsetting impact on capital costs, if they open in areas of need for increased capacity. [xxvi]

Recently published academic analyses raise similar concerns. Robert Bifulco and Randall Reback (2014) evaluate the fiscal impact of charter expansion on two mid-size upstate New York cities – Albany and Buffalo.[xxvii] They find that charter schools have had negative fiscal impacts on these districts and identify two reasons for these impacts. First, districts are generally unable to adjust their expenditures on a student-by-student basis, because costs range from fixed costs (district-wide and school overhead costs that are not reduced by the transfer of individual pupils), to step costs (including classroom-level costs, also not reduced by the transfer of individual pupils), to variable costs, which are most easily reduced on a student-by-student basis but constitute a relatively small share of school district budgets. These concerns echo those of the consultants to Nashville’s public schools. Further, Arsen and Ni (2012) find that higher levels of charter school enrollment in Michigan school districts are strongly associated with declining fund balances, and that revenues declined more rapidly than costs in districts losing students to charter schools.[xxviii]
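The cost-structure argument can be made concrete with a back-of-the-envelope sketch. The dollar figures and the variable-cost share below are hypothetical assumptions for illustration, not estimates from Bifulco and Reback:

```python
# Sketch of why enrollment loss to charters strains district budgets:
# per-pupil revenue leaves immediately, but only the variable share of
# costs can be shed student by student. Fixed costs (district and school
# overhead) and step costs (classroom-level costs) persist when
# scattered individual pupils transfer. All parameters are hypothetical.

def short_run_gap(pupils_lost, per_pupil_revenue, per_pupil_cost,
                  variable_share):
    """Revenue transferred out, minus the costs a district can shed."""
    revenue_lost = pupils_lost * per_pupil_revenue
    costs_shed = pupils_lost * per_pupil_cost * variable_share
    return revenue_lost - costs_shed

# Hypothetical district: 100 pupils transfer, $15,000 follows each, and
# only 20% of average cost is variable in the short run.
gap = short_run_gap(pupils_lost=100, per_pupil_revenue=15_000,
                    per_pupil_cost=15_000, variable_share=0.20)
print(f"Short-run structural deficit: ${gap:,.0f}")  # $1,200,000
```

Over time, a district may consolidate classrooms or close buildings to convert step and fixed costs into savings, but those adjustments are lumpy and slow relative to the immediate revenue transfer – which is the crux of the fiscal-stress argument.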

Second, Bifulco and Reback point out that “operating two systems of public schools under separate governance arrangements can create excess costs,” or inefficient expenditures.[xxix] Other authors have raised similar concerns about additional, often exorbitant overhead expenses created by introducing school systems within school systems (independent charter schools within districts).[xxx] That is, while inducing fiscal stress on host districts, charter expansion may also be increasing total overhead costs. Two studies of Michigan charter schools, which operate fiscally independently of local public districts, have found them to have particularly high administrative expenses and low direct instructional expenses. Arsen and Ni (2012) found that “Controlling for factors that could affect resource allocation patterns between school types, we find that charter schools on average spend $774 more per pupil per year on administration and $1141 less on instruction than traditional public schools.” (p. 1) Further, they found “charter schools managed by EMOs spend significantly more on administration than self-managed charters (about $312 per pupil). This higher spending occurs in administrative functions traditionally performed at both the district central office and school building levels.” (p. 13)[xxxi]

Izraeli and Murphy (2012) found that district schools in Michigan tended to spend more on instruction per student than did charter schools, and that the gap grew by about 5 percent to nearly 35 percent over the period studied (1995-96 to 2005-06) (p. 265). Further, they found the gap in instructional spending to be greater than that in general spending. The overall funding gap between district and charter schools was approximately $230 per pupil; the spending gap was $562 for basic programs and $910 for total instruction. The authors note that, “much like a profit-maximizing firm, charter schools generate a surplus of revenue over expenditure.” (Izraeli & Murphy, 2012, p. 265)[xxxii]

Baker and Miron (2015) show that in New Jersey, charter school administrative expenses are “nearly $1,000 per pupil higher than those of other regular school district types, and the share of budgets allocated to administration is nearly double.”[xxxiii] Further, local public school districts retain responsibility for providing some services to charter school students, and thus bear the administrative overhead associated with those responsibilities. That is, on a per-pupil basis, district administrative expenses are overstated and charter school administrative expenses understated. Additionally, these publicly reported administrative expenses do not include, for example, expenses (including executive salaries) of regional or national management organizations above and beyond management fees, further understating the total administrative expenses of charter schools.

In addition, the uneven reshuffling of children and resources across schools within geographic boundaries can exacerbate inequities. Numerous studies have found charter schools to yield imbalanced distributions of children by income status, language proficiency and race.[xxxiv] Under some models, like the New Orleans all-charter system, stratification exists by design.[xxxv] Baker, Libby and Wiley (2015) find that, through the sorting of children and resources, charter expansion induces inequities within districts – inequities among charter schools and between charter and district schools.[xxxvi] But to the extent that charter market share remains small, these inequities remain limited.[xxxvii] The profit status of charter operators may also lead to very different school-level resources being available to children, as for-profit schools seek to achieve profit margins while nonprofits seek to enhance revenues through tax-exempt private giving.[xxxviii] Finally, the concentration of needy children in some schools can dramatically increase the costs of improving outcomes for those students. That is, uneven sorting of children by need can create additional inefficiencies.[xxxix]

Finally, it is conceivable that the dissolution of large centralized school districts and introduction of multiple school operators into a single geographic space could compromise efficiency associated with economies of scale, which operate at both the school and district level. Numerous studies of education costs have found that the costs of providing comparable services rise as district enrollments drop below 2,000 students and rise sharply at enrollments below 300 students. Further, a comprehensive review of literature on economies of scale in education by Andrews, Duncombe and Yinger (2002) finds “there is some evidence that moderately sized elementary schools (300–500 students) and high schools (600–900 students) may optimally balance economies of size with the potential negative effects of large schools.”[xl] (p. 245) To the extent that charter expansion creates independent “districts” operating with fewer than 2,000 pupils and/or increases shares of children attending schools with smaller enrollments than those noted above, inefficiencies may be introduced.
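The scale argument above can be sketched as a stylized per-pupil cost schedule. The enrollment thresholds (2,000 and 300) come from the literature cited in the paragraph, but the base cost and the size of the diseconomy penalties are hypothetical assumptions:

```python
# Stylized economies-of-scale schedule: per-pupil costs of comparable
# services rise below 2,000 enrollment and sharply below 300. The base
# cost and the 10%/30% penalty multipliers are hypothetical.

def per_pupil_cost(enrollment, base=12_000):
    """Per-pupil cost of comparable services at a given enrollment."""
    if enrollment < 300:
        return base * 1.30   # sharp diseconomies in very small units
    if enrollment < 2_000:
        return base * 1.10   # moderate diseconomies
    return base              # at or above efficient scale

# One 4,000-student district vs. the same students split across ten
# independent 400-student operators in the same geographic space.
district_total = 4_000 * per_pupil_cost(4_000)
fragmented_total = 10 * 400 * per_pupil_cost(400)
print(f"Excess cost of fragmentation: ${fragmented_total - district_total:,.0f}")
```

Under these assumed parameters, fragmenting one efficiently scaled district into ten sub-2,000-pupil operators adds roughly $4.8 million per year in costs for the same services – the kind of system-level inefficiency the paragraph describes.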

Notes

[i] Tabulation by author using data from NCES Common Core of Data, Public School Universe Survey(s)

[ii] Weber, M., & Baker, B. D. (2015). Do For-Profit Managers Spend Less on Schools and Instruction? A national analysis of charter school staffing expenditures. Working paper. Rutgers University.

[iii] Shanker, A. (1988). Restructuring our schools. Peabody Journal of Education, 65(3), 88-100.

[iv] Preston, C., Goldring, E., Berends, M., & Cannata, M. (2012). School innovation in district context: Comparing traditional public schools and charter schools. Economics of Education Review, 31(2), 318-330.

[v] http://www.aei.org/wp-content/uploads/2015/07/Measuring-Diversity-in-Charter-School-Offerings.pdf

[vi] Baker, B. D., Libby, K., & Wiley, K. (2012). Spending by the Major Charter Management Organizations: Comparing Charter School and Local Public District Financial Resources in New York, Ohio, and Texas. National Education Policy Center.

[vii] Epple, D., Romano, R., & Zimmer, R. (2015). Charter schools: a survey of research on their characteristics and effectiveness (No. w21256). National Bureau of Economic Research.

Toma, E., & Zimmer, R. (2012). Two decades of charter schools: Expectations, reality, and the future. Economics of Education Review, 31(2), 209-212.

[viii] Richards, C. E. (1996). Risky Business: Private Management of Public Schools. Economic Policy Institute, 1660 L Street, NW, Suite 1200, Washington, DC 20036.

[ix] http://www.nytimes.com/1997/12/17/us/edison-project-reports-measurable-progress-in-reading-and-math-at-its-schools.html

[x] http://m.lasvegassun.com/news/2003/jul/14/edison-founder-to-buy-back-stock/

[xi] Miron, G., & Gulosino, C. (2013). Profiles of for-profit and nonprofit education management organizations: Fourteenth Edition—2011-2012. Boulder, CO: National Education Policy Center.

[xii] Hill, P., Pierce, L. C., & Guthrie, J. W. (2009). Reinventing public education: How contracting can transform America’s schools. University of Chicago Press.

[xiii] Mac Iver, M. A., & Mac Iver, D. J. (2006). Which Bets Paid Off? Early Findings on the Impact of Private Management and K‐8 Conversion Reforms on the Achievement of Philadelphia Students. Review of Policy Research, 23(5), 1077-1093.

The authors found:

Non-Edison EMO schools actually performed worse than district-managed schools. With the exception of one older K-8 school in one cohort, Edison schools did not significantly outperform district-managed counterparts. Students in long-established K-8 schools generally outgained students in middle schools, but gains were not as large in newly-established K-8 schools. Across all types of schools, the second cohort of students obtained greater gains than did the first.

[xiv] Hill, P. T. (2006). Put Learning First: A Portfolio Approach to Public Schools. PPI Policy Report. Progressive Policy Institute.

More recently, the phrases “sector agnosticism” and “relinquishment” have been used to describe systems that turn a blind eye to whether publicly accessible schools are privately or publicly (by elected officials/government) governed or, at the greater extreme, to argue that public officials should “relinquish” any and all control of publicly accessible schooling to private providers/managers, who will purportedly be far more responsive than publicly governed/managed entities to the needs and demands of children and their parents.

[xv] http://blogs.worldbank.org/education/system-great-schools-joel-klein-s-legacy-new-york-city

[xvi] http://www.journalgazette.net/news/local/indiana/Bill-seeks-to-stop–charter-shopping–by-failing-schools-5439091

[xvii] https://www.edreform.com/wp-content/uploads/2012/05/CERPrimerMultipleAuthorizersDec2011.pdf

[xviii] http://www.dispatch.com/content/stories/local/2015/10/21/charter-study.html and http://www.freep.com/article/20140622/NEWS06/140507009/State-of-charter-schools-How-Michigan-spends-1-billion-but-fails-to-hold-schools-accountable

[xix] http://bellwethereducation.org/publication/state-charter-school-movement

[xx] http://educationnext.org/how-many-charter-schools-just-right/

[xxi] http://www.governing.com/gov-data/municipal-cities-counties-bankruptcies-and-defaults.html

[xxii] Chester Upland, PA: https://www.washingtonpost.com/local/education/in-a-bankrupt-pa-school-district-teachers-plan-to-work-for-free/2015/08/28/0332898e-4dba-11e5-84df-923b3ef1a64b_story.html

Philadelphia: http://www.philly.com/philly/education/20151022_Near-broke_Philly_schools_must_borrow_to_make_payroll.html

Detroit: http://www.reuters.com/article/2015/10/19/us-usa-detroit-education-idUSKCN0SD2KC20151019

[xxiii] http://www.prwatch.org/node/12932

[xxiv] Baker, B. D. (2014). Review of charter funding: Inequity expands. Boulder, CO: National Education Policy Center. Retrieved August, 7, 2014.

[xxv] https://www.moodys.com/research/Moodys-Charter-schools-pose-greatest-credit-challenge-to-school-districts–PR_284505

[xxvi] http://nashvillepublicmedia.org/wp-content/uploads/2014/09/MNPS-Charter-Schools-Financial-Impact-Review.pdf

[xxvii] Bifulco, R., & Reback, R. (2014). Fiscal impacts of charter schools: Lessons from New York. Education Finance and Policy, 9(1), 86-107.

[xxviii] Arsen, D., & Ni, Y. (2012). The effects of charter school competition on school district resource allocation. Educational Administration Quarterly, 48(1), 3-38.

“Overall, the results do not support the hypothesis that competition from charter schools spurs regular public schools to shift resources to achievement-oriented activities. Charter competition has had remarkably little impact on standard measures of district resource use in Michigan schools. On the other hand, higher levels of charter competition clearly generate fiscal stress in districts. Moreover, changes in resource allocation cannot explain the differing trajectories of districts that do and do not turn back the competitive challenge. There are no significant differences in the resource allocation changes made by districts that stabilize enrollment loss to charters and those that continue to spiral down.”

[xxix] Bifulco, R., & Reback, R. (2014). Fiscal impacts of charter schools: Lessons from New York. Education Finance and Policy, 9(1), 86-107.

[xxx] Baker, B. D., Libby, K., & Wiley, K. (2012). Spending by the Major Charter Management Organizations: Comparing Charter School and Local Public District Financial Resources in New York, Ohio, and Texas. National Education Policy Center.

[xxxi] Arsen, D. D., & Ni, Y. (2012). Is administration leaner in charter schools? Resource allocation in charter and traditional public schools. Education Policy Analysis Archives, 20(31).

[xxxii] Izraeli, O., & Murphy, K. (2012). An Analysis of Michigan Charter Schools: Enrollment, Revenues, and Expenditures. Journal of Education Finance, 37(3), 234-266.

[xxxiii] These averages mask the highest administrative expenses, which occur in Camden Community Charter School (at $5,325 per pupil, higher than their instructional expense of $4,225)[xxxiii] and TEAM Academy (a KIPP school, at $4,851 per pupil,[xxxiii] but still much lower than their instructional expense at $8,639 per pupil).

[xxxiv] Stein, M. L. (2015). Public School Choice and Racial Sorting: An Examination of Charter Schools in Indianapolis. American Journal of Education, 121(4), 597-627.

Ladd, H. F., Clotfelter, C. T., & Holbein, J. B. (2015). The Growing Segmentation of the Charter School Sector in North Carolina (No. w21078). National Bureau of Economic Research.

Kotok, S., Frankenberg, E., Schafft, K. A., Mann, B. A., & Fuller, E. J. (2015). School Choice, Racial Segregation, and Poverty Concentration Evidence From Pennsylvania Charter School Transfers. Educational Policy, 0895904815604112.

Logan, J. R., & Burdick-Will, J. (2015). School segregation, charter schools, and access to quality education. Journal of Urban Affairs.

See also: http://districtmeasured.com/2015/10/07/can-school-lotteries-make-schools-more-diverse/

[xxxv] Adamson, F., Cook-Harvey, C., & Darling-Hammond, L. (2015). Whose choice?: Student experiences and outcomes in the New Orleans school marketplace. Stanford, CA: Stanford Center for Opportunity Policy in Education.

[xxxvi] Baker, B. D., Libby, K., & Wiley, K. (2015). Charter School Expansion and Within-District Equity: Confluence or Conflict?. Education Finance and Policy.

[xxxvii] Baker, B. D., Libby, K., & Wiley, K. (2015). Charter School Expansion and Within-District Equity: Confluence or Conflict?. Education Finance and Policy.

[xxxviii] Weber, M., & Baker, B. D. (2015). Do For-Profit Managers Spend Less on Schools and Instruction? A national analysis of charter school staffing expenditures. Working paper. Rutgers University.

[xxxix] Baker, B. D. (2011). Exploring the sensitivity of education costs to racial composition of schools and race-neutral alternative measures: A cost function application to Missouri. Peabody Journal of Education, 86(1), 58-83.

[xl] Andrews, M., Duncombe, W., & Yinger, J. (2002). Revisiting economies of size in American education: are we any closer to a consensus?. Economics of Education Review, 21(3), 245-262.