This document is one in a series of reports on key aspects of the Every Student Succeeds Act (ESSA) produced in a partnership between AASA, The School Superintendents Association and School Innovations & Achievement’s Cabinet Report. The full set of resources is available at aasa.org/AASAESSA.aspx.

Spending and training decisions need to be research-based  

Few would argue with the notion that decisions on curriculum purchases, training, or reform projects need to be research-based. But despite federal regulatory efforts to define evaluation criteria, educators have had only moderate success in distinguishing well-founded programs from less proven ones. The U.S. Department of Education has now taken on the task of identifying and prioritizing the evidence that school officials should consider when making purchases or investing in improvements.

A recently issued advisory pamphlet, Non-Regulatory Guidance: Using Evidence to Strengthen Education Investments (September 16, 2016), clarifies and expands upon the term “evidence-based” as used throughout the Every Student Succeeds Act, also referred to as “evidence of promise” in the Education Department General Administrative Regulations (P.L. 114-95 & 34 CFR § 77.1).

The guidance document is divided into two sections: the first addresses strategic planning for organizational improvement; the second sets out a tiered method for selecting interventions, resources, or materials.

Part I, entitled “Strengthening the Effectiveness of ESEA Investments,” suggests a five-stage process for instituting change or enhancing system-wide performance as follows:

  1. Identify local needs
  2. Select relevant evidence-based interventions
  3. Plan for implementation
  4. Implement
  5. Examine and reflect

For each stage, ED outlines supporting actions and poses questions administrators should ask to make sure aims are achieved and progress is assessed.

The second section, “Guidance on the Definition of ‘Evidence-Based,’” explains how to assess the merits of research or program models, offering a rubric for judging potential effectiveness. The rubric proposes aligning the strength of the data or study design (strong evidence, moderate evidence, promising evidence, or demonstrating a rationale) with the outcome of a project or experiment (i.e., effect size): the more rigorous the study and the larger the positive effect, the stronger the case for implementation.
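
By way of illustration only (the guidance itself does not prescribe a particular formula), the effect size of an intervention is commonly reported as a standardized mean difference, such as Cohen’s d:

$$ d = \frac{\bar{x}_{\text{intervention}} - \bar{x}_{\text{comparison}}}{s_{\text{pooled}}} $$

Here the numerator is the difference in average outcomes between the intervention and comparison groups, and the denominator is the pooled standard deviation; larger positive values of d indicate a stronger positive effect.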

This short paper (the entire document is only 12 pages) will be useful for both district- and site-level administrators who are planning systemic changes or making purchases to support curricular or instructional improvements. As ESSA is implemented, the advice can serve as a touchstone for planning and decision-making. The guidance is available on the ED website at http://www2.ed.gov/policy/elsec/leg/essa/guidanceuseseinvestment.pdf.

The document was characterized as “significant guidance” as defined by the Office of Management and Budget, meaning it is “non-binding and does not create or impose new legal requirements” but is “anticipated to …[r]aise novel legal or policy issues arising out of legal mandates…[and] include interpretive rules of general applicability. …” (ED, Non-Regulatory Guidance: Using Evidence; OMB, Final Bulletin for Agency Good Guidance Practices, 72 Fed. Reg. 3432 (January 25, 2007)).