2020 Federal Standard of Excellence


U.S. Department of Education

Score
9
Leadership

Did the agency have senior staff members with the authority, staff, and budget to build and use evidence to inform the agency’s major policy and program decisions in FY20?

1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer (or equivalent)? (Example: Evidence Act 313)
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s Chief Data Officer (or equivalent)? (Example: Evidence Act 202(e))
  • USED has a designated Chief Data Officer (CDO). The Office of Planning, Evaluation and Policy Development’s (OPEPD) Office of the Chief Data Officer (OCDO) has a staff of 18 and is actively hiring additional staff. The Evidence Act provides a framework for OCDO’s responsibilities, which include lifecycle data management and developing and enforcing data governance policies. The OCDO has oversight over ED’s information collections approval and associated OMB clearance process. It is responsible for developing and enforcing ED’s open data plan, including management of a centralized comprehensive data inventory accounting for all data assets across ED. The OCDO is also responsible for developing and maintaining a technological and analytical infrastructure that is responsive to ED’s strategic data needs, exploiting traditional and emerging analytical methods to improve decision making, optimize outcomes, and create efficiencies. These activities are carried out by the Governance and Strategy Division, which focuses on data governance, lifecycle data management, and open data; and the Analytics and Support Division, which provides data analytics and infrastructure responsive to ED’s strategic data. The current OCDO budget reflects the importance of these activities to ED leadership, with S&E funding allocated for data governance, data analytics, open data, and information clearances.
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, and other related officials in order to inform policy decisions and evaluate the agency’s major programs?
  • The EO, CDO, and SO meet monthly to ensure ongoing coordination of Evidence Act work. Each leader, or their designee, also participates in the PIO’s Strategic Planning and Review process. In FY20, the CDO is the owner of Goal 3 in ED’s strategic plan: “Strengthen the quality, accessibility, and use of education data through better management, increased privacy protections and transparency.” Leaders of the three embedded objectives come from OCDO, OCIO, and NCES.
  • The Evidence Leadership Group (ELG) supports program staff that run evidence-based grant competitions and monitor evidence-based grant projects. It advises ED leadership and staff on how evidence can be used to improve ED programs and provides support to staff in the use of evidence. It is co-chaired by the Evaluation Officer and the OPEPD Director of Grants Policy. Both co-chairs sit on ED’s Policy Committee (described below). The SO, EO, CDO, and Performance Improvement Officer (PIO) are ex officio members of the ELG.
  • The ED Data Governance Board (DGB) sponsors agency-wide actions to develop an open data culture, and works to improve ED’s capacity to leverage data as a strategic asset for evidence building and operational decisions, including developing the capacity of data professionals in program offices. It is chaired by the CDO, with the SO, EO, and PIO as ex officio members.
  • The ED CDO sits in OPEPD, and the Evaluation Officer (EO) and the Statistical Official (SO) sit in the Institute of Education Sciences (IES). Both OPEPD and IES participate in monthly Policy Committee meetings, which often address evidence-related topics. OPEPD manages the Secretary’s policy priorities, including evidence, while IES focuses on (a) bringing extant evidence to policy conversations and (b) suggesting how evidence can be built as part of policy initiatives. OPEPD plays a leading role in forming ED’s policy positions as expressed through annual budget requests and grant competition priorities, including those related to evidence. Both OPEPD and IES provide technical assistance to Congress to ensure evidence appropriately informs policy design.
Score
9
Evaluation & Research

Did the agency have an evaluation policy, evaluation plan, and learning agenda (evidence-building plan), and did it publicly release the findings of all completed program evaluations in FY20?

2.1 Did the agency have an agency-wide evaluation policy? (Example: Evidence Act 313(d))
  • The Department’s new Evaluation Policy is posted online at ed.gov/data and can be directly accessed here. Key features of the policy include the Department’s commitment to: (1) independence and objectivity; (2) relevance and utility; (3) rigor and quality; (4) transparency; and (5) ethics. Special features include additional guidance to ED staff on considerations for evidence-building conducted by ED program participants, which emphasize the need for grantees to build evidence in a manner consistent with the parameters of their grants (e.g., purpose, scope, and funding levels), up to and including rigorous evaluations that meet WWC standards without reservations.
2.2 Did the agency have an agency-wide evaluation plan? (Example: Evidence Act 312(b))
  • ED’s FY22 Draft Annual Evaluation Plan will be shared with OMB in the fall and finalized in the spring. Consistent with OMB Circular A-11 Section 290, the FY22 Annual Evaluation Plan will be posted publicly in February 2021, concurrent with the Budget Release. ED anticipates that the plan will include all current and planned program evaluations across ED and the details required by the Evidence Act and associated OMB guidance.
  • ED’s current evaluation plan covers the subset of agency activities funded by ESSA FY18 and FY19 appropriations, for work to be procured in FY19 and FY20 and effectively begun in FY20 and FY21. Since the passage of ESSA, IES has worked with partners across ED, including the Evidence Leadership Group, to prepare and submit to Congress a biennial, forward-looking evaluation plan covering all mandated and discretionary evaluations of education programs funded under ESSA (known as ED’s “8601 plan”).
2.3 Did the agency have a learning agenda (evidence-building plan) and did the learning agenda describe the agency’s process for engaging stakeholders including, but not limited to the general public, state and local governments, and researchers/academics in the development of that agenda? (Example: Evidence Act 312)
  • ED is developing its Learning Agenda consistent with milestones established by the Evidence Act and OMB guidance. Per OMB guidance on performance management systems, ED will share priority questions in its draft learning agenda with OMB in June 2020 as part of the Strategic Review Process. The complete draft Learning Agenda will be shared with OMB in Fall 2020. After receiving feedback from OMB and external stakeholders, ED will submit a final Learning Agenda to OMB in Fall 2021. OMB Circular A-11 Section 290 does not require the Learning Agenda be publicly released prior to February 2022, concurrent with the FY23 Budget Release.
  • To develop its draft Learning Agenda, ED has expanded the question generation and prioritization process used in the development of its “8601 Plan” (see above) to all principal operating components across ED. To help ensure alignment of the draft learning agenda to ED’s Strategic Plan, the Evidence Leadership Group has been expanded to include a member from ED’s Performance Improvement Office. The Evaluation Officer regularly consults with ED’s Enterprise Risk Management (ERM) function to explore the intersection between the Learning Agenda and high-priority issues identified in ERM processes. ED will seek broad stakeholder feedback on topics addressed in the Draft Learning Agenda after OMB provides initial comments on its format and sufficiency.
2.4 Did the agency publicly release all completed program evaluations?
2.5 What is the coverage, quality, methods, effectiveness, and independence of the agency’s evaluation, research, and analysis efforts? (Example: Evidence Act 315, subchapter II (c)(3)(9))
  • ED completed its Interim Capacity Assessment, meeting all milestones established by the Evidence Act and OMB guidance. It addresses six dimensions of the Department’s capacity to build and use evidence, with an emphasis on evaluation. Specific components include: (1) a list of existing activities being evaluated by the Department; and assessments of the extent to which those activities (2) meet the needs of the Department’s operating components; (3) meet the Department’s most important learning, management, and accountability needs; (4) use appropriate methods; (5) are supported by agency capacity for effective planning, execution, and dissemination; and (6) are supported by agency capacity for effective use of evaluation evidence and data for analysis.
  • A distinguishing feature of ED’s Interim Capacity Assessment is an agency-wide survey of all employees that focuses on two domains: (1) their capacity to build and use evidence and (2) their capacity to use data. The specific questions employees received depended upon their position level (i.e., supervisory or non-supervisory) and their job role (i.e., grant maker/monitor; non-grant maker/monitor; data professional). The results of this survey are already being used to develop training related to evidence building, evidence use, and analytics, and they fulfill, in part, requirements of the Federal Data Strategy.
2.6 Did the agency use rigorous evaluation methods, including random assignment studies, for research and evaluation purposes?
  • The IES website includes a searchable database of planned and completed evaluations, including those that use experimental, quasi-experimental, or regression discontinuity designs. As of July 2020, that list includes 43 completed or planned experimental studies, two quasi-experimental studies, and five regression discontinuity studies. All impact evaluations rely upon experimental trials. Other methods, including matching and regression discontinuity designs, are classified as rigorous outcomes evaluations. Not included in this count are studies that are descriptive or correlational in nature, including implementation studies and less rigorous outcomes evaluations.
Score
7
Resources

Did the agency invest at least 1% of program funds in evaluations in FY20? (Examples: Impact studies; implementation studies; rapid cycle evaluations; evaluation technical assistance; rigorous evaluations, including random assignment)

3.1. ____ (Name of agency) invested $____ on evaluations, evaluation technical assistance, and evaluation capacity-building, representing __% of the agency’s $___ billion FY20 budget.
  • ED invested $237.6 million in rigorous evaluations, technical assistance related to evaluation and evidence-building, and capacity-building in FY20. This includes work awarded by the Regional Educational Laboratories ($83.9 million), NCEE’s Evaluation Division ($56.4 million), NCER Efficacy Trials ($42.7 million), SBIR Phase II Projects ($7.1 million), NCSER Efficacy Trials ($32.7 million), and NCSER Replication Trials ($14.8 million).
  • This represents 0.48% of the agency’s $49 billion FY20 congressional appropriation, not including: (1) Student Financial Assistance and related accounts; (2) Howard and Gallaudet Universities; (3) capital and liquidating accounts for higher education institutions; (4) agency salaries and expenses; and (5) general and special funds receipts.
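  • For reference, the component awards listed above sum to the reported total ($83.9M + $56.4M + $42.7M + $7.1M + $32.7M + $14.8M = $237.6M), and $237.6 million ÷ $49 billion ≈ 0.0048, or roughly 0.48 percent.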
3.2 Did the agency have a budget for evaluation and how much was it? (Were there any changes in this budget from the previous fiscal year?)
  • ED does not have a specific budget solely for evaluation. Federal program evaluations are supported either by required or allowable program funds or by ESEA Section 8601, which permits the Secretary to reserve up to 0.5% of selected ESEA program funds for rigorous evaluation. Other evaluation activities are supported by the IES budget (i.e., the Regional Educational Laboratories account; the Research, Development, and Dissemination account; the Research in Special Education account; and the Special Education Studies and Evaluations account).
  • FY20 ($364.8M) and FY19 ($357.6M) estimates are not directly comparable due to a change in ED’s FY20 calculation method. However, the amount invested by NCEE in rigorous evaluations increased in FY20 ($56.4 million vs. $53.5 million). Evaluation, research and development, and capacity building in the REL program reflect a relatively stable appropriation (FY19: $55.4 million vs. FY20: $56.0 million), though, because of how contracts are funded over their lifecycle, a significantly larger expense in FY20 than in FY21. Slight increases were also observed in the Research, Development, and Dissemination account (+$3.2 million) and in the Research in Special Education account (+$500,000).

3.3 Did the agency provide financial and other resources to help city, county, and state governments or other grantees build their evaluation capacity (including technical assistance funds for data and evidence capacity building)?
  • In March 2020, IES announced its most recent round of SLDS awards. IES anticipates awarding a total of $105 million over four years to 26 states, Guam, and the Commonwealth of the Northern Mariana Islands. Alabama, CNMI, Guam, and Wyoming were first-time SLDS grantees. Seven other states (Colorado, Connecticut, Kansas, Maine, Ohio, South Carolina, and Virginia) had not received SLDS grants since the 2009 cycle. Priorities for the 2020 grants included (1) infrastructure, (2) education choice, and (3) equity.
  • The Regional Educational Laboratories (RELs) provide extensive technical assistance on evaluation and support research partnerships that conduct implementation and impact studies on education policies and programs in ten geographic regions of the U.S., covering all states, territories, and the District of Columbia. Congress appropriated $55.4 million for the RELs in FY20.
  • Comprehensive Centers provide support to States in planning and implementing interventions through coaching, peer-to-peer learning opportunities, and ongoing direct support. The State Implementation and Scaling Up of Evidence-Based Practices Center provides tools, training modules, and resources on implementation planning and monitoring.
Score
8
Performance Management / Continuous Improvement

Did the agency implement a performance management system with outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY20?
(Example: Performance stat systems, frequent outcomes-focused data-informed meetings)

4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • ED’s FY18-22 Strategic Plan includes two parallel objectives, one for P-12 education and one for higher education (Strategic Objectives 1.4 and 2.2, respectively), that focus on supporting agencies and educational institutions in the identification and use of evidence-based strategies and practices. The OPEPD ELG co-chair is responsible for both strategic objectives.
  • All Department Annual Performance Reports (most recent fiscal year) and Annual Performance Plans (upcoming fiscal year) are located on ED’s website. This includes the FY19 Annual Performance Report and the FY21 Annual Performance Plan, which includes FY19 performance results as well as planned targets for FY20, FY21, and FY22. In FY20, ED published new Agency Priority Goals on performance.gov emphasizing (1) education freedom, (2) multiple pathways to student success, (3) federal student aid customer service, (4) student privacy and cybersecurity, and (5) regulatory reform.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • The Grants Policy Office in the Office of Planning, Evaluation and Policy Development (OPEPD) works with offices across ED to ensure alignment with the Secretary’s priorities, including evidence-based practices. The Grants Policy Office looks at where ED and the field can continuously improve by building stronger evidence, making decisions based on a clear understanding of the available evidence, and disseminating evidence to decision makers. Specific activities include: strengthening the connection between the Secretary’s policies and grant implementation from design through evaluation; supporting a culture of evidence-based practices; providing guidance to grant-making offices on how to integrate evidence into program design; and identifying opportunities where ED and field can improve by building, understanding, and using evidence. During the past year, the Grants Policy Office has collaborated with offices across the Department on a variety of activities, including reviews of efforts used to determine grantee performance.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • The Department conducted after-action reviews after the FY 2019 competition cycle to reflect on successes of the year as well as opportunities for improvement. The reviews resulted in process updates for FY 2020. In addition, the Department updated an optional internal tool to inform policy deliberations and progress on the Secretary’s policy priorities, including the use of evidence and data.
Score
6
Data

Did the agency collect, analyze, share, and use high-quality administrative and survey data – consistent with strong privacy protections – to improve (or help other entities improve) outcomes, cost-effectiveness, and/or the performance of federal, state, local, and other service providers’ programs in FY20? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; open data policies; data-use policies)

5.1 Did the agency have a strategic data plan, including an open data policy? (Example: Evidence Act 202(c), Strategic Information Resources Plan)
  • ED’s FY18-22 Performance Plan outlines strategic goals and objectives, including Goal #3: “Strengthen the quality, accessibility and use of education data through better management, increased privacy protections and transparency.” This currently serves as a strategic plan for ED’s governance, protection, and use of data while it develops the Open Data Plan required by the Evidence Act. The plan includes a metric on the number of data assets that are “open by default” as well as a metric on open licensing requirements for deliverables created with Department grant funds.
  • In addition, the Information Resources Management Strategic Plan for FY 2019 to FY 2023, released in December 2019, includes for the first time a strategic goal to “improve data management, enhance the use of data analytics, and promote transparency at the Department.” One of the strategic objectives is to “implement solutions that advance open data and transparency.” Initiatives under this objective are “develop, publish, and execute an open data plan” and “develop, maintain, and enhance technology solutions that foster open data access, public dialogue, and a culture of transparency.” Metrics will include the number of agency open data assets released through ED’s Open Data Platform (ODP), a new tool to make public data discoverable from a single location and easily searchable by topic.
  • ED continues to wait for Phase 2 guidance from OMB to understand required parameters for the open data plan. In the meantime, ED continues to release open data, develop its Open Data Platform as informed by M-13-13 and the open data project, and develop its draft open data plan. ED will finalize the Open Data Plan and conform to new requirements when Phase 2 guidance is released. If guidance is received soon, ED will publish its open data plan in FY21 within the agency’s Information Resources Management Strategic Plan.
5.2 Did the agency have an updated comprehensive data inventory? (Example: Evidence Act 3511)
  • The ED Data Inventory (EDI) was developed in response to the requirements of M-13-13 as ED’s external asset inventory. It describes data reported to ED as part of grant activities, along with administrative and statistical data assembled and maintained by ED. It includes descriptive information about each data collection along with information on the specific data elements in individual data collections. While the EDI continues to meet the requirements of M-13-13, ED has also been developing an Open Data Platform (ODP), a new tool to make public data discoverable from a single location and easily searchable by topic. ED will continue to identify, ingest, catalogue, and make available public data assets for public discovery and use. ED continues to wait for Phase 2 guidance from OMB to understand required parameters for the comprehensive data inventory. Once published, the ODP will serve as the agency’s comprehensive data inventory and regularly send information to data.gov as required by Evidence Act provisions on the Federal Data Catalog.
  • Information about Department data collected by the National Center for Education Statistics (NCES) has historically been made publicly available online. Prioritized data is further documented or featured on ED’s data page. NCES is also leading a government-wide effort to automatically populate metadata from Information Collection Request packages to data inventories. This may facilitate the process of populating the EDI and the comprehensive data inventory.
5.3 Did the agency promote data access or data linkage for evaluation, evidence-building, or program improvement? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c))
  • ED’s forthcoming Open Data Platform features standard metadata contained in Data Profiles for each data asset. Before new assets are added, data stewards conduct quality review checks on the metadata to ensure accuracy and consistency. As the platform matures and expands, ED staff and the public will find it a powerful tool for accessing and analyzing ED data, either through the platform directly or through other tools powered by its API.
  • ED has also made concerted efforts to improve the availability and use of its data with the release of the revised College Scorecard that links data from NCES, the Office of Federal Student Aid, and the Internal Revenue Service. In FY20, ED released data describing debt at the level of fields of study. ED plans to integrate additional fields of study data into its College Scorecard consumer site and the Office of Federal Student Aid’s NextGen student tools. College Scorecard provides privacy-protected consumer access to data that otherwise would not be available to students and parents. Users can see aggregated data for colleges and universities on student income and debt (not otherwise publicly available) in an easy-to-use interface. College Scorecard also includes an API, fueling dissemination of college consumer data through other platforms such as Google.
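  • For readers who want to work with these data programmatically, a minimal sketch of a Scorecard API query appears below. It assumes the publicly documented api.data.gov endpoint and a registered API key; the specific fields, the school-name filter, and the placeholder key are illustrative assumptions rather than ED-endorsed parameters.

```python
# Minimal sketch: querying the College Scorecard API.
# Assumes the public api.data.gov endpoint and a registered key; field names are illustrative.
import requests

API_URL = "https://api.data.gov/ed/collegescorecard/v1/schools"
params = {
    "api_key": "YOUR_API_KEY",                 # obtain a free key from api.data.gov
    "school.name": "University of Michigan",   # simple name filter (hypothetical example)
    "fields": "school.name,latest.student.size,latest.cost.tuition.in_state",
    "per_page": 5,
}

response = requests.get(API_URL, params=params, timeout=30)
response.raise_for_status()

# When a "fields" list is supplied, results come back as flat records keyed by the dotted field names.
for school in response.json().get("results", []):
    print(school.get("school.name"),
          "| enrollment:", school.get("latest.student.size"),
          "| in-state tuition:", school.get("latest.cost.tuition.in_state"))
```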
  • In September 2019, ED established an agency-level Data Governance Body (DGB), chaired by the Chief Data Officer (CDO), with participation from relevant senior-level staff in agency business units. The DGB assists the CDO in assessing and adjudicating competing proposals aimed at achieving and measuring desirable Departmental data outcomes and priorities. Since its inception, the DGB has evaluated data maturity models, selected an assessment that blends CMMI and the Open Data Initiative models, conducted preliminary “discovery” conversations with all ED offices to identify data challenges and opportunities, and completed its first data maturity assessment in each office and for ED as a whole. Data maturity is a metric that will be measured and reported as part of ED’s Annual Performance Plan. Several of these activities have been supported by ED’s investment in a Data Governance Board and Data Governance Infrastructure (DGBDGI) contract.
  • The Information Resources Management Strategic Plan for FY19-FY23, released in December 2019, includes for the first time a strategic goal to “improve data management, enhance the use of data analytics, and promote transparency at the Department.” Strategic Objective 5.6.2 calls for ED to “advance data analytic capabilities for the Department,” and includes an initiative to “leverage new and emerging technologies to facilitate access to and use of Department data.”
  • IES continues to make available all data collected as part of its administrative data collections, sample surveys, and evaluation work. Its support of the Common Education Data Standards (CEDS) Initiative has helped to develop a common vocabulary, data model, and tool set for P-20 education data. The CEDS Open Source Community is active, providing a way for users to contribute to the standards development process.
5.4 Did the agency have policies and procedures to secure data and protect personal, confidential information? (Example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)
  • IES is collaborating with an outside research team to conduct a proof of concept for secure multiparty computation (MPC). The Department’s general approach is to replicate an existing data collection that involves securely sharing PII across a number of partners, using the MPC framework.
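  • For context, the sketch below illustrates additive secret sharing, one common building block of secure multiparty computation: each partner splits its private value into random shares so that an aggregate (here, a sum) can be computed without any partner revealing its underlying data. This is a generic, hypothetical illustration of the technique, not a description of IES’s actual proof-of-concept implementation.

```python
# Generic illustration of additive secret sharing (a common MPC building block).
# Not ED's implementation; the values and three-party setup are hypothetical.
import secrets

PRIME = 2**61 - 1  # all arithmetic is done modulo a large prime

def share(value, n_parties):
    """Split a private value into n random shares that sum to it mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

# Three partners each hold a private count they do not want to reveal.
private_counts = [1200, 850, 430]
n = len(private_counts)

# Each partner splits its count and distributes one share to every participant.
all_shares = [share(v, n) for v in private_counts]

# Each participant sums the shares it received; individually these sums reveal nothing.
partial_sums = [sum(all_shares[i][j] for i in range(n)) % PRIME for j in range(n)]

# Combining the partial sums reconstructs only the aggregate, never the individual inputs.
total = sum(partial_sums) % PRIME
print(total)  # 2480 == 1200 + 850 + 430
```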
  • The Disclosure Review Board (DRB), the EDFacts Governing Board, the Student Privacy Policy Office (SPPO), and SPPO’s Privacy Technical Assistance Center and Privacy Safeguards Team all help to ensure the quality and privacy of education data. In FY19, the ED Data Strategy Team also published a user resource guide for staff on disclosure avoidance considerations throughout the data lifecycle.
  • In FY20, the ED DRB approved 27 releases (as of June 2020) by issuing “Safe to Release” memos. The DRB is in the process of developing a revised Charter that outlines its authority, scope, membership, process for dispute resolution, and how it will work with other DRBs in ED. The DRB is also developing standard operating procedures outlining the types of releases that need to be reviewed along with the submission and review process for data releases. The DRB is currently planning to develop information sessions to build the capacity of ED staff focusing on such topics as disclosure avoidance techniques used at ED, techniques appropriate for administrative and survey data, and how to communicate with stakeholders about privacy and disclosure avoidance. 
  • In ED’s FY18-22 Performance Plan, Strategic Objective 3.2 is to “Improve privacy protections for, and transparency of, education data both at ED and in the education community.” The plan also outlines actions taken in FY18. ED’s Student Privacy website assists stakeholders in protecting student privacy by providing official guidance on FERPA, technical best practices, and the answers to Frequently Asked Questions.
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
  • ED’s forthcoming Open Data Platform will make Department data easily accessible to the public. Data will be machine-readable and searchable by keyword in order to promote easy access to relevant data assets. In addition, the ODP features an API so that aggregators and developers can leverage Department data to provide information and tools for families, policy makers, researchers, developers, advocates and other stakeholders. ODP will ultimately include listings of non-public, restricted data with links to information on the privacy-protective process for requesting restricted-use access to these data.
  • ED’s Privacy Technical Assistance Center (PTAC) responds to technical assistance inquiries on student privacy issues and provides online FERPA training to state and school district officials. FSA conducted a postsecondary institution breach response assessment to determine the extent of a potential breach and provide the institutions with remediation actions around their protection of FSA data and best practices associated with cybersecurity.
  • The National Center for Education Statistics (NCES) provides free online training on using its data tools to analyze data while protecting privacy. Distance Learning Dataset Training includes modules on NCES’s data-protective analysis tools, including QuickStats, PowerStats, and TrendStats. A full list of NCES data tools is available on their website.
  • The Institute of Education Sciences (IES) administers a restricted-use data licensing program to make detailed data available to researchers when needed for in-depth analysis and modeling. NCES loans restricted-use data only to qualified organizations in the United States. Individual researchers must apply through an organization (e.g., a university, a research institution, or a company). To qualify, an organization must provide a justification for access to the restricted-use data, submit the required legal documents, agree to keep the data safe from unauthorized disclosures at all times, and agree to participate fully in unannounced, unscheduled inspections of the researcher’s office to ensure compliance with the terms of the License and the Security Plan form.
Score
10
Common Evidence Standards / What Works Designations

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding purposes; did that framework prioritize rigorous research and evaluation methods; and did the agency disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY20? (Example: What Works Clearinghouses)

6.1 Did the agency have a common evidence framework for research and evaluation purposes?
  • ED has an agency-wide framework for evidence that is based on ratings of studies’ internal validity. ED evidence-building activities are designed to meet the highest standards of internal validity (typically randomized controlled trials) when causality must be established for policy development or program evaluation purposes. When random assignment is not feasible, rigorous quasi-experiments are conducted. The framework was developed and is maintained by IES’s What Works Clearinghouse™ (WWC).
  • Since 2002, ED, as part of its compliance with the Information Quality Act and OMB guidance, has required that all “research and evaluation information products documenting cause and effect relationships or evidence of effectiveness should meet the quality standards that will be developed as part of the What Works Clearinghouse” (see Information Quality Guidelines). Those standards, currently in their 4th version, are maintained on the WWC website. A stylized representation of the standards can be found here, along with information about how ED reports findings from research and evaluations that meet these standards.
6.2 Did the agency have a common evidence framework for funding decisions?
  • ED employs the same evidence standards in all discretionary grant competitions that use evidence to direct funds toward applicants proposing projects with evidence of effectiveness and/or projects that will build new evidence through evaluation. Those standards, as outlined in the Education Department General Administrative Regulations (EDGAR), build on ED’s What Works Clearinghouse™ (WWC) research design standards.
6.3 Did the agency have a user friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions? 
  • ED’s What Works Clearinghouse™ (WWC) identifies studies that provide valid and statistically significant evidence of effectiveness of a given practice, product, program, or policy (referred to as “interventions”), and disseminates summary information and reports on the WWC website.
  • As of April 2020, the WWC has reviewed more than 10,650 studies that are available in a searchable database. It has published more than 590 Intervention Reports, which synthesize evidence from multiple studies about the efficacy of specific products, programs, and policies. It has published 24 Practice Guides, which synthesize across products, programs, and policies to surface generalizable practices that can transform classroom practice and improve student outcomes.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
  • ED has several technical assistance programs designed to promote the use of evidence-based practices, most notably IES’s Regional Educational Laboratory Program and the Office of Elementary and Secondary Education’s Comprehensive Center Program. Both programs use research on evidence-based practices generated by the What Works Clearinghouse and other ED-funded Research and Development Centers to inform their work. RELs also conduct applied research and offer research-focused training, coaching, and technical support on behalf of their state and local stakeholders. Their work is reflected in Strategic Plan Objectives 1.4 and 2.2.
  • Often, those practices are highlighted in WWC Practice Guides, which are based on meta-analytic syntheses of existing research and augmented by the experience of practitioners. These guides are designed to address challenges in classrooms and schools. The WWC is currently developing five new Practice Guides for release in FY21.
  • To ensure continuous improvement of the kind of TA work undertaken by the RELs and Comprehensive Centers, ED has invested in both independent evaluation and grant-funded research. The REL Program is currently undergoing evaluation, and design work for the next Comprehensive Center evaluation is underway. In addition, IES has awarded two grants to study and promote knowledge utilization in education, including the Center for Research Use in Education and the National Center for Research in Policy and Practice. In June 2020, IES released a report on How States and Districts Support Evidence Use in School Improvement, which may be of value to technical assistance providers and SEA and LEA staff in improving the adoption and implementation of evidence-based practice.
  • Finally, the Evidence Leadership Group has coordinated the development of revised evidence definitions and related selection criteria for competitive programs that align with ESSA to streamline and clarify provisions for grantees. These revised definitions align with ED’s suggested criteria for states’ implementation of ESSA’s four evidence levels, included in ED’s non-regulatory guidance, Using Evidence to Strengthen Education Investments. ED also developed a fact sheet to support internal and external stakeholders in understanding the revised evidence definitions. This document has been shared with internal and external stakeholders through multiple methods, including the Office of Elementary and Secondary Education ESSA technical assistance page for grantees.
Score
6
Innovation

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY20? (Examples: Prizes and challenges; behavioral science trials; innovation labs/accelerators; performance partnership pilots; demonstration projects or waivers with rigorous evaluation requirements) 

7.1 Did the agency engage leadership and staff in its innovation efforts?
  • In FY19, the Office of Elementary and Secondary Education (OESE) made strategic investments in innovative educational programs and practices and administered discretionary grant programs; the Innovation and Improvement account received $1.035 billion that year. ED reorganized in 2019, consolidating the Office of Innovation and Improvement into the Office of Elementary and Secondary Education. To lead and support innovation within the reorganized OESE, ED created the Evidence-Based Practices (EBP) team, which works within OESE and with colleagues across the agency to develop and expand efforts to inform policy and improve program practices.
  • In the reorganization that created the Office of the Chief Data Officer, ED leadership established a unit focused explicitly on Innovation and Engagement. These staff focus on innovation in data infrastructure and use, leading the development of the Open Data Platform using agile methodology. Innovations are discussed and disseminated for use at monthly meetings of the Data Strategy Team (DST), consisting of data professionals from across ED. Recent DST meetings have included presentations on the Federal Student Aid data warehouse, introductions to Tableau and PowerBI, and a demonstration of the Open Data Platform detailing the role of office data stewards.
 7.2 Did the agency have policies, processes, structures, or programs to promote innovation to improve the impact of its programs?
  • To lead and support innovation within the reorganized OESE, ED created a new component: the Evidence-Based Practices (EBP) team. EBP is tasked with promoting evidence consistent with relevant provisions of the Elementary and Secondary Education Act (ESEA) as amended by the Every Student Succeeds Act (ESSA). EBP includes two units—one to support the many discretionary grant programs in OESE and one to support the formula grant programs and any discretionary grant programs associated with them. EBP works to advance an evidence-based grantmaking agenda and seeks to operationalize the ESSA evidence framework for strengthening the effectiveness of ESEA investments within OESE programs by: 1) leading OESE policy development and serving as consultants to grant programs on program design and implementation using evidence, data, trends, field experiences, and stakeholder input; 2) assessing and improving evidence-based grant-making processes and decision-making to drive results and outcomes aligned with strategic goals; and 3) identifying and disseminating promising and evidence-based practices by convening practitioners and producing practitioner-friendly resources.
  • The Education Innovation and Research (EIR) program is ED’s primary innovation program for K–12 public education. EIR grants are focused on validating and scaling evidence-based practices and encouraging innovative approaches to persistent challenges. The EIR program incorporates a tiered-evidence framework that supports larger awards for projects with the strongest evidence base as well as promising earlier-stage projects that are willing to undergo rigorous evaluation. Lessons learned from the EIR program have been shared across the agency and have informed policy approaches in other programs.
7.3 Did the agency evaluate its innovation efforts, including using rigorous methods?
  • ED is currently implementing the Experimental Sites Initiative to assess the effects of statutory and regulatory flexibility for participating institutions disbursing Title IV student aid. ED collects performance data from all participating institutions, and IES is currently conducting rigorous evaluations of selected Experimental Sites, including two related to short-term Pell grants.
  • The Education Innovation and Research (EIR) program, ED’s primary innovation program for K–12 public education, incorporates a tiered-evidence framework that supports larger awards for projects with the strongest evidence base as well as promising earlier-stage projects that are willing to undergo rigorous evaluation.
Score
13
Use of Evidence in Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its competitive grant programs in FY20? (Examples: Tiered-evidence frameworks; evidence-based funding set-asides; priority preference points or other preference scoring for evidence; Pay for Success provisions)

8.1 What were the agency’s five largest competitive programs and their appropriations amount (and were city, county, and/or state governments eligible to receive funds from these programs)?
  • ED’s top five program accounts based on actual appropriation amounts in FY20 are:
    1. TRIO ($1.96 billion; eligible grantees: institutions of higher education, public and private organizations);
    2. Charter Schools Program ($440 million; eligible grantees: local charter schools);
    3. GEAR UP ($365 million; eligible grantees: state agencies; partnerships that include IHEs and LEAs);
    4. Teacher and School Leader Incentive Program (TSL) ($200 million; eligible grantees: local education agencies, partnerships between state and local education agencies); and
    5. Comprehensive Literacy Development Grants ($192 million; eligible grantees: state education agencies).
8.2 Did the agency use evidence of effectiveness to allocate funds in five largest competitive grant programs? (e.g., Were evidence-based interventions/practices required or suggested? Was evidence a significant requirement?)
  • ED uses evidence of effectiveness when making awards in its largest competitive grant programs.
    • The vast majority of TRIO funding in FY20 was used to support continuation awards to grantees that were successful in prior competitions that awarded competitive preference priority points for projects that proposed strategies supported by moderate evidence of effectiveness. Within the TRIO program, ED will make new awards under Student Support Services. That competition provides points for applicants that propose a project with a key component in its logic model that is informed by research or evaluation findings that suggest it is likely to improve relevant outcomes.
    • Under the Charter Schools Program, ED generally requires or encourages applicants to support their projects through logic models – however, applicants are not expected to develop their applications based on rigorous evidence. Within the CSP program, the Grants to Charter School Management Organizations for the Replication and Expansion of High-Quality Charter Schools (CMO Grants) supports charter schools with a previous track record of success.
    • For the 2019 competition for GEAR UP State awards, ED used a competitive preference priority for projects implementing activities that are supported by promising evidence of effectiveness. FY20 funds are supporting continuation awards.
    • The TSL statute requires applicants to provide a description of the rationale for their project and describe how the proposed activities are evidence-based, and grantees are held to these standards in the implementation of the program.
    • The Comprehensive Literacy Development (CLD) statute requires that grantees provide subgrants to local educational agencies that conduct evidence-based literacy interventions. ESSA requires ED to give priority to applicants demonstrating strong, moderate, or promising levels of evidence.
8.3 Did the agency use its five largest competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations) 
  • The Evidence Leadership Group (ELG) advises program offices on ways to incorporate evidence in grant programs through encouraging or requiring applicants to propose projects that are based on research and by encouraging applicants to design evaluations for their proposed projects that would build new evidence.
  • ED’s grant programs require some form of an evaluation report on a yearly basis to build evidence, demonstrate performance improvement, and account for the utilization of funds. For examples, please see the annual performance reports of TRIO, the Charter Schools Program, and GEAR UP. The Teacher and School Leader Incentive Program is required by ESSA to conduct a national evaluation. The Comprehensive Literacy Development Grant requires evaluation reports. In addition, IES is currently conducting rigorous evaluations to identify successful practices in TRIO-Educational Opportunities Centers and GEAR UP. In FY19, IES released a rigorous evaluation of practices embedded within TRIO-Upward Bound that examined the impact of enhanced college advising practices on students’ pathway to college.
8.4 Did the agency use evidence of effectiveness to allocate funds in any other competitive grant programs in FY20 (besides its five largest grant programs)?
  • The Education Innovation and Research (EIR) program supports the creation, development, implementation, replication, and taking to scale of entrepreneurial, evidence-based, field-initiated innovations designed to improve student achievement and attainment for high-need students. The program uses three evidence tiers to allocate funds based on evidence of effectiveness, with larger awards given to applicants who can demonstrate stronger levels of prior evidence and produce stronger evidence of effectiveness through a rigorous, independent evaluation. The FY19 competition included checklists and PowerPoint presentations to help applicants clearly understand the evidence requirements.
  • ED incorporates the evidence standards established in EDGAR as priorities and selection criteria in many competitive grant programs.
8.5 What are the agency’s 1-2 strongest examples of how competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?  
  • The Education Innovation and Research (EIR) program supports the creation, development, implementation, replication, and scaling up of evidence-based, field-initiated innovations designed to improve student achievement and attainment for high-need students. IES released The Investing in Innovation Fund: Summary of 67 Evaluations, which can be used to inform efforts to move to more effective practices. ED is exploring the results to determine what lessons learned can be applied to other programs.
8.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • In 2016, ED released non-regulatory guidance to provide state educational agencies, local educational agencies (LEAs), schools, educators, and partner organizations with information to assist them in selecting and using “evidence-based” activities, strategies, and interventions, as defined by ESSA, including carrying out evaluations to “examine and reflect” on how interventions are working. However, the guidance does not specify that federal competitive funds can be used to conduct such evaluations. Frequently, though, programs do include a requirement to evaluate the grant during and after the project period.
Score
7
Use of Evidence in Non-Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its non-competitive grant programs in FY20? (Examples: Evidence-based funding set-asides; requirements to invest funds in evidence-based activities; Pay for Success provisions)

9.1 What were the agency’s five largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in largest five non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • ED worked with Congress in FY16 to ensure that evidence played a major role in ED’s large non-competitive grant programs in the reauthorized ESEA. As a result, section 1003 of ESSA requires states to set aside at least 7% of their Title I, Part A funds for a range of activities to help school districts improve low-performing schools. School districts and individual schools are required to create action plans that include “evidence-based” interventions that demonstrate strong, moderate, or promising levels of evidence.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • ESEA requires a National Assessment of Title I (Improving the Academic Achievement of the Disadvantaged). In addition, Title I Grants require state education agencies to report on school performance, including those schools identified for comprehensive or targeted support and improvement.
  • Federal law (ESEA) requires states receiving funds from 21st Century Community Learning Centers to “evaluate the effectiveness of programs and activities” that are carried out with federal funds (section 4203(a)(14)), and it requires local recipients of those funds to conduct periodic evaluations in conjunction with the state evaluation (section 4205(b)).
  • The Office of Special Education Programs (OSEP), the implementing office for IDEA grants to states, has revised its accountability system to shift the balance from a system focused primarily on compliance to one that puts more emphasis on results through the use of Results Driven Accountability.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs in FY20 (besides its five largest grant programs)?
  • Section 4108 of ESEA authorizes school districts to invest “safe and healthy students” funds in Pay for Success initiatives. Section 1424 of ESEA authorizes school districts to invest their Title I, Part D funds (Prevention and Intervention Programs for Children and Youth Who are Neglected, Delinquent, or At-Risk) in Pay for Success initiatives; under section 1415 of the same program, a State agency may use funds for Pay for Success initiatives.
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • States and school districts are beginning to implement the requirements in Title I of the ESEA regarding using evidence-based interventions in school improvement plans. Some States are providing training or practice guides to help schools and districts identify evidence-based practices.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • In 2016, ED released non-regulatory guidance to provide state educational agencies, local educational agencies (LEAs), schools, educators, and partner organizations with information to assist them in selecting and using “evidence-based” activities, strategies, and interventions, as defined by ESSA, including carrying out evaluations to “examine and reflect” on how interventions are working. However, the guidance does not specify that federal non-competitive funds can be used to conduct such evaluations.
Score
5
Repurpose for Results

In FY20, did the agency shift funds away from or within any practice, policy, or program that consistently failed to achieve desired outcomes? (Examples: Requiring low-performing grantees to re-compete for funding; removing ineffective interventions from allowable use of grant funds; incentivizing or urging grant applicants to stop using ineffective practices in funding announcements; proposing the elimination of ineffective programs through annual budget requests; incentivizing well-designed trials to fill specific knowledge gaps; supporting low-performing grantees through mentoring, improvement plans, and other forms of assistance; using rigorous evaluation results to shift funds away from a program)

10.1 Did the agency have policy(ies) for determining when to shift funds away from grantees, practices, policies, interventions, and/or programs that consistently failed to achieve desired outcomes, and did the agency act on that policy?
  • The Education Department General Administrative Regulations (EDGAR) explains that ED considers whether grantees make “substantial progress” when deciding whether to continue grant awards. In deciding whether a grantee has made substantial progress, ED may consider grantee performance data, among other information. If a continuation award is reduced, more funding may be made available for other applicants, grantees, or activities.
10.2 Did the agency identify and provide support to agency programs or grantees that failed to achieve desired outcomes?
  • The Department provides a variety of technical assistance to help grantees improve outcomes. Department staff work with grantees to assess their progress and, when needed, provide technical assistance to support program improvement. On a national scale, the Comprehensive Centers program, Regional Educational Laboratories, and technical assistance centers managed by the Office of Special Education Programs develop resources and provide technical assistance. The Department uses a tiered approach in these efforts: universal technical assistance delivered through a general dissemination strategy; targeted technical assistance that addresses issues common to a number of grantees; and intensive technical assistance focused on specific issues faced by specific recipients. The Department also supports program-specific technical assistance for a variety of individual grant programs.
