2020 Federal Standard of Excellence


Common Evidence Standards / What Works Designations

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding purposes; did that framework prioritize rigorous research and evaluation methods; and did the agency disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY20? (Example: What Works Clearinghouses)

Score
8
Administration for Children and Families (HHS)
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
  • ACF has established a common evidence framework adapted for the human services context from the framework for education research developed by the U.S. Department of Education and the National Science Foundation. The ACF framework, which includes the six types of studies delineated in the ED/NSF framework, aims to (1) inform ACF’s investments in research and evaluation and (2) clarify expectations for potential grantees and others about different types of studies.
6.2 Did the agency have a common evidence framework for funding decisions?
  • While ACF does not have a common evidence framework across all funding decisions, certain programs do use a common evidence framework for funding decisions. For example:
    • The Family First Prevention Services Act (FFPSA) enables states to use funds for certain evidence-based services. In April 2019, ACF published the Prevention Services Clearinghouse Handbook of Standards and Procedures, which provides a detailed description of the standards used to identify and review programs and services in order to rate programs and services as promising, supported, and well-supported practices. 
    • The Personal Responsibility Education Program Competitive Grants were funded to replicate effective, evidence-based program models or substantially incorporate elements of projects that have been proven to delay sexual activity, increase condom or contraceptive use for sexually active youth, and/or reduce pregnancy among youth. Through a systematic evidence review, HHS selected 44 models that grantees could use, depending on the needs and age of the target population of each funded project.
6.3 Did the agency have a user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.), including information on what works where, for whom, and under what conditions?
  • ACF sponsors several user-friendly tools that disseminate and promote evidence-based interventions. Several evidence reviews of human services interventions rate the quality of evaluation studies and present results in a user-friendly, searchable format; reviews to date have covered teen pregnancy prevention, home visiting, marriage education and responsible fatherhood, and employment and training, and include both ACF-sponsored and other studies. ACF has also developed two new websites that disseminate information on rigorously evaluated, evidence-based solutions:
    • The Pathways to Work Evidence Clearinghouse is a user-friendly website that reports on “projects that used a proven approach or a promising approach in moving welfare recipients into work, based on independent, rigorous evaluations of the projects.”
    • ACF’s Title IV-E Prevention Services Clearinghouse project launched a website in June 2019 that is easily accessible and searchable, allowing users to navigate the site and find information about mental health and substance abuse prevention and treatment services, in-home parent skill-based programs, and kinship navigator services designated as “promising,” “supported,” and “well-supported” practices by an independent systematic review.
  • Additionally, most ACF research and evaluation projects produce and widely disseminate short briefs, tip sheets, or infographics that capture high-level findings from the studies and make information about program services, participants, and implementation more accessible to policymakers, practitioners, and other stakeholders. For example, the Pathways for Advancing Careers and Education (PACE) project released a series of nine short briefs to accompany the implementation and early impact reports that were released for each of the nine PACE evaluation sites.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
  • ACF’s evaluation policy states that it is important for evaluators to disseminate research findings in ways that are accessible and useful to policymakers and practitioners, and that OPRE and program offices will work in partnership to inform potential applicants, program providers, administrators, policymakers, and funders by disseminating evidence from ACF-sponsored and other good-quality evaluations. OPRE research contracts include a standard clause requiring contractors to develop a dissemination plan during early project planning to identify key takeaways, target audiences, and strategies for most effectively reaching those audiences. OPRE’s dissemination strategy is also supported by a commitment to plain language; OPRE works with its research partners to ensure that evaluation findings and other evidence are clearly communicated. OPRE also has a robust dissemination function that includes the OPRE website, an OPRE e-newsletter, and a social media presence on Facebook, Twitter, Instagram, and LinkedIn.
  • OPRE biennially hosts two major conferences, the Research and Evaluation Conference on Self-Sufficiency (RECS) and the National Research Conference on Early Childhood (NRCEC) to share research findings with researchers and with program administrators and policymakers at all levels. OPRE also convenes the Network of Infant and Toddler Researchers (NITR) which brings together applied researchers with policymakers and technical assistance providers to encourage research-informed practice and practice-informed research; and the Child Care and Early Education Policy Research Consortium (CCEEPRC) which brings together researchers, policymakers, and practitioners to discuss what we are learning from research that can help inform policy decisions for ACF, States, Territories, localities, and grantees and to consider the next steps in early care and education (ECE) research. In light of COVID-19, OPRE plans to convene the Network, Consortium, and RECS and NRCEC conferences virtually in 2020. 
  • The Children’s Bureau (CB) sponsors the recurring National Child Welfare Evaluation Summit to bring together partners from child welfare systems and the research community to strengthen the use of data and evaluation in child welfare; disseminate information about effective and promising prevention and child welfare services, programs, and policies; and promote the use of data and evaluation to support sound decision-making and improved practice in state and local child welfare systems.
  • ACF also sponsors several:
Score
5
Administration for Community Living (HHS)
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
  • ACL defines evidence-based programs on its website. ACL’s National Institute on Disability, Independent Living, and Rehabilitation Research (NIDILRR) uses a stages of research framework (SORF) to classify and describe its funded grants and the research projects within those grants. Rigorous evaluation methods are applied where appropriate. The four stages of SORF are: exploration and discovery, intervention development, intervention efficacy, and scale-up evaluation. Using SORF, NIDILRR gains insight into what is known and unknown about a problem; whether it is time to develop interventions to address a particular problem; whether it is time to test the efficacy of interventions; and whether it is time to scale up interventions for broader use.
6.2 Did the agency have a common evidence framework for funding decisions?
  • The Older Americans Act requires the use of evidence-based programming in Title III-D-funded activities: Disease Prevention and Health Promotion Services. In response, ACL developed a definition of the term evidence-based, and created a website containing links to a range of resources for evidence-based programs. This is a common evidence framework used for Older Americans Act funded activities. For programs that are not legislatively required to use evidence-based models, through its funding process ACL requires all programs to provide clear justification and evidence (where available) that proposed projects will achieve their stated outcomes.
6.3 Did the agency have a user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.), including information on what works where, for whom, and under what conditions?
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
  • ACL works through its resource centers to help grantees use evidence to drive improvements in outcomes for older adults and individuals with disabilities. For example:
    • With funding from ACL, the National Resource Center at the National Council on Aging (NCOA), in collaboration with the Evidence-Based Leadership Council, led an innovative vetting process to increase the number of programs available to ACL’s aging network that meet the Title III-D evidence-based criteria. This process resulted in adding six new health promotion programs and three new falls prevention programs.
    • The Alzheimer’s Disease Supportive Services Program (ADSSP) funds competitive grants to expand the availability of evidence-based services that support persons with Alzheimer’s disease and related dementias (ADRD) and their family caregivers.
    • Extensive evaluation of the National Chronic Disease Self-Management Education (CDSME) and Falls Prevention database generated important insights that help potential ACL applicants prepare their applications using data-driven estimation procedures for participant and completion targets (see Guidance for Administration for Community Living 2019 Chronic Disease Self-Management Education Grant Applicants: Considerations for Estimating Participation and Completion Targets).
    • ACL also funded several grants, such as the Lifespan Respite Care Program: State Program Enhancement Grants and the Disability and Rehabilitation Research Projects (DRRP) Program: Chronic Disease Management for People with Traumatic Brain Injury (TBI), which are designed, in part, to develop an evidence base for respite care and related services and to build the evidence base on which people with TBI and their health care providers can draw for effective chronic disease management practices.
  • Starting in FY20, ACL is also conducting an evaluation of the fidelity with which ACL and its grantees under the Older Americans Act are implementing the required evidence-based programs. This will result in a report documenting the information collected and providing clear, actionable recommendations for ensuring the effective use of evidence-based programming. Recommendations will address what ACL, its grantees, and sub-grantees can do to improve the selection, implementation, and monitoring of evidence-based programming. The report will also include the development of a standardized tool for use by ACL and its OAA state grantees that assesses evidence-based program fidelity. This tool will greatly enhance ACL’s ability to ensure that evidence-based practices are used in the field.
Score
5
U.S. Agency for International Development
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
  • USAID is developing an agency-level evidence framework to clarify evidence standards for different decisions, including those related to funding. 
  • USAID’s evidence standards are embedded within its policies and include requirements for the use of evidence in strategic planning, project design, activity design, program monitoring, and evaluation. USAID has a Scientific Research Policy that sets out quality standards for research across the Agency. USAID’s Program Cycle Policy requires the use of evidence and data to assess the development context, challenges, potential solutions, and opportunities in all of USAID’s country strategies. Specific programs, such as Development Innovation Ventures (DIV), use evaluation criteria related to evidence of cost-effectiveness and ability to scale when making funding decisions to test and scale innovations. As USAID’s flagship open innovation program, DIV helps test and scale creative solutions to any global development challenge; by investing in proven, breakthrough innovations backed by rigorous evidence and ongoing monitoring, DIV has demonstrated impact on millions of lives at a fraction of the usual cost.
  • GAO found in its December 2019 report, Evidence-Based Policymaking: Selected Agencies Coordinate Activities, but Could Enhance Collaboration, that USAID reflects leading practices for collaborating when building and assessing evidence.
6.2 Did the agency have a common evidence framework for funding decisions?
  • USAID is developing an agency-level evidence framework to clarify evidence standards for different decisions, including those related to funding. In addition, specific types of programs at the sub-agency level do use evidence frameworks or standards to make funding decisions.
  • Development Innovation Ventures (DIV) uses a tiered funding system to test and scale evidence-based innovations, making funding decisions based on its evaluation criteria: evaluation and impact; cost-effectiveness; evidence and evaluation; implementation; sustainability and pathway to scale; and project team (see page 6 in DIV’s most recent Annual Program Statement for the evaluation criteria). DIV’s expectations vary by stage, but every awardee must report against a set of pre-negotiated key performance indicators and nearly all grants are structured in a pay-for-performance model.
  • For large scale Stage 2 DIV grants of $500,000 or more, DIV requires evidence of impact that must be causal and rigorous – the grantee must either have rigorous underlying evidence already established, use this funding to run an evaluation with an evaluation partner, or run an evaluation with its own funding during the grant period. There must be significant demonstrated demand for the innovation.
6.3 Did the agency have a user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.), including information on what works where, for whom, and under what conditions?
  • USAID has an Agency-wide repository for development information (including evaluation reports and other studies), available to the public through the Development Experience Clearinghouse. In addition, USAID uses the International Initiative for Impact Evaluation (3ie) database of impact evaluations relevant to development topics (including over 4,500 entries to date), as well as 3ie’s knowledge gap maps and systematic reviews, which pull the most rigorous evidence and data from across international development donors. 3ie also houses a collection of institutional policies and reports that examine findings from its database of impact evaluations on overarching policy questions to help policymakers and development practitioners improve development impact through better evidence.
  • USAID’s Agency Programs and Functions policy designates technical bureaus as the repositories for the latest information in the sectors they oversee; they are responsible for prioritizing evidence needs, taking actions to build evidence, and disseminating that evidence throughout the Agency. Several USAID bureaus and sectors have created user-friendly tools to disseminate information on evidence-based solutions. These include, but are not limited to:
  • Finally, USAID led a data-harmony initiative across the industry and with other countries, the Global Innovation Exchange, which surfaces, validates, and shares a repository of over 16,000 development-relevant solutions across all actors, players, and locations.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
  • USAID’s approach to Collaborating, Learning, and Adapting (CLA) helps ensure that evidence from evaluation of USAID programming is shared with and used by staff, partners, and stakeholders in the field. USAID requires a dissemination plan and post-evaluation action plan for each evaluation, and USAID field staff are encouraged to co-create evaluation action plans with key stakeholders based on evaluation evidence. USAID collects examples through the CLA Case Competition, held annually, which recognizes implementers, stakeholders, and USAID staff for their work generating and sharing technical evidence and learning from monitoring and evaluation. It is another way that the Agency encourages evidence-based practices among its stakeholders.
  • USAID also periodically holds large learning events with partners and others in the development community around evidence including, but not limited to, Evaluation Summits, engagement around the Self-Reliance Learning Agenda, and Moving the Needle. These gatherings are designed to build interest in USAID’s evidence, build capacity around applying that evidence and learning, and elicit evidence and learning contributions.
  • USAID created and led the “Million Lives Club” coalition with more than 30 partners, which has identified more than 100 social entrepreneurs with at least a million customers each, in order to share what this successful cohort has learned and better inform how USAID funding can help more social entrepreneurs grow successfully and rapidly. This unique learning platform brings donors, funders, governments, and the entrepreneurial community to the table together to learn and iterate on approaches.
Score
7
AmeriCorps
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
  • AmeriCorps uses the same standard scientific research methods and designs for all of its studies and evaluations, following the model used by clearinghouses such as the Department of Education’s What Works Clearinghouse, the Department of Labor’s Clearinghouse for Labor Evaluation and Research, and the Department of Health and Human Services’ Home Visiting Evidence of Effectiveness project.
6.2 Did the agency have a common evidence framework for funding decisions?
  • AmeriCorps has a common evidence framework for funding decisions in the Senior Corps and AmeriCorps State and National programs. This framework, which is articulated in the AmeriCorps State and National program notice of funding, includes the following evidence levels: pre-preliminary, preliminary, moderate, and strong.  
6.3 Did the agency have a user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.), including information on what works where, for whom, and under what conditions?
  • The AmeriCorps Evidence Exchange is a virtual repository of reports and resources intended to help AmeriCorps grantees and other interested stakeholders find information about evidence- and research-based national service programs. Examples of the types of resources available in the Evidence Exchange include research briefs that describe the core components of effective interventions, such as those in the areas of education, economic opportunity, and health.
  • R&E also creates campaigns and derivative products to distill complex report findings and increase their utility for practitioners (for example, this brief on a study about the health benefits of Senior Corps). R&E has categorized reports according to their research design, so that users can easily search for experimental, quasi-experimental, or non-experimental studies, and those that qualify for strong, moderate, or preliminary evidence levels.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
  • AmeriCorps has an agency-wide approach to promoting the use of evidence-based practices in the field, employing a variety of strategies that include evidence briefs, broad-based support to national service organizations, and targeted technical assistance to grantees:
    • R&E creates campaigns and derivative products to distill complex report findings and increase their utility for practitioners (for example, this brief on a study about the health benefits of Senior Corps).
    • AmeriCorps has created user-friendly research briefs that describe the core components of effective interventions in the areas of education, economic opportunity, and health. These briefs are designed to help grantees (and potential grantees) adopt evidence-based approaches.
    • R&E funds a contractor to provide AmeriCorps grantees with evaluation capacity-building support; R&E staff are also available to State Commissions for their evaluation questions and make resources (e.g., research briefs summarizing effective interventions, online evaluation planning and reporting curricula) available to them and the general public.
    • AmeriCorps funds and participates in grantee conferences that include specific sessions on how to incorporate evidence and data into national service programs.
    • As part of the AmeriCorps State and National FY20 application process, AmeriCorps provided technical assistance to grantees on using evidence-based practices through webinars and calls. R&E and AmeriCorps also conducted a process evaluation of grantees with varied replication experiences to produce a series of products designed to help grantees implement evidence-based interventions (including a forthcoming article in The Foundation Review).
    • Senior Corps continues to encourage and support its grantee organizations’ use of evidence-based programs, as identified by HHS’s Administration for Community Living.
Score
10
U.S. Department of Education
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
  • ED has an agency-wide framework for impact evaluations that is based on ratings of studies’ internal validity. ED evidence-building activities are designed to meet the highest standards of internal validity (typically randomized controlled trials) when causality must be established for policy development or program evaluation purposes. When random assignment is not feasible, rigorous quasi-experiments are conducted. The framework was developed and is maintained by IES’s What Works Clearinghouse™ (WWC). WWC standards are maintained on the WWC website. A stylized representation of the standards can be found here, along with information about how ED reports findings from research and evaluations that meet these standards.
  • Since 2002, ED—as part of its compliance with the Information Quality Act and OMB guidance—has required that all “research and evaluation information products documenting cause and effect relationships or evidence of effectiveness should meet the quality standards that will be developed as part of the What Works Clearinghouse” (see Information Quality Guidelines).
6.2 Did the agency have a common evidence framework for funding decisions?
  • ED employs the same evidence standards in all discretionary grant competitions that use evidence to direct funds to applicants proposing to implement projects that have evidence of effectiveness and/or to build new evidence through evaluation. Those standards, as outlined in the Education Department General Administrative Regulations (EDGAR), build on ED’s What Works Clearinghouse™ (WWC) research design standards.
6.3 Did the agency have a user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.), including information on what works where, for whom, and under what conditions?
  • ED’s What Works Clearinghouse™ (WWC) identifies studies that provide valid and statistically significant evidence of the effectiveness of a given practice, product, program, or policy (referred to as “interventions”) and disseminates summary information and reports on the WWC website.
  • As of April 2020, the WWC has reviewed more than 10,650 studies that are available in a searchable database. It has published more than 590 Intervention Reports, which synthesize evidence from multiple studies about the efficacy of specific products, programs, and policies. It has published 24 Practice Guides, which synthesize across products, programs, and policies to surface generalizable practices that can transform classroom practice and improve student outcomes.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
  • ED has several technical assistance programs designed to promote the use of evidence-based practices, most notably IES’s Regional Educational Laboratory Program and the Office of Elementary and Secondary Education’s Comprehensive Center Program. Both programs use research on evidence-based practices generated by the What Works Clearinghouse and other ED-funded Research and Development Centers to inform their work. RELs also conduct applied research and offer research-focused training, coaching, and technical support on behalf of their state and local stakeholders. Their work is reflected in Strategic Plan Objectives 1.4 and 2.2.
  • Often, those practices are highlighted in WWC Practice Guides, which are based on meta-analytic syntheses of existing research and augmented by the experience of practitioners. These guides are designed to address challenges in classrooms and schools. The WWC is currently developing five new Practice Guides for release in FY21.   
  • To ensure continuous improvement of the kind of TA work undertaken by the RELs and Comprehensive Centers, ED has invested in both independent evaluation and grant-funded research. The REL Program is currently undergoing evaluation, and design work for the next Comprehensive Center evaluation is underway. In addition, IES has awarded two grants to study and promote knowledge utilization in education: the Center for Research Use in Education and the National Center for Research in Policy and Practice. In June 2020, IES released a report on How States and Districts Support Evidence Use in School Improvement, which may be of value to technical assistance providers and SEA and LEA staff in improving the adoption and implementation of evidence-based practice.
  • Finally, the Evidence Leadership Group has coordinated the development of revised evidence definitions and related selection criteria for competitive programs that align with ESSA to streamline and clarify provisions for grantees. These revised definitions align with ED’s suggested criteria for states’ implementation of ESSA’s four evidence levels, included in ED’s non-regulatory guidance, Using Evidence to Strengthen Education Investments. ED also developed a fact sheet to support internal and external stakeholders in understanding the revised evidence definitions. This document has been shared with internal and external stakeholders through multiple methods, including the Office of Elementary and Secondary Education ESSA technical assistance page for grantees.
Score
3
U.S. Dept. of Housing & Urban Development
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
  • PD&R’s Program Evaluation Policy defines standards that prioritize rigorous methods for research and evaluation, covering impact evaluations; implementation and process evaluations; descriptive studies; outcome evaluations; formative evaluations; and both qualitative and quantitative approaches. It also provides for dissemination of such evidence to stakeholders in a timely fashion.
6.2 Did the agency have a common evidence framework for funding decisions?
  • HUD seeks to employ tiered evidence in funding decisions by embedding implementation and impact evaluations in funding requests for program initiatives, including major program demonstrations that employ random assignment methods. These include the Moving To Work Expansion demonstration, the Rental Assistance Demonstration, the Rent Reform Demonstration, the Family Self-Sufficiency Demonstration, the Housing Counseling Demonstration, and the Family Options Demonstration. Such trials provide robust evidence to inform scale-up funding decisions. 
  • HUD extended its standardized data collection and reporting framework, Standards for Success, to additional discretionary grant programs in FY19. The framework consists of a repository of data elements that participating programs use in their grant reporting, creating common definitions and measures across programs for greater analysis and coordination of services.
6.3 Did the agency have a user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.), including information on what works where, for whom, and under what conditions?
  • HUD provides resources and assistance to support community partners in evidence-based practice through the HUD Exchange web portal and through Community Compass technical assistance. PD&R provides the public, policymakers, and practitioners with evidence of what works through the Regulatory Barriers Clearinghouse and HUD USER, which is a portal and web store for program evaluations, case studies, and policy analysis and research. The evaluations of major program demonstrations provide rigorous evidence about effect sizes and variations in effects between key subgroups.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
  • HUD provides resources and assistance to support community partners in evidence-based practice through the HUD Exchange web portal and through technical assistance. PD&R provides the public, policymakers, and practitioners with evidence of what works primarily through HUD USER, a portal and web store for program evaluations, case studies, and policy analysis and research; the Regulatory Barriers Clearinghouse; and through initiatives such as Innovation of the Day, Sustainable Construction Methods in Indian Country, and the Consumer’s Guide to Energy-Efficient and Healthy Homes. This content is designed to provide current policy information, elevate effective practices, and synthesize data and other evidence in accessible formats such as Evidence Matters. Through these resources, researchers and practitioners can see the full breadth of work on a given topic (e.g., rigorous established evidence, case studies of what has worked in the field, and new innovations currently being explored) to inform their work.
  • Community Compass technical assistance for urban, rural, and tribal partners is designed to facilitate understanding of community and housing development issues in a way that cuts across program silos. It supports them in evaluation, evidence-building, integrating knowledge management principles, and sharing practices. 
Score
9
U.S. Department of Labor
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
  • DOL’s Clearinghouse for Labor Evaluation and Research (CLEAR) maintains evidence guidelines that describe quality standards for different types of studies. These guidelines are applied to all independent evaluations, including all third-party evaluations of DOL programs determined eligible for CLEAR’s evidence reviews across different topic areas. Requests for proposals also indicate that CLEAR standards should be applied to all Chief Evaluation Office (CEO) evaluations when considering which designs are the most rigorous and appropriate for answering specific research questions.
  • In addition, the DOL Evaluation Policy sets out principles and standards for evaluation planning and dissemination. DOL also collaborates with other agencies (the U.S. Department of Health and Human Services (HHS), the U.S. Department of Education’s Institute of Education Sciences (IES), the National Science Foundation (NSF), and the Corporation for National and Community Service (CNCS)) to develop technological procedures to link and share reviews across clearinghouses.
6.2 Did the agency have a common evidence framework for funding decisions?
  • DOL uses the CLEAR evidence guidelines and standards to make decisions about discretionary program grants awarded using evidence-informed or evidence-based criteria. The published guidelines and standards are used to identify evidence-based programs and practices, to review studies and assess the strength of their causal evidence, and to conduct structured evidence reviews in a particular topic area or timeframe that help inform agencies about which strategies appear promising and where gaps exist.
6.3 Did the agency have a user friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
  • DOL’s CLEAR is an online evidence clearinghouse. CLEAR’s goal is to make research on labor topics more accessible to practitioners, policymakers, researchers, and the public more broadly, so that it can inform their decisions about labor policies and programs. CLEAR identifies and summarizes many types of research, including descriptive statistical studies and outcome analyses, implementation studies, and causal impact studies. For causal impact studies, CLEAR assesses the strength of the design and methodology in studies that look at the effectiveness of particular policies and programs. CLEAR’s study summaries and icons, found in each topic area, can help users quickly and easily understand what studies found and how much confidence to have in the results. 
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
  • DOL promotes the utilization of evidence-based practices in a variety of ways. For example, the Employment & Training Administration (ETA) maintains a user friendly technical assistance tool that promotes state and local service providers’ use of evidence-based interventions through Workforce System Strategies, a comprehensive database of over 1,000 profiles that summarize a wide range of findings from reports, studies, technical assistance tools and guides that support program administration and improvement. Additionally, recognizing that research over the past four decades has found subsidized on-the-job training strategies like apprenticeship to improve participants’ employment and earnings outcomes, DOL has awarded or announced several apprenticeship grant opportunities this fiscal year in addition to the State Apprenticeship Expansion Grants awarded in 2018. These include the ETA’s Scaling Apprenticeship Through Sector-Based Strategies and Apprenticeships: Closing the Skills Gap opportunities and the Women’s Bureau’s Women in Apprenticeship and Nontraditional Occupations grant program.
Score
6
Millennium Challenge Corporation
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
  • For each investment, MCC’s Economic Analysis (EA) division undertakes a Constraints Analysis to determine the binding constraints to economic growth in a country. To determine the individual projects in which MCC will invest in a given sector, MCC’s EA division combines root cause analysis with a cost-benefit analysis. The results of these analyses allow MCC to determine which investments will yield the greatest development impact and return on MCC’s investment. Every investment also has its own set of indicators as well as standard, agency-wide sector indicators for monitoring during the lifecycle of the investment and an evaluation plan for determining the results and impact of a given investment. MCC’s Policy for Monitoring and Evaluation details MCC’s evidence-based research and evaluation framework. Per the Policy, each completed evaluation requires a summary of findings, now called the Evaluation Brief, to summarize the key components, results, and lessons learned from the evaluation. Evidence from previous MCC programming is considered during the development of new programs. Per the Policy, “monitoring and evaluation evidence and processes should be of the highest practical quality. They should be as rigorous as practical and affordable. Evidence and practices should be impartial. The expertise and independence of evaluators and monitoring managers should result in credible evidence. Evaluation methods should be selected that best match the evaluation questions to be answered. Indicators should be limited in number to include the most crucial indicators. Both successes and failures must be reported.”
6.2 Did the agency have a common evidence framework for funding decisions?
  • MCC uses a rigorous evidence framework to make every decision along the investment chain, from country partner eligibility to sector selection to project choices. MCC uses evidence-based selection criteria, generated by independent, objective third parties, to select countries for grant awards. To be eligible for selection, World Bank-designated low- and lower-middle-income countries must first pass the MCC scorecard, a collection of 20 independent, third-party indicators that objectively measure a country’s policy performance in the areas of economic freedom, investing in people, and ruling justly. An in-depth description of the country selection procedure can be found in the annual report.
6.3 Did the agency have a user friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
  • All evaluation designs, data, reports, and summaries are made publicly available on MCC’s Evaluation Catalog, which includes evaluation information for every MCC program. Evaluation packages have a depth of information for each program including evaluation designs and questions, baseline data, surveys, questionnaires, microdata, interim reports, and final reports. To further the dissemination and use of MCC’s evaluations’ evidence and learning, the Agency publishes Evaluation Briefs, a new product to capture and disseminate the results and findings of its independent evaluation portfolio. An Evaluation Brief will be produced for each evaluation and offers a succinct, user-friendly, systematic format to better capture and share the relevant evidence and learning from MCC’s independent evaluations. These accessible products will take the place of MCC’s Summaries of Findings. Evaluation Briefs will be published on the Evaluation Catalog and will complement the many other products published for each evaluation. In FY20, MCC also began the process of re-designing the Evaluation Catalog into a new MCC Evidence Platform, in part to make MCC’s evaluation evidence and data easier to find and use. 
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
  • Using internal research and analysis to understand where and how its published evaluations, datasets, and knowledge products are utilized, MCC is embarking on a re-designed Evaluation Catalog, prioritizing evidence-building in key sectors, and continuing to refine and publish new evidence dissemination products. Under this comprehensive approach, Evaluation Briefs act as a cornerstone for promoting utilization across audience groups. Enhanced utilization of MCC’s vast evidence base and learning was a key impetus behind the creation and expansion of the Evaluation Briefs and Star Reports, two new MCC products. A push to ensure sector-level evidence use has led to renewed emphasis on the Principles into Practice series, with recent reports on the transport, education, and water & sanitation (forthcoming) sectors.
  • MCC has also enhanced its in-country evaluation dissemination events to ensure further results and evidence building with additional products in local languages and targeted stakeholder learning dissemination strategies.
Score
4
Substance Abuse and Mental Health Services Administration
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
  • There is great diversity across SAMHSA programming, ranging from community-level prevention activities to residential programs for pregnant and postpartum women with substance misuse issues. While this diversity allows SAMHSA to be responsive to a wide set of vulnerable populations, it limits the utility of a common evidence framework for the entire agency. Within Centers (the Center for Substance Abuse Prevention, the Center for Substance Abuse Treatment, and the Center for Mental Health Services), consistent evidence frameworks are in use and help to shape the process of grant-making (e.g., Center staff are familiar with the pertinent evidence base for their particular portfolios).
  • In 2011, based on the model of the National Quality Strategy, SAMHSA developed the National Behavioral Health Quality Framework (NBHQF). With the NBHQF, SAMHSA proposes a set of core measures to be used in a variety of settings and programs, as well as in evaluation and quality assurance efforts. The proposed measures are not intended to be a complete or total set of measures a payer, system, practitioner, or program may want to use to monitor the quality of its overall system or the care or activities it provides. SAMHSA encourages such entities to utilize these basic measures as appropriate as a consistent set of indicators of quality in behavioral health prevention, promotion, treatment, and recovery support efforts across the nation.
6.2 Did the agency have a common evidence framework for funding decisions?
  • SAMHSA has universal language about using evidence-based practices (EBPs) that is included in its Funding Opportunity Announcements (FOAs) (entitled Using Evidence-Based Practices (EBPs)). This language includes acknowledgement that, “EBPs have not been developed for all populations and/or service settings” thus encouraging applicants to “provide other forms of evidence” that a proposed practice is appropriate for the intended population. Specifically, the language states that applicants should: (1) document that the EBPs chosen are appropriate for intended outcomes; (2) explain how the practice meets SAMHSA’s goals for the grant program; (3) describe any modifications or adaptations needed for the practice to meet the goals of the project; (4) explain why the EBP was selected; (5) justify the use of multiple EBPs, if applicable; and (6) discuss training needs or plans to ensure successful implementation. Lastly, the language includes resources the applicant can use to understand EBPs. Federal grants officers work in collaboration with the SAMHSA Office of Financial Resources to ensure that grantee funding announcements clearly describe the evidence standard necessary to meet funding requirements.
  • SAMHSA developed a manual, Developing a Competitive SAMHSA Grant Application, which explains information applicants will likely need for each section of the grant application. The manual has two sections devoted to evidence-based practices (p. 8, p. 26), including: 1) A description of the EBPs applicants plan to implement; 2) Specific information about any modifications applicants plan to make to the EBPs and a justification for making them; and 3) How applicants plan to monitor the implementation of the EBPs. In addition, if applicants plan to implement services or practices that are not evidence-based, they must show that these services/practices are effective.
6.3 Did the agency have a user friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
  • Until 2018, SAMHSA regarded the National Registry of Evidence-based Programs and Practices (NREPP) as the primary online user-friendly tool for identifying evidence-based programs for grantee implementation. In January 2018, SAMHSA announced that it was “moving to EBP [evidence-based practice] implementation efforts through targeted technical assistance and training that makes use of local and national experts and will assist programs with actually implementing services….” NREPP was taken offline in August 2018. In August 2019, the Pew-MacArthur Results First Initiative announced it had restored users’ access to this information, which can be found in the Results First Clearinghouse Database. The Evidence-Based Practices Resource Center “provides communities, clinicians, policy-makers and others with the information and tools to incorporate evidence-based practices into their communities or clinical settings.” As of August 2020, the EBP Resource Center included 149 items.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
  • In April 2018, SAMHSA launched the Evidence-Based Practices Resource Center (Resource Center) that aims to provide communities, clinicians, policy-makers and others in the field with the information and tools they need to incorporate evidence-based practices into their communities or clinical settings. The Resource Center contains a collection of science-based resources, including Treatment Improvement Protocols, toolkits, resource guides, and clinical practice guidelines, for a broad range of audiences. As of August 2020, the Resource Center includes 149 items, including 15 data reports, 24 toolkits, 24 fact sheets, and 96 practice guides. 
  • The Mental Health Technology Transfer Center (MHTTC) Network works with organizations and treatment practitioners involved in the delivery of mental health services to strengthen their capacity to deliver effective evidence-based practices to individuals, including the full continuum of services spanning mental illness prevention, treatment, and recovery support. The State Targeted Response Technical Assistance (STR-TA), known as the Opioid Response Network, was created to support efforts to address opioid use disorder prevention, treatment, and recovery, and to provide education and training at the local level in evidence-based practices.
  • To date, SAMHSA has produced 11 Evidence-Based Practice Knowledge Informing Transformation (KIT) guides to help move the latest information available on effective behavioral health practices into community-based service delivery. The KITs contain information sheets, introductory videos, practice demonstration videos, and training manuals. Each KIT outlines the essential components of the evidence-based practice and provides suggestions collected from those who have successfully implemented them.