2020 Federal Standard of Excellence


Administration for Children and Families (HHS)

Score
9
Leadership

Did the agency have senior staff members with the authority, staff, and budget to build and use evidence to inform the agency’s major policy and program decisions in FY20?

1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer (or equivalent)? (Example: Evidence Act 313)
  • The Deputy Assistant Secretary for Planning, Research, and Evaluation serves as the Administration for Children and Families' Chief Evaluation Officer. The Deputy Assistant Secretary oversees ACF's Office of Planning, Research, and Evaluation (OPRE), which supports evaluation and other learning activities across the agency, and a research and evaluation budget of approximately $180 million in FY20. OPRE has 68 federal staff positions; OPRE staff are experts in research and evaluation methods and data analysis as well as ACF programs, policies, and the populations they serve. In August 2019, the Department of Health and Human Services’ (HHS) Assistant Secretary for Planning and Evaluation was named the Chief Evaluation Officer of HHS.
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s Chief Data Officer (or equivalent)? (Example: Evidence Act 202(e))
  • The HHS Chief Information Officer serves as the HHS Chief Data Officer. In August 2019, the HHS Chief Information Officer was named the acting Chief Data Officer of HHS. In September 2019, the Assistant Secretary for Children and Families designated the Deputy Assistant Secretary for Planning, Research, and Evaluation as the primary ACF member to serve on the HHS Data Council, the body responsible for advising the HHS Chief Data Officer on implementation of Evidence Act activities across HHS.
  • Additionally, in 2016, ACF established a new Division of Data and Improvement (DDI), providing federal leadership and resources to improve the quality, use, and sharing of ACF data. The Director of DDI reports to the Deputy Assistant Secretary for Planning, Research, and Evaluation and oversees work to improve the quality, usefulness, interoperability, and availability of data and to address issues related to privacy, data security, and data sharing. DDI has 12 federal staff positions and an FY20 budget of approximately $6.4M (not including salaries).
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, performance improvement officer, and other related officials in order to support, improve, and evaluate the agency’s major programs?
  • As of September 2019, ACF’s Deputy Assistant Secretary for Planning, Research, and Evaluation serves as the primary ACF representative to HHS’ Leadership Council, Data Council, and Evidence and Evaluation Council — the HHS bodies responsible for implementing Evidence Act activities across HHS. These cross-agency councils meet regularly to discuss agency-specific needs and experiences and to collaboratively develop guidance for department-wide action.
  • Within ACF, the 2016 reorganization that created the Division of Data and Improvement (DDI) endowed ACF’s Deputy Assistant Secretary for Planning, Research, and Evaluation with oversight of the agency’s strategic planning; performance measurement and management; research and evaluation; statistical policy and program analysis; synthesis and dissemination of research and evaluation findings; data quality, usefulness, and sharing; and application of emerging technologies to improve the effectiveness of programs and service delivery. ACF reviews program office performance measures and associated data three times per year in sync with the budget process; OPRE has traditionally worked with ACF program offices to develop research plans on an annual basis and has worked to integrate the development of program-specific learning agendas into this process. In addition, OPRE holds regular and ad hoc meetings with ACF program offices to discuss research and evaluation findings, as well as other data topics.
Score
8
Evaluation & Research

Did the agency have an evaluation policy, evaluation plan, and learning agenda (evidence-building plan), and did it publicly release the findings of all completed program evaluations in FY20?

2.1 Did the agency have an agency-wide evaluation policy? (Example: Evidence Act 313(d))
  • ACF’s evaluation policy confirms ACF’s commitment to conducting evaluations and using evidence from evaluations to inform policy and practice. ACF seeks to promote rigor, relevance, transparency, independence, and ethics in the conduct of evaluations. ACF established the policy in 2012 and published it in the Federal Register on August 29, 2014. In late 2019, ACF released a short video about the policy’s five principles and how ACF uses them to guide its work.
  • As ACF’s primary representative to the HHS Evidence and Evaluation Council, the ACF Deputy Assistant Secretary for Planning, Research, and Evaluation co-chairs the HHS Evaluation Policy Subcommittee—the body responsible for developing an HHS-wide evaluation policy.
2.2 Did the agency have an agency-wide evaluation plan? (Example: Evidence Act 312(b))
  • In accordance with OMB guidance, ACF is contributing to an HHS-wide evaluation plan. OPRE also annually identifies questions relevant to the programs and policies of ACF and proposes a research and evaluation spending plan to the Assistant Secretary for Children and Families. This plan focuses on activities that the Office of Planning, Research, and Evaluation plans to conduct during the following fiscal year.
2.3 Did the agency have a learning agenda (evidence-building plan) and did the learning agenda describe the agency’s process for engaging stakeholders including, but not limited to the general public, state and local governments, and researchers/academics in the development of that agenda? (Example: Evidence Act 312)
  • In accordance with OMB guidance, HHS is developing an HHS-wide evidence-building plan. To develop this document, HHS asked each sub-agency to submit examples of their agency’s priority research questions, potential data sources, anticipated approaches, challenges and mitigation strategies, and stakeholder engagement strategies. ACF drew from its existing program-specific learning agendas and research plans and has contributed example priority research questions and anticipated learning activities for inclusion in the HHS evidence-building plan. ACF also intends to release a broad learning plan to the public.
  • In addition to fulfilling requirements of the Evidence Act, ACF has supported and continues to support systematic learning and stakeholder engagement activities across the agency. For example:
    • Many ACF program offices have developed or are currently developing detailed program-specific learning agendas to systematically learn about and improve their programs—studying existing knowledge, identifying gaps, and setting program priorities. For example, ACF and HRSA have developed a learning agenda for the MIECHV program, and ACF is supporting ongoing efforts to build a learning agenda for ACF’s Healthy Marriage and Responsible Fatherhood (HMRF) programming.
    • ACF will continue to release annual portfolios that describe key findings from past research and evaluation work and how ongoing projects are addressing gaps in the knowledge base to answer critical questions in the areas of family self-sufficiency, child and family development, and family strengthening. In addition to describing key questions, methods, and data sources for each research and evaluation project, the portfolios provide narratives describing how evaluation and evidence-building activities unfold in specific ACF programs and topical areas over time, and how current research and evaluation initiatives build on past efforts and respond to remaining gaps in knowledge.
    • ACF works closely with many stakeholders to inform priorities for its research and evaluation efforts and solicits their input through conferences and meetings such as the Research and Evaluation Conference on Self-Sufficiency, the National Research Conference on Early Childhood, and the Child Care and Early Education Policy Research Consortium Annual Meetings; meetings with ACF grantees and program administrators; engagement with training and technical assistance networks; surveys, focus groups, interviews, and other activities conducted as a part of research and evaluation studies; and through both project-specific and topical technical working groups, including the agency’s Family Self-Sufficiency Research Technical Working Group. ACF’s ongoing efforts to engage its stakeholders will be described in more detail in ACF’s forthcoming description of its learning activities.
2.4 Did the agency publicly release all completed program evaluations?
  • ACF’s evaluation policy requires that “ACF will release evaluation results regardless of findings…Evaluation reports will present comprehensive findings, including favorable, unfavorable, and null findings. ACF will release evaluation results timely – usually within two months of a report’s completion.” ACF has publicly released the findings of all completed evaluations to date. In 2019, OPRE released over 110 research publications. OPRE publications are publicly available on the OPRE website.
2.5 What is the coverage, quality, methods, effectiveness, and independence of the agency’s evaluation, research, and analysis efforts? (Example: Evidence Act 3115, subchapter II (c)(3)(9))
  • In accordance with OMB guidance, ACF is contributing to an HHS-wide capacity assessment to be released by September 2020. ACF also continues to support the coverage, quality, methods, effectiveness, and independence of the agency’s evaluation, research, and analysis efforts as follows:
  • Coverage: ACF conducts research in areas where Congress has given authorization and appropriations. Programs for which ACF is able to conduct research and evaluation using dedicated funding include Temporary Assistance for Needy Families, Health Profession Opportunity Grants, Head Start, Child Care, Child Welfare, Home Visiting, Healthy Marriage and Responsible Fatherhood, Personal Responsibility Education Program, Sexual Risk Avoidance Education, Teen Pregnancy Prevention, Runaway and Homeless Youth, Family Violence Prevention Services, and Human Trafficking services. These programs represent approximately 85% of overall ACF spending.
  • Quality: ACF’s Evaluation Policy states that ACF is committed to using the most rigorous methods that are appropriate to the evaluation questions and feasible within budget and other constraints, and that rigor is necessary not only for impact evaluations, but also for implementation/process evaluations, descriptive studies, outcome evaluations, and formative evaluations; and in both qualitative and quantitative approaches.
  • Methods: ACF uses a range of evaluation methods. ACF conducts impact evaluations as well as implementation and process evaluations, cost analyses and cost benefit analyses, descriptive and exploratory studies, research syntheses, and more. ACF is committed to learning about and using the most scientifically advanced approaches to determining effectiveness and efficiency of ACF programs; to this end, OPRE annually organizes meetings of scientists and research experts to discuss critical topics in social science research methodology and how innovative methodologies can be applied to policy-relevant questions.
  • Effectiveness: ACF’s Evaluation Policy states that ACF will conduct relevant research and disseminate findings in ways that are accessible and useful to policymakers and practitioners. OPRE engages in ongoing collaboration with ACF program office staff and leadership to interpret research and evaluation findings and to identify their implications for programmatic and policy decisions such as ACF regulations and funding opportunity announcements. For example, when ACF’s Office of Head Start significantly revised its Program Performance Standards—the regulations that define the standards and minimum requirements for Head Start services—the revisions drew from decades of OPRE research and the recommendations of the OPRE-led Secretary’s Advisory Committee on Head Start Research and Evaluation. Similarly, ACF’s Office of Child Care drew from research and evaluation findings related to eligibility redetermination, continuity of subsidy use, use of funds dedicated to improving the quality of programs, and other information to inform the regulations accompanying the reauthorization of the Child Care and Development Block Grant.
  • Independence: ACF’s Evaluation Policy states that independence and objectivity are core principles of evaluation and that it is important to insulate evaluation functions from undue influence and from both the appearance and the reality of bias. To promote objectivity, ACF protects independence in the design, conduct, and analysis of evaluations. To this end, ACF conducts evaluations through the competitive award of grants and contracts to external experts who are free from conflicts of interest. In addition, the Deputy Assistant Secretary for Planning, Research, and Evaluation, a career civil servant, has authority to approve the design of evaluation projects and analysis plans and to approve, release, and disseminate evaluation reports.
2.6 Did the agency use rigorous evaluation methods, including random assignment studies, for research and evaluation purposes?
Score
7
Resources

Did the agency invest at least 1% of program funds in evaluations in FY20? (Examples: Impact studies; implementation studies; rapid cycle evaluations; evaluation technical assistance, rigorous evaluations, including random assignments)

3.1.____ (Name of agency) invested $____ on evaluations, evaluation technical assistance, and evaluation capacity-building, representing __% of the agency’s $___ billion FY20 budget.
  • The Administration for Children and Families invested approximately $208 million in evaluations, evaluation technical assistance, and evaluation capacity-building, representing approximately 0.3% of the agency’s approximately $60.4 billion FY20 budget.
3.2 Did the agency have a budget for evaluation and how much was it? (Were there any changes in this budget from the previous fiscal year?)
  • In FY20, the Administration for Children and Families has an evaluation budget of approximately $208 million, an $8 million increase from FY19.
3.3 Did the agency provide financial and other resources to help city, county, and state governments or other grantees build their evaluation capacity (including technical assistance funds for data and evidence capacity building)?
Score
6
Performance Management / Continuous Improvement

Did the agency implement a performance management system with outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY20?

4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • ACF was an active participant in the development of the FY 2018-2022 HHS Strategic Plan, which includes several ACF-specific objectives. ACF regularly reports on progress associated with those objectives as part of the FY 2020 HHS Annual Performance Plan/Report, including the ten performance measures from ACF programs that support the Plan. ACF performance measures primarily support Goal Three: “Strengthen the Economic and Social Well-Being of Americans Across the Lifespan.” ACF supports Objective 3.1 (Encourage self-sufficiency and personal responsibility, and eliminate barriers to economic opportunity), Objective 3.2 (Safeguard the public against preventable injuries and violence or their results), and Objective 3.3 (Support strong families and healthy marriage, and prepare children and youth for healthy, productive lives) by reporting annual performance measures. ACF is also an active participant in the HHS Strategic Review process, an annual assessment of progress on the subset of ten performance measures that ACF reports on as part of the HHS Strategic Plan.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • OPRE currently reviews all ACF funding opportunity announcements and advises program offices, in accordance with their respective legislative authorities, on how best to integrate evidence into program design. Similarly, program offices have applied ACF research to inform their program administration. For example, ACF developed the Learn, Innovate, Improve (LI2) model — a systematic, evidence-informed approach to program improvement — which has since informed targeted TA efforts for the TANF program and the evaluation requirement for the child support demonstration grants.
  • ACF programs also regularly analyze and use data to improve performance. For example, two ACF programs (Health Profession Opportunity Grants & Healthy Marriage and Responsible Fatherhood programs) have developed advanced web-based management information systems (PAGES and nFORM, respectively) that are used to track grantee progress, produce real-time reports so that grantees can use their data to adapt their programs, and record grantee and participant data for research and evaluation purposes.
  • ACF also uses the nFORM data to conduct the HMRF Compliance Assessment and Performance (CAPstone) Grantee Review: a process by which federal staff and technical assistance providers assess grantee progress toward and achievement in meeting programmatic, data, evaluation, and implementation goals. The results of the CAPstone process guide federal directives and future technical assistance.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
Score
6
Data

Did the agency collect, analyze, share, and use high-quality administrative and survey data – consistent with strong privacy protections – to improve (or help other entities improve) outcomes, cost-effectiveness, and/or the performance of federal, state, local, and other service providers programs in FY20? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; open data policies; data-use policies)

5.1 Did the agency have a strategic data plan, including an open data policy? (Example: Evidence Act 202(c), Strategic Information Resources Plan)
  • ACF’s Interoperability Action Plan was established in 2017 to formalize ACF’s vision for effective and efficient data sharing. Under this plan ACF and its program offices will develop and implement a Data Sharing First (DSF) strategy that starts with the assumption that data sharing is in the public interest. The plan states that ACF will encourage and promote data sharing broadly, constrained only when required by law or when there are strong countervailing considerations.
5.2 Did the agency have an updated comprehensive data inventory? (Example: Evidence Act 3511)
  • In 2020, ACF released a Compendium of ACF Administrative and Survey Data Resources. The Compendium documents administrative and survey data collected by ACF that could be used for evidence-building purposes. It includes summaries of twelve major ACF administrative data sources and seven surveys. Each summary includes an overview, basic content, available documentation, available data sets, restrictions on use, capacity to link to other data sources, and examples of prior research. It is a joint product of the Office of Planning, Research, and Evaluation (OPRE) in ACF and the Office of the Assistant Secretary for Planning and Evaluation (ASPE), U.S. Department of Health and Human Services.
  • In addition, in 2019 OPRE compiled the descriptions and locations of hundreds of OPRE-archived datasets that are currently available for secondary analysis and made this information available on a single webpage. OPRE regularly archives research and evaluation data for secondary analysis, consistent with the ACF evaluation policy, which promotes rigor, relevance, transparency, independence, and ethics in the conduct of evaluation and research. This new consolidated webpage serves as a one-stop resource that makes it easier for potential users to find and use the data that OPRE archives for secondary analysis.
5.3 Did the agency promote data access or data linkage for evaluation, evidence-building, or program improvement? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c))
  • ACF has multiple efforts underway to promote and support the use of documented data for research and improvement, including making numerous administrative and survey datasets publicly available for secondary use and actively promoting the archiving of research and evaluation data for secondary use. These data are machine readable, downloadable, and de-identified as appropriate for each data set. For example, individual-level data for research is held in secure restricted use formats, while public-use data sets are made available online. To make it easier to find these resources, ACF released a Compendium of ACF Administrative and Survey Data and consolidated information on archived research and evaluation data on the OPRE website.
  • Many data sources that may be useful for data linkage for building evidence on human services programs reside outside of ACF. In 2020, OPRE released the Compendium of Administrative Data Sources for Self-Sufficiency Research, describing promising administrative data sources that may be linked to evaluation data in order to assess long-term outcomes of economic and social interventions. It includes national, federal, and state sources covering a range of topical areas. It was produced under contract by MDRC as a part of OPRE’s Assessing Options to Evaluate Long-Term Outcomes (LTO) Using Administrative Data project.
  • Additionally, ACF is actively exploring how enhancing and scaling innovative data linkage practices can improve our understanding of the populations served by ACF and build evidence on human services programs more broadly. For instance, the Child Maltreatment Incidence Data Linkages (CMI Data Linkages) project is examining the feasibility of leveraging administrative data linkages to better understand child maltreatment incidence and related risk and protective factors.
  • ACF actively promotes archiving of research and evaluation data for secondary use. OPRE research contracts include a standard clause requiring contractors to make data and analyses supported through federal funds available to other researchers and to establish procedures and parameters for all aspects of data and information collection necessary to support archiving information and data collected under the contract. Many datasets from past ACF projects are stored in archives including the ACF-funded National Data Archive on Child Abuse and Neglect (NDACAN), the ICPSR Child and Family Data Archive, and the ICPSR data archive more broadly. OPRE has funded grants for secondary analysis of ACF/OPRE data; examples in recent years include secondary analysis of strengthening families datasets and early care and education datasets. In 2019 ACF awarded Career Pathways Secondary Data Analysis Grants to stimulate and fund secondary analysis of data collected through the Pathways for Advancing Careers and Education (PACE) Study, Health Professions Opportunity Grants (HPOG) Impact Study, and HPOG National Implementation Evaluation (NIE) on questions relevant to career pathways programs’ goals and objectives. Information on all archived datasets that are currently available for secondary analysis is available on OPRE’s website.
5.4 Did the agency have policies and procedures to secure data and protect personal, confidential information? (Example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)
  • ACF developed a Confidentiality Toolkit that supports state and local efforts by explaining rules governing confidentiality in ACF and certain related programs, by providing examples of how confidentiality requirements can be addressed, and by including sample memoranda of understanding and data sharing agreements. ACF is currently in the process of updating the Toolkit to reflect recent changes in statute and to provide real-world examples of how data has been shared across domains—which frequently do not have harmonized privacy requirements—while complying with all relevant privacy and confidentiality requirements (e.g., FERPA, HIPAA). These case studies will also include downloadable, real-world tools that have been successfully used in the highlighted jurisdictions.
  • ACF also takes appropriate measures to safeguard the privacy and confidentiality of individuals contributing data for research throughout the archiving process, consistent with ACF’s core principle of ethics. Research data may be made available as public use files (when the data would not likely lead to harm or to the re-identification of an individual) or through restricted access. Restricted access files are de-identified and made available to approved researchers either through secure transmission and download, virtual data enclaves, physical data enclaves, or restricted online analysis.
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
  • ACF undertakes many program-specific efforts to support state, local, and tribal efforts to use human services data while protecting privacy and confidentiality. For example, ACF’s TANF Data Innovation Project supports innovation and improved effectiveness of state TANF programs by enhancing the use of data from TANF and related human services programs. This work includes encouraging and strengthening state integrated data systems, promoting proper payments and program integrity, and enabling data analytics for TANF program improvement. Similarly, in 2020 OPRE awarded Human Services Interoperability Demonstration Grants, which are intended to expand data sharing efforts by state, local, and tribal governments to improve human services program delivery, and to identify novel data sharing approaches that can be replicated in other jurisdictions. In 2019, OPRE in partnership with ASPE began a project to support states in linking Medicaid and Child Welfare data at the parent-child level to support outcomes research. Under this project, HHS will work with two to four states to enhance capacity to examine outcomes for children and parents who are involved in state child welfare systems and who may have behavioral health issues. Of particular interest are outcomes for families that may have substance use disorders, like opioid use disorder. Specifically, this project seeks to develop state data infrastructure and increase the available de-identified data for research in this area.
  • ACF also engages in several broad-based and cross-cutting efforts to support state, local, and tribal efforts to use human services data while protecting privacy and confidentiality. Through the Interoperability Initiative, ACF supports data sharing through developing standards and tools that are reusable across the country, addressing common privacy and security requirements to mitigate risks, and providing request-based technical assistance to states, local jurisdictions, and ACF program offices. Several ACF divisions have also been instrumental in supporting cross-governmental efforts, such as the National Information Exchange Model (NIEM), which will enable human services agencies to collaborate with health, education, justice, and many other constituencies that play a role in the well-being of children and families.
Score
8
Common Evidence Standards / What Works Designations

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding purposes; did that framework prioritize rigorous research and evaluation methods; and did the agency disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY20?

6.1 Did the agency have a common evidence framework for research and evaluation purposes?
  • ACF has established a common evidence framework adapted for the human services context from the framework for education research developed by the U.S. Department of Education and the National Science Foundation. The ACF framework, which includes the six types of studies delineated in the ED/NSF framework, aims to (1) inform ACF’s investments in research and evaluation and (2) clarify expectations for potential grantees and others regarding different types of studies.
6.2 Did the agency have a common evidence framework for funding decisions?
  • While ACF does not have a common evidence framework across all funding decisions, certain programs do use a common evidence framework for funding decisions. For example:
    • The Family First Prevention Services Act (FFPSA) enables states to use funds for certain evidence-based services. In April 2019, ACF published the Prevention Services Clearinghouse Handbook of Standards and Procedures, which provides a detailed description of the standards used to identify and review programs and services in order to rate programs and services as promising, supported, and well-supported practices.
    • The Personal Responsibility Education Program Competitive Grants were funded to replicate effective, evidence-based program models or substantially incorporate elements of projects that have been proven to delay sexual activity, increase condom or contraceptive use for sexually active youth, and/or reduce pregnancy among youth. Through a systematic evidence review, HHS selected 44 models that grantees could use, depending on the needs and age of the target population of each funded project.
6.3 Did the agency have a user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
  • ACF sponsors several user-friendly tools that disseminate and promote evidence-based interventions. Several evidence reviews of human services interventions disseminate and promote evidence-based interventions by rating the quality of evaluation studies and presenting results in a user-friendly searchable format. Reviews to date have covered teen pregnancy prevention; home visiting; marriage education and responsible fatherhood; and employment and training, and they include both ACF-sponsored and other studies. ACF has developed two new websites that disseminate information on rigorously evaluated, evidence-based solutions: 1) The Pathways to Work Evidence Clearinghouse is a user-friendly website that reports on “projects that used a proven approach or a promising approach in moving welfare recipients into work, based on independent, rigorous evaluations of the projects”; 2) ACF’s Title IV-E Prevention Services Clearinghouse project launched a website in June 2019 that is easily accessible and searchable and allows users to navigate the site and find information about mental health and substance abuse prevention and treatment services, in-home parent skill-based programs, and kinship navigator services designated as “promising,” “supported,” and “well-supported” practices by an independent systematic review.
  • Additionally, most ACF research and evaluation projects produce and widely disseminate short briefs, tip sheets, or infographics that capture high-level findings from the studies and make information about program services, participants, and implementation more accessible to policymakers, practitioners, and other stakeholders. For example, the Pathways for Advancing Careers and Education (PACE) project released a series of nine short briefs to accompany the implementation and early impact reports that were released for each of the nine PACE evaluation sites.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
  • ACF’s evaluation policy states that it is important for evaluators to disseminate research findings in ways that are accessible and useful to policymakers and practitioners and that OPRE and program offices will work in partnership to inform potential applicants, program providers, administrators, policymakers, and funders through disseminating evidence from ACF-sponsored and other good quality evaluations. OPRE research contracts include a standard clause requiring contractors to develop a dissemination plan during early project planning to identify key takeaways, target audiences, and strategies for most effectively reaching the target audiences. OPRE’s dissemination strategy is also supported by a commitment to plain language; OPRE works with its research partners to ensure that evaluation findings and other evidence are clearly communicated. OPRE also has a robust dissemination function that includes the OPRE website, an OPRE e-newsletter, and social media presence on Facebook, Twitter, Instagram, and LinkedIn.
  • OPRE biennially hosts two major conferences, the Research and Evaluation Conference on Self-Sufficiency (RECS) and the National Research Conference on Early Childhood (NRCEC) to share research findings with researchers and with program administrators and policymakers at all levels. OPRE also convenes the Network of Infant and Toddler Researchers (NITR) which brings together applied researchers with policymakers and technical assistance providers to encourage research-informed practice and practice-informed research; and the Child Care and Early Education Policy Research Consortium (CCEEPRC) which brings together researchers, policymakers, and practitioners to discuss what we are learning from research that can help inform policy decisions for ACF, States, Territories, localities, and grantees and to consider the next steps in early care and education (ECE) research. In light of COVID-19, OPRE plans to convene the Network, Consortium, and RECS and NRCEC conferences virtually in 2020.
  • The Children’s Bureau (CB) sponsors the recurring National Child Welfare Evaluation Summit to bring together partners from child welfare systems and the research community to strengthen the use of data and evaluation in child welfare; disseminate information about effective and promising prevention and child welfare services, programs, and policies; and promote the use of data and evaluation to support sound decision-making and improved practice in state and local child welfare systems.
  • ACF also sponsors several:
Score
6
Innovation

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY20?
(Examples: Prizes and challenges; behavioral science trials; innovation labs/accelerators; performance partnership pilots; demonstration projects or waivers with rigorous evaluation requirements)

7.1 Did the agency engage leadership and staff in its innovation efforts to improve the impact of its programs?
  • In late 2019, ACF stood up a customer experience initiative to enhance ACF’s delivery and administration of human services. This initiative focuses on ways to improve the experiences of both grantees and ACF employees. One sub-initiative is to equip ACF leaders and staff with fundamental innovation strategies and ways of fostering a culture of innovation within their programs. In early 2020, ACF invited external experts to host two skill-based trainings for ACF staff: 1) Innovation as a Discipline: Empowering Employees to Change the Game, and 2) Human-Centered Design Training: Putting People at the Center of What we Do. This initiative is ongoing.
  • ACF leadership and staff have also collaborated with other federal agencies, local leaders, and entrepreneurs around the practice of innovation. In January of 2020, the ACF Office of Early Childhood Development (ECD) partnered with other ACF program offices and the Department of Education to host the first-ever Showcase on Early Childhood Development and Learning within the Annual ED Games Expo. About 100 federal, national, state, and local leaders in early childhood, education, health, and human services joined ACF leadership and 11 thought leaders and entrepreneurs who presented a series of Big Idea talks about innovation and how to scale up good ideas. This Showcase was one part of a multi-day event sponsored by ED’s Institute of Education Sciences, which focused on games and technology targeted at children, parents, educators, funders, and other stakeholders, with more than 1,000 people attending.
  • HHS has embarked on a process called ReImagine HHS, which has engaged leadership and staff from around the department to identify strategic shifts to transform how HHS operates. One part of this larger initiative, called Aim for Independence (AFI), is using a human-centered design approach to rethink how ACF does work and how that work translates into long-lasting, positive outcomes for parents and children. Engagement activities have included a leadership retreat and opportunities for staff input.
  • ACF leadership has proposed new Opportunity and Economic Mobility Demonstrations to allow states to redesign safety net service delivery by streamlining funding from multiple public assistance and workforce development programs and providing services tailored to their populations’ specific needs. The demonstrations would be subject to rigorous evaluation.
 7.2 Did the agency have policies, processes, structures, or programs to promote innovation to improve the impact of its programs?
  • ACF’s mission to “foster health and well-being by providing federal leadership, partnership and resources for the compassionate and effective delivery of human services” is undergirded by six values: dedication, professionalism, integrity, stewardship, respect, and excellence. ACF’s emphasis on excellence, “exemplified by innovations and solutions that are anchored in available evidence, build knowledge and transcend boundaries,” drives the agency’s support for innovation across programs and practices.
  • For example, ACF’s customer experience initiative is supporting the development of innovative practices for more efficient and responsive agency operations, including the identification of new ways to streamline grantee compliance requirements, minimize administrative burden, and increase grantee capacity for service delivery.
  • ACF also administers select grant programs, including innovation projects, demonstration projects, and waivers to existing program requirements, that are designed to both implement and evaluate innovative interventions, either as part of an ACF-sponsored evaluation or through an individual evaluation accompanying implementation of the innovation. For example:
  • ACF projects that support innovation include:
7.3 Did the agency evaluate its innovation efforts, including using rigorous methods?
  • In addition to the list of ACF demonstration projects, innovation projects, and waiver programs with rigorous evidence activities built into their delivery (as described in sub-criterion 7.2), ACF also conducts rigorous research on other innovative human services programs.
  • The evaluations below are ongoing rigorous evaluations conducted by ACF:
  • ACF administers a series of Head Start and Early Head Start University Partnership Grants in which university researchers partner with local Head Start or Early Head Start programs to conduct an implementation study and evaluate the effectiveness of innovative strategies for improving service quality and/or child/family outcomes. Past grant programs have examined promising parenting interventions, dual-generation approaches, integrated interventions in center-based Early Head Start, and approaches for working with dual language learners.
Score
7
Use of Evidence in Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its competitive grant programs in FY20? (Examples: Tiered-evidence frameworks; evidence-based funding set-asides; priority preference points or other preference scoring for evidence; Pay for Success provisions)

8.1 What were the agency’s five largest competitive programs and their appropriations amount (and were city, county, and/or state governments eligible to receive funds from these programs)?
8.2 Did the agency use evidence of effectiveness to allocate funds in 5 largest competitive grant programs? (e.g., Were evidence-based interventions/practices required or suggested? Was evidence a significant requirement?)
  • ACF reviewed performance data from current Healthy Marriage and Responsible Fatherhood (HMRF) grantees (using the nFORM system) to set priorities, interests, and expectations for 2020 HMRF grant applicants. For example, because nFORM data indicated that organizations were more likely to meet enrollment targets and engage participants when they focused on implementing one program model, ACF’s 2020 FOA mentions specific interest in grantee projects “that implement only one specific program model designed for one specific youth service population” (p. 12).
  • ACF “anticipates giving preference to those applicants that were awarded a Healthy Marriage or Responsible Fatherhood grant between 2015 and 2019, and that (a) are confirmed by ACF to have met all qualification requirements under Section IV.2, The Project Description, Approach, Organizational Capacity of this FOA; and (b) are confirmed by ACF to have received an acceptable rating on their semi-annual grant monitoring statements during years three and four of the project period. Particular consideration will be given to applicants that: (1) designed and successfully implemented, through to end of 2019, an impact evaluation of their program model, and that the impact evaluation was a fair impact test of their program model and that was not terminated prior to analysis; or (2) successfully participated in a federally-led impact evaluation” (p. 17).
  • ACF will evaluate HMRF grant applicants based upon their capacity to conduct a local impact evaluation and their proposed approach (for applicants required or electing to conduct local evaluations); their ability to provide a reasonable rationale and/or research base for the proposed program model(s) and curriculum(a); and their inclusion of a Continuous Quality Improvement Plan that clearly describes the organization’s commitment to data-driven approaches for identifying areas to improve program performance, testing potential improvements, and cultivating a culture and environment of learning and improvement, among other things. Further, Compliance and Performance (CAPstone) reviews entail a thorough review of each grantee’s performance: the Office of Family Assistance (OFA) sends a formal set of questions about grantee performance that the grant program specialists and TA providers answer ahead of time, and then OFA, OPRE, and the TA provider convene meetings where each grantee’s performance is discussed at length using nFORM data and the answers to those questions.
  • The Head Start Designation Renewal System (DRS) determines whether Head Start/Early Head Start grantees are delivering high-quality comprehensive services to the children and families that they serve. These determinations are based on seven conditions, one of which looks at how Head Start classrooms within programs perform on the Classroom Assessment Scoring System (CLASS), an observation-based measure of the quality of teacher-child interactions. When the DRS deems grantees to be underperforming, grantees are denied automatic renewal of their grant and must apply for funding renewal through a standard open competition process. In the most recent Head Start FOA language, grantees who are re-competing for Head Start funds must include a description of any violations, such as deficiencies, areas of non-compliance, and/or audit finding in their record of Past Performance (p. 26). Applicants may describe the actions they have taken to address these violations. According to Head Start policy, in competitions to replace or potentially replace a current grantee, the responsible HHS official will give priority to applicants that have demonstrated capacity in providing effective, comprehensive, and well-coordinated early childhood education and development services and programs (see section 1304.20: Selection among applicants).
  • ACF manages the Runaway and Homeless Youth Training and Technical Assistance Center (RHYTTAC), the national training and technical assistance entity that provides resources and direct assistance to the Runaway and Homeless Youth (RHY) grantees and other youth serving organizations eligible to receive RHY funds. RHYTTAC disseminates information about and supports grantee implementation of high-quality, evidence-informed, and evidence-based practices. The RHYTTAC funding opportunity announcement evaluates applicants based on their strategy for tracking RHY grantee uptake and implementation of evidence-based or evidence-informed strategies.
  • ACF also evaluates Unaccompanied Children Services, Preschool Development Grants, and Runaway and Homeless Youth grant applicants based upon: their proposed program performance evaluation plan; how their data will contribute to continuous quality improvement; and their demonstrated experience with comparable program evaluation, among other factors.
8.3 Did the agency use its 5 largest competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • ACF’s template (see p. 14 in Attachment C) for competitive grant announcements includes standard language that funding opportunity announcement drafters may select to require grantees to either 1) collect performance management data that contributes to continuous quality improvement and is tied to the project’s logic model, or 2) conduct a rigorous evaluation for which applicants must propose an appropriate design specifying research questions, measurement and analysis.
  • As a condition of award, Head Start grantees are required to participate fully in ACF-sponsored evaluations, if selected to do so. As such, ACF has an ongoing research portfolio that is building evidence in Head Start. Research sponsored through Head Start funding over the past decade has provided valuable information not only to guide program improvement in Head Start itself, but also to guide the field of early childhood programming and early childhood development. Dozens of Head Start programs have collaborated with researchers in making significant contributions in terms of program innovation and evaluation, as well as the use of systematic data collection, analysis and interpretation in program operations.
  • ACF’s 2020 Healthy Marriage and Responsible Fatherhood (HMRF) Grants establish required evidence activities by scope of grantee services (p. 4). For example, large scope services (requesting funding between $1M-$1.5M) “must propose a rigorous impact evaluation (i.e., randomized-controlled trial (RCT) or high-quality, quasi-experimental design (QED) study)…and must allocate at least 15 percent, but no more than 20 percent, of their total annual funding for evaluation” (p. 19). Regardless of their scope of services, all 2020 HMRF grantees must plan for and carry out continuous quality improvement activities (p. 18) and conduct a local evaluation (p. 18) or participate in a federally led evaluation or research effort (p. 22). ACF has an ongoing research portfolio building evidence related to Strengthening Families, Healthy Marriage, and Responsible Fatherhood, and has conducted randomized controlled trials with grantees in each funding round of these grants.
  • The 2003 Reauthorization of the Runaway and Homeless Youth Act called for a study of long-term outcomes for youth who are served through the Transitional Living Program (TLP). In response, ACF is sponsoring a study that will capture data from youth at program entry and at intermediate- and longer-term follow-up points after program exit and will assess outcomes related to housing, education, and employment. ACF is also sponsoring a process evaluation of the 2016 Transitional Living Program Special Population Demonstration Project.
  • Additionally, Unaccompanied Children Services (p. 33), Preschool Development Grants (p. 30), and Runaway and Homeless Youth (p. 24) grantees are required to develop a program performance evaluation plan.
8.4 Did the agency use evidence of effectiveness to allocate funds in any other competitive grant programs in FY20 (besides its 5 largest grant programs)?
  • ACF’s Personal Responsibility Education Program includes three individual discretionary grant programs that fund programs exhibiting evidence of effectiveness, innovative adaptations of evidence-based programs, and promising practices that teach youth about abstinence and contraception to prevent pregnancy and sexually transmitted infections.
  • To receive funding through ACF’s Sexual Risk Avoidance Education (SRAE) program, applicants must cite evidence published in a peer-reviewed journal and/or a randomized controlled trial or quasi-experimental design to support their chosen interventions or models.
 8.5 What are the agency’s 1-2 strongest examples of how competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?  
  • As mentioned above, ACF is conducting a multi-pronged evaluation of the Health Profession Opportunity Grants Program (HPOG). Findings from the first cohort of HPOG grants influenced the funding opportunity announcement for the second round of HPOG (HPOG 2.0) funding. ACF used findings from the impact evaluation of the first cohort of HPOG grants to provide insights to the field about which HPOG program components are associated with stronger participant outcomes. For example, based on the finding that many participants engaged in short-term training for low-wage, entry-level jobs, the HPOG 2.0 FOA more carefully defined the career pathways framework, described specific strategies for helping participants progress along a career pathway, and identified and defined key HPOG education and training components. Applicants were required to more clearly describe how their program would support career pathways for participants. Based on an analysis that indicated limited collaboration with healthcare employers, the HPOG 2.0 FOA required applicants to demonstrate the use of labor market information, consult with local employers, and describe their plans for employer engagement. The HPOG 2.0 FOA also placed more emphasis on the importance of providing basic skills education and assessment of barriers to make the programs accessible to clients who were most prepared to benefit, based on the finding that many programs were screening out applicants with low levels of basic literacy, reading, and numeracy skills.
  • ACF’s Personal Responsibility Education Innovative Strategies Program (PREIS) grantees must conduct independent evaluations of their innovative strategies for the prevention of teen pregnancy, births, and STIs, supported by ACF training and technical assistance. These rigorous evaluations are designed to meet the HHS Teen Pregnancy Prevention Evidence-Based Standards and are expected to generate lessons learned so that others can benefit from these strategies and innovative approaches.
  • In 2019, ACF awarded two child welfare discretionary grants to build knowledge of what works: (1) Regional Partnership Grants to Increase the Well-Being of, and to Improve the Permanency Outcomes for, Children and Families Affected By Opioids and Other Substance Abuse: these grants aim to build evidence on the effectiveness of targeted approaches that improve outcomes for children and families affected by opioids and other substance use disorders. To this end, grantees will evaluate their local program; select and report on performance indicators that align with proposed program strategies and activities; and participate in a national cross-site evaluation that will describe outcomes for children, adults, and families enrolled in RPG projects as well as the outcomes of the partnerships. (2) Community Collaboratives to Strengthen and Preserve Families: these grants will support the development, implementation, and evaluation of primary prevention strategies to improve the safety, stability, and well-being of all families through a continuum of community-based services and supports. Projects will include both process and outcome evaluations.
8.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • ACF’s template (see p. 14 in Attachment C) for competitive grant announcements includes standard language instructing grantees to conduct evaluation efforts. Program offices may use this template to require grantees to collect performance data or conduct a rigorous evaluation. Applicants are instructed to include third-party evaluation contracts in their proposed budget justifications.
  • ACF’s 2020 Healthy Marriage and Responsible Fatherhood (HMRF) Grants establish required evidence activities by scope of grantee services (p. 4). For example, large scope services (requesting funding between $1M-$1.5M) “must propose a rigorous impact evaluation (i.e., randomized-controlled trial (RCT) or high-quality, quasi-experimental design (QED) study)…and must allocate at least 15 percent, but no more than 20 percent, of their total annual funding for evaluation” (p. 19). Regardless of their scope of services, all 2020 HMRF grantees must plan for and carry out continuous quality improvement activities (p. 18) and conduct a local evaluation (p. 18) or participate in a federally led evaluation or research effort (p. 22).
  • ACF’s 2018 Preschool Development Grants funding announcement notes that “it is intended that States or territories will use a percentage of the total amount of their [renewal] grant award during years 2 through 4 to conduct the proposed process, cost, and outcome evaluations, and to implement a data collection system that will allow them to collect, house, and use data on the populations served, the implementation of services, the cost of providing services, and coordination across service partners.”
  • ACF’s rules (section 1351.15) allow Runaway and Homeless Youth grant awards to be used for “data collection and analysis.”
  • Regional Partnership Grants (RPG) (p. 1) require a minimum of 20 percent of grant funds to be spent on evaluation elements. ACF has supported the evaluation capacity of RPG grantees by providing technical assistance for data collection, performance measurement, and continuous quality improvement; implementation of the cross-site evaluation; support for knowledge dissemination; and provision of group TA via webinars and presentations.
  • Community Collaboratives to Strengthen and Preserve Families (CCSPF) grants (p. 7) require a minimum of 10 percent of grant funds to be used on data collection and evaluation activities. ACF has supported the evaluation capacity of CCSPF grantees by providing technical assistance for developing research questions, methodologies, process and outcome measures; implementing grantee-designed evaluations and continuous quality improvement activities; analyzing evaluation data; disseminating findings; and supporting data use in project and organizational decision-making processes.
  • ACF also provides evaluation technical assistance to:
Score
6
Use of Evidence in Non-Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its non-competitive grant programs in FY20?

9.1 What were the agency’s 5 largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?

In FY20, the five largest non-competitive grant programs were:

  1. Temporary Assistance for Needy Families (TANF) ($16.4 billion; eligible entities: states);
  2. Child Care and Development Fund (Block Grant and Entitlement to States combined) ($8.72 billion; eligible entities: states);
  3. Foster Care ($5.3 billion; eligible entities: states);
  4. Child Support Enforcement Payments to States ($4.6 billion; eligible entities: states);
  5. Low Income Home Energy Assistance ($3.7 billion; eligible entities: states, tribes, territories).
9.2 Did the agency use evidence of effectiveness to allocate funds in its 5 largest non-competitive grant programs? (e.g., Were evidence-based interventions/practices required or suggested? Was evidence a significant requirement?)
  • The Family First Prevention Services Act (FFPSA) (Division E, Title VII of the Bipartisan Budget Act of 2018), funded under the Foster Care budget, newly enables States to use Federal funds available under parts B and E of Title IV of the Social Security Act to provide enhanced support to children and families and prevent foster care placements through the provision of evidence-based mental health and substance abuse prevention and treatment services, in-home parent skill-based programs, and kinship navigator services. FFPSA requires an independent systematic review of evidence to designate programs and services as “promising,” “supported,” and “well-supported” practices. Only interventions designated as evidence-based will be eligible for federal funds.
  • Most of ACF’s non-competitive grant programs are large block grants without the legislative authority to use evidence of effectiveness to allocate funds. Several programs do have performance-based payment incentive programs, however. For example, the Adoption and Legal Guardianship Incentive Payments program, most recently reauthorized through FY 2021 as part of the Family First Prevention Services Act (Social Security Act §473A), provides incentive payments to jurisdictions for improved performance in both adoptions and legal guardianship of children in foster care. Additionally, the Child Support program (p. 6) has an outcome-based performance management system established by the Child Support Performance and Incentive Act of 1998 (CSPIA; Social Security Act § 458). Under CSPIA, states are measured in five program areas: paternity establishment, support order establishment, current support collections, cases paying towards arrears, and cost effectiveness. This performance-based incentive and penalty program is used to reward states for good or improved performance and to impose penalties when state performance falls below a specified level and has not improved.
9.3 Did the agency use its largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • TANF Grant Program: The TANF statute gives HHS responsibility for building evidence about the TANF program: “Evaluation of the Impacts of TANF- The Secretary shall conduct research on the effect of State programs funded under this part and any other State program funded with qualified State expenditures on employment, self-sufficiency, child well-being, unmarried births, marriage, poverty, economic mobility, and other factors as determined by the Secretary.” (§413(a)). Since FY17, Congress has designated 0.33% of the TANF Block Grant for research, evaluation, and technical assistance related to the TANF Block Grant.
  • ACF has a long-standing and ongoing research portfolio in service of building evidence for the TANF Grant Program. ACF conducts research and evaluation projects in collaboration with TANF grantees, typically in areas where TANF grantees are facing challenges, innovating, or carrying out demonstration projects. This ongoing work includes building evidence around career pathways training programs, subsidized employment approaches, job search approaches, and employment coaching interventions. These are all program approaches used by state and county TANF grantees to meet their employment goals. ACF widely disseminates information from its research and evaluation activities to TANF grantees and provides extensive training and technical assistance.
  • ACF’s TANF Data Innovation (TDI) project, launched in 2017, supports the innovation and improved effectiveness of state TANF programs by enhancing the use of data from TANF and related human services programs. In 2019, the TANF Data Collaborative (TDC), an initiative of the TDI project, conducted a needs assessment survey of all states and is now supporting a TANF agency pilot program with eight pilot sites. To support state and local efforts and build strategic partnerships, pilot agencies are receiving funding and intensive training and technical assistance.
  • Child Care Development Block Grant Program: While the Child Care Development Block Grant Act (p. 34) does not allocate funding for States to independently build evidence, the Act allows for up to one-half of one percent of CCDBG funding for a fiscal year to be reserved for HHS to conduct research and evaluation of the CCDBG grant program and to disseminate the key findings of those evaluations widely and on a timely basis. ACF manages this ongoing research portfolio to build evidence for the Child Care and Development Block Grant Program (CCDBG), conducting research and evaluation projects in collaboration with CCDBG grantees, typically in areas where CCDBG grantees are facing challenges, innovating, or carrying out demonstration projects. Major projects in recent years include the National Survey of Early Care and Education; assessment of evidence on ratings in Quality Rating and Improvement Systems (QRIS); and several research partnerships between CCDF lead agencies and researchers. ACF widely disseminates information from its research and evaluation activities to CCDF grantees and provides extensive training and technical assistance.
  • Foster Care and Related Child Welfare Grant Programs: ACF administers several foster care and related child welfare grant programs that do not provide funding authority for states to conduct independent evidence-building activities. Some of these programs include set-asides for federal research; the Foster Care Independence Act of 1999, for instance, sets aside 1.5% of the John H. Chafee Foster Care Program for Successful Transition to Adulthood (Chafee program) for evaluations of promising independent living programs.
  • As such, ACF has an ongoing research portfolio on the Title IV-E foster care grant program and related grant programs. ACF conducts research and evaluation in collaboration with child welfare grantees, typically focusing on areas in which grantees are facing challenges, innovating, or conducting demonstrations. Examples include strategies for prevention of maltreatment, meeting service needs, and improving outcomes for children who come to the attention of child welfare. Major projects include the National Survey of Child and Adolescent Well-Being (NSCAW) and a Supporting Evidence Building in Child Welfare project to increase the number of evidence-supported interventions grantees can use to serve the child welfare population.
  • ACF has begun conducting formative evaluations of independent living programs of potential national significance in preparation for possible future summative evaluations. This work builds on the multi-site evaluation of foster youth programs, a rigorous random assignment evaluation of four programs funded under the Chafee program, completed in 2011.
  • Also, ACF’s Community-Based Child Abuse Prevention (CBCAP) formula grants, with a focus on supporting community-based approaches to prevent child abuse and neglect, are intended to inform the use of other child welfare funds more broadly.
  • Child Support Enforcement Research and Evaluation Grant Program: Section 1115 of the Social Security Act provides unique authority for research and evaluation grants to child support enforcement grantees to “improve the financial well-being of children or otherwise improve the operation of the child support program.” For instance, ACF awarded Digital Marketing Grants to test digital marketing approaches and partnerships to reach parents who could benefit from child support services and to create or improve two-way digital communication and engagement with parents.
  • ACF continues to manage a broad child support enforcement research portfolio and administers a variety of research and evaluation components to better understand cost and program effectiveness. Research and evaluation within the portfolio have consisted of 1) supporting large multi-state demonstrations that include random assignment evaluations (described in question 7.4); 2) funding a supplement to the Census Bureau’s Current Population Survey; and 3) supporting the research activities of other government programs and agencies by matching their research samples to the National Directory of New Hires (NDNH).
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs in FY20 (besides its 5 largest grant programs)?
  • States applying for funding from ACF’s Community Based Child Abuse Prevention (CBCAP) grant program must “demonstrate an emphasis on promoting the increased use and high quality implementation of evidence-based and evidence-informed programs and practices.” The Children’s Bureau defines evidence-based and evidence-informed programs and practices along a continuum with four categories: Emerging and Evidence-Informed; Promising; Supported; and Well Supported. Programs determined to fall within specific program parameters are considered “evidence-informed” or “evidence-based” practices (EBP), as opposed to programs that have not been evaluated against any set criteria. ACF monitors progress on the percentage of program funds directed toward evidence-based and evidence-informed practices.
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • In Section 413 of the Social Security Act, where Congress gives HHS primary responsibility for building evidence about the TANF program, Congress also directs HHS to develop “a database (which shall be referred to as the “What Works Clearinghouse of Proven and Promising Projects to Move Welfare Recipients into Work”) of the projects that used a proven approach or a promising approach in moving welfare recipients into work, based on independent, rigorous evaluations of the projects” (413(g)). In April 2020, ACF officially launched the Pathways to Work Evidence Clearinghouse, a user-friendly website that shares the results of the systematic review and provides web-based tools and products to help state and local TANF administrators, policymakers, researchers, and the general public make sense of the results and better understand how this evidence might apply to the questions and contexts that matter to them.
  • Additionally, ACF has continued to produce findings from numerous randomized controlled trials providing evidence on strategies that TANF agencies can use such as subsidized employment, coaching, career pathways and job search strategies. Ongoing ACF efforts to build evidence for what works for TANF recipients and other low-income individuals include the Building Evidence on Employment Strategies for Low-Income Families (BEES) project and the Next Generation of Enhanced Employment Strategies (NextGen) project; these projects are evaluating the effectiveness of innovative programs designed to boost employment and earnings among low-income individuals.
  • ACF’s Office of Child Care drew on research and evaluation findings related to eligibility redetermination, continuity of subsidy use, use of dollars to improve the quality of programs, and more to inform regulations related to Child Care and Development Block Grant reauthorization.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • Community-Based Child Abuse Prevention (CBCAP) programs are authorized as part of the Child Abuse Prevention and Treatment Act (CAPTA). CAPTA promotes the use of evidence-based and evidence-informed programs and practices that effectively strengthen families and prevent child abuse and neglect, including efforts to improve the capacity of states and communities to evaluate how well their programs and collaborative networks enhance the safety and well-being of children and families. The 2020 Program Instruction for the CBCAP grant program states that CBCAP funds made available to states must be used for, among other things, the financing, planning, community mobilization, collaboration, assessment, information and referral, startup, training and technical assistance, information management and reporting, and reporting and evaluation costs of establishing, operating, or expanding community-based, prevention-focused programs and activities designed to strengthen and support families and prevent child abuse and neglect.
  • The Child Care and Development Block Grant Act of 2014 requires states to spend not less than 7, 8, and 9 percent of their CCDF awards (“quality funds”) for years 1-2, 3-4, and 5+ after the 2014 CCDBG enactment, respectively (see 128 STAT. 1987), on activities to improve the quality of child care services provided in the state, including:
    • 1B: Supporting the training and professional development of the child care workforce through…incorporating the effective use of data to guide program improvement (see 128 STAT. 1988)
    • 3: Developing, implementing, or enhancing a quality rating system for child care providers and services, which may support and assess the quality of child care providers in the State (A) and be designed to improve the quality of different types of child care providers (C) (see 128 STAT. 1988)
    • 7: Evaluating and assessing the quality and effectiveness of child care programs and services offered in the State, including evaluating how such programs positively impact children (see 128 STAT. 1990)
  • ACF requires all CCDF lead agencies to annually report on how their CCDF quality funds were expended, including the activities funded and the measures used by states and territories to evaluate progress in improving the quality of child care programs and services. ACF released a Program Instruction for state and territorial lead agencies to provide guidance on reporting the authorized activities for the use of quality funds.
  • ACF also provides evaluation technical assistance to:
    • support grantees who are conducting their own local evaluations (MIECHV grantees (in collaboration with HRSA), Tribal MIECHV grantees, and ACF staff directly supporting Section 1115 child support demonstration grantees); and
    • build the evaluation capacity of grantees (e.g., the TANF Data Innovation Project, the Tribal Early Childhood Research Center, and the Center for States).
Score
6
Repurpose for Results

In FY20, did the agency shift funds away from or within any practice, policy, or program that consistently failed to achieve desired outcomes? (Examples: Requiring low-performing grantees to re-compete for funding; removing ineffective interventions from allowable use of grant funds; incentivizing or urging grant applicants to stop using ineffective practices in funding announcements; proposing the elimination of ineffective programs through annual budget requests; incentivizing well-designed trials to fill specific knowledge gaps; supporting low-performing grantees through mentoring, improvement plans, and other forms of assistance; using rigorous evaluation results to shift funds away from a program)

10.1 Did the agency have policy(ies) for determining when to shift funds away from grantees, practices, policies, interventions, and/or programs that consistently failed to achieve desired outcomes, and did the agency act on that policy?
  • The Family First Prevention Services Act of 2018 allows federal matching funds only for evidence-based prevention services offered by states, thereby incentivizing states to shift their spending away from non-evidence-based approaches.
  • For ACF’s Child and Family Services Reviews (CFSRs) of state child welfare systems, states determined not to have achieved substantial conformity in all the areas assessed must develop and implement a Program Improvement Plan addressing the areas of nonconformity. ACF supports the states with technical assistance and monitors implementation of their plans. States must successfully complete their plans to avoid financial penalties for nonconformance.
  • The ACF Head Start program significantly expanded its accountability provisions with the establishment of five-year Head Start grant service periods and the Head Start Designation Renewal System (DRS). The DRS was designed to determine whether Head Start and Early Head Start programs are providing high quality comprehensive services to the children and families in their communities. Where they are not, grantees are denied automatic renewal of their grant and must apply for funding renewal through an open competition process. Those determinations are based on seven conditions, one of which looks at how Head Start classrooms within programs perform on the Classroom Assessment Scoring System (CLASS), an observation-based measure of the quality of teacher-child interactions. Data from ACF’s Head Start Family and Child Experiences Survey (FACES) and Quality Features, Dosage, Thresholds and Child Outcomes (Q-DOT) study were used to craft the regulations that created the DRS and informed key decisions in its implementation. This included where to set minimum thresholds for average CLASS scores, the number of classrooms within programs to be sampled to ensure stable program-level estimates on CLASS, and the number of cycles of CLASS observations to conduct. At the time the DRS notification letters were sent out to grantees in 2011, there were 1,421 non-tribal active grants, and of these, 453 (32%) were required to re-compete (p. 19).
  • Findings from the evaluation of the first round Health Profession Opportunity Grants (HPOG) program influenced the funding opportunity announcement for the second round of HPOG funding. Namely, the scoring criteria used to select HPOG 2.0 grantees incorporated knowledge gained about challenges experienced in the HPOG 1.0 grant program. For example, based on those challenges, applicants were asked to clearly demonstrate—and verify with local employers—an unmet need in their service area for the education and training activities proposed. Applicants were also required to provide projections for the number of individuals expected to begin and complete basic skills education. Grantees must submit semi-annual and annual progress reports to ACF to show their progress in meeting these projections. If they have trouble doing so, grantees are provided with technical assistance to support improvement or are put on a corrective action plan so that ACF can more closely monitor their steps toward improvement.
10.2 Did the agency identify and provide support to agency programs or grantees that failed to achieve desired outcomes?
  • In an effort to create operational efficiencies and increase grantee capacity for mission-related activities, ACF implemented a process in 2019 in which the grants management office completes annual risk modeling of grantee financial administrative datasets, which helps identify organizations that would benefit from targeted technical assistance. The grants management office provides TA to these grantees to improve their financial management and help direct resources toward effective service delivery.
  • As mentioned in 10.1, states reviewed by a Child and Family Services Review (CFSR) and determined not to have achieved substantial conformity in all the areas assessed must develop and implement a Program Improvement Plan addressing the areas of nonconformity. ACF supports the states with technical assistance and monitors implementation of their plans. ACF also provides broad programmatic technical assistance to support grantees in improving their service delivery, including the Child Welfare Capacity Building Collaborative. The Collaborative is designed to help public child welfare agencies, Tribes, and courts enhance and mobilize the human and organizational assets necessary to meet Federal standards and requirements; improve child welfare practice and administration; and achieve safety, permanency, and well-being outcomes for children, youth, and families. ACF also sponsors the Child Welfare Information Gateway, a platform connecting child welfare, adoption, and related professionals as well as the public to information, resources, and tools covering topics on child welfare, child abuse and neglect, out-of-home care, adoption, and more.