2020 Federal Standard of Excellence


Use of Evidence in Non-Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its non-competitive grant programs in FY20? (Examples: Evidence-based funding set-asides; requirements to invest funds in evidence-based activities; Pay for Success provisions)

Score
6
Administration for Children and Families (HHS)
9.1 What were the agency’s five largest non-competitive programs and their appropriation amounts (and are city, county, and/or state governments eligible to receive funds from these programs)?

In FY20, the five largest non-competitive grant programs were:

  1. Temporary Assistance for Needy Families (TANF) ($16.4 billion; eligible entities: states);
  2. Child Care and Development Fund (Block Grant and Entitlement to States combined) ($8.72 billion; eligible entities: states);
  3. Foster Care ($5.3 billion; eligible entities: states);
  4. Child Support Enforcement Payments to States ($4.6 billion; eligible entities: states);
  5. Low Income Home Energy Assistance ($3.7 billion; eligible entities: states, tribes, territories).
9.2 Did the agency use evidence of effectiveness to allocate funds in its five largest non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • The Family First Prevention Services Act (FFPSA) (Division E, Title VII of the Bipartisan Budget Act of 2018), funded under the Foster Care budget, newly enables States to use Federal funds available under parts B and E of Title IV of the Social Security Act to provide enhanced support to children and families and prevent foster care placements through the provision of evidence-based mental health and substance abuse prevention and treatment services, in-home parent skill-based programs, and kinship navigator services. FFPSA requires an independent systematic review of evidence to designate programs and services as “promising,” “supported,” and “well-supported” practices. Only interventions designated as evidence-based will be eligible for federal funds.
  • Most of ACF’s non-competitive grant programs are large block grants without the legislative authority to use evidence of effectiveness to allocate funds. Several programs do have performance-based payment incentive programs, however. For example, the Adoption and Legal Guardianship Incentive Payments program, most recently reauthorized through FY 2021 as part of the Family First Prevention Services Act (Social Security Act § 473A), provides incentive payments to jurisdictions for improved performance in both adoptions and legal guardianship of children in foster care. Additionally, the Child Support program (p. 6) has an outcome-based performance management system established by the Child Support Performance and Incentive Act of 1998 (CSPIA; Social Security Act § 458). Under CSPIA, states are measured in five program areas: paternity establishment, support order establishment, current support collections, cases paying toward arrears, and cost-effectiveness. This performance-based incentive and penalty program is used to reward states for good or improved performance and to impose penalties when state performance falls below a specified level and has not improved.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • TANF Grant Program: The TANF statute gives HHS responsibility for building evidence about the TANF program: “Evaluation of the Impacts of TANF- The Secretary shall conduct research on the effect of State programs funded under this part and any other State program funded with qualified State expenditures on employment, self-sufficiency, child well-being, unmarried births, marriage, poverty, economic mobility, and other factors as determined by the Secretary.” (§413(a)). Since FY17, Congress has designated 0.33% of the TANF Block Grant for research, evaluation, and technical assistance related to the TANF Block Grant.
  • ACF has a long-standing and ongoing research portfolio in service of building evidence for the TANF Grant Program. ACF conducts research and evaluation projects in collaboration with TANF grantees, typically in areas where TANF grantees are facing challenges, innovating, or carrying out demonstration projects. This ongoing work includes building evidence around career pathways training programs, subsidized employment approaches, job search approaches, and employment coaching interventions. These are all program approaches used by state and county TANF grantees to meet their employment goals. ACF widely disseminates information from its research and evaluation activities to TANF grantees and provides extensive training and technical assistance.
  • ACF’s TANF Data Innovation (TDI) project, launched in 2017, supports the innovation and improved effectiveness of state TANF programs by enhancing the use of data from TANF and related human services programs. In 2019, the TANF Data Collaborative (TDC), an initiative of the TDI project, conducted a needs assessment survey of all states and is now supporting a TANF agency pilot program with eight pilot sites. To support state and local efforts and build strategic partnerships, pilot agencies are receiving funding and intensive training and technical assistance.
  • Child Care and Development Block Grant Program: While the Child Care and Development Block Grant Act (p. 34) does not allocate funding for States to independently build evidence, the Act allows for up to one-half of one percent of CCDBG funding for a fiscal year to be reserved for HHS to conduct research and evaluation of the CCDBG grant program and to disseminate the key findings of those evaluations widely and on a timely basis. ACF manages this ongoing research portfolio to build evidence for the CCDBG program, conducting research and evaluation projects in collaboration with CCDBG grantees, typically in areas where grantees are facing challenges, innovating, or carrying out demonstration projects. Major projects in recent years include the National Survey of Early Care and Education; an assessment of evidence on ratings in Quality Rating and Improvement Systems (QRIS); and several research partnerships between CCDF lead agencies and researchers. ACF widely disseminates information from its research and evaluation activities to CCDF grantees and provides extensive training and technical assistance.
  • Foster Care and Related Child Welfare Grant Programs: ACF administers several foster care and related child welfare grant programs that do not possess the funding authority for States to conduct independent evidence-building activities. Some of these programs have set-asides for federal research; the Foster Care Independence Act of 1999, for instance, sets aside 1.5% of the John H. Chafee Foster Care Program for Successful Transition to Adulthood program (Chafee program) for evaluations of promising independent living programs.
  • As such, ACF has an ongoing research portfolio on the Title IV-E foster care grant program and related grant programs. ACF conducts research and evaluation in collaboration with child welfare grantees, typically focusing on areas in which grantees are facing challenges, innovating, or conducting demonstrations. Examples include strategies for prevention of maltreatment, meeting service needs, and improving outcomes for children who come to the attention of child welfare. Major projects include the National Survey of Child and Adolescent Well-Being (NSCAW) and a Supporting Evidence Building in Child Welfare project to increase the number of evidence-supported interventions grantees can use to serve the child welfare population.
  • ACF has begun work on conducting formative evaluations of independent living programs of potential national significance in preparation for possible future summative evaluations. This work builds on the multi-site evaluation of foster youth programs, a rigorous, random assignment evaluation of four programs funded under the Chafee program that was completed in 2011.
  • Also, ACF’s Community-Based Child Abuse Prevention (CBCAP) formula grants, with a focus on supporting community-based approaches to prevent child abuse and neglect, are intended to inform the use of other child welfare funds more broadly.
  • Child Support Enforcement Research and Evaluation Grant Program: Section 1115 of the Social Security Act provides unique authority for research and evaluation grants to child support enforcement grantees to “improve the financial well-being of children or otherwise improve the operation of the child support program.” For instance, ACF awarded Digital Marketing Grants to test digital marketing approaches and partnerships to reach parents who could benefit from child support services and to create or improve two-way digital communication and engagement with parents.
  • ACF continues to manage a broad child support enforcement research portfolio and administers a variety of research and evaluation components to understand more about cost and program effectiveness. Research and evaluation within the portfolio have consisted of (1) supporting large multi-state demonstrations that include random assignment evaluations (described in criteria question 7.4), (2) funding a supplement to the Census Bureau’s Current Population Survey, and (3) supporting research activities of other government programs and agencies by conducting matches of their research samples to the National Directory of New Hires (NDNH).
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs in FY20 (besides its five largest grant programs)?
  • States applying for funding from ACF’s Community-Based Child Abuse Prevention (CBCAP) grant program must “demonstrate an emphasis on promoting the increased use and high quality implementation of evidence-based and evidence-informed programs and practices.” The Children’s Bureau defines evidence-based and evidence-informed programs and practices along a continuum with four categories: Emerging and Evidence-Informed; Promising; Supported; and Well Supported. Programs determined to fall within specific program parameters are considered “evidence-informed” or “evidence-based” practices (EBP), as opposed to programs that have not been evaluated using any set criteria. ACF monitors progress on the percentage of program funds directed toward evidence-based and evidence-informed practices.
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • In Section 413 of the Social Security Act, where Congress gives HHS primary responsibility for building evidence about the TANF program, Congress also commissions HHS to develop “a database (which shall be referred to as the ‘What Works Clearinghouse of Proven and Promising Projects to Move Welfare Recipients into Work’) of the projects that used a proven approach or a promising approach in moving welfare recipients into work, based on independent, rigorous evaluations of the projects” (§ 413(g)). In April 2020, ACF officially launched the Pathways to Work Evidence Clearinghouse, a user-friendly website that shares the results of the systematic review and provides web-based tools and products to help state and local TANF administrators, policymakers, researchers, and the general public make sense of the results and better understand how this evidence might apply to questions and contexts that matter to them.
  • Additionally, ACF has continued to produce findings from numerous randomized controlled trials providing evidence on strategies that TANF agencies can use such as subsidized employment, coaching, career pathways and job search strategies. Ongoing ACF efforts to build evidence for what works for TANF recipients and other low-income individuals include the Building Evidence on Employment Strategies for Low-Income Families (BEES) project and the Next Generation of Enhanced Employment Strategies (NextGen) project; these projects are evaluating the effectiveness of innovative programs designed to boost employment and earnings among low-income individuals.
  • ACF’s Office of Child Care drew on research and evaluation findings related to eligibility redetermination, continuity of subsidy use, use of dollars to improve the quality of programs, and more to inform regulations related to Child Care and Development Block Grant reauthorization.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • Community-Based Child Abuse Prevention (CBCAP) programs are authorized as part of the Child Abuse Prevention and Treatment Act (CAPTA). CAPTA promotes the use of evidence-based and evidence-informed programs and practices that effectively strengthen families and prevent child abuse and neglect. This includes efforts to improve the evaluation capacity of states and communities to assess the progress of their programs and collaborative networks in enhancing the safety and well-being of children and families. The 2020 Program Instruction for the CBCAP grant program states that CBCAP funds made available to states must be used for the financing, planning, community mobilization, collaboration, assessment, information and referral, startup, training and technical assistance, information management, and reporting and evaluation costs of establishing, operating, or expanding community-based and prevention-focused programs and activities designed to strengthen and support families and prevent child abuse and neglect, among other things.
  • The Child Care and Development Block Grant Act of 2014 requires states to spend not less than 7, 8, and 9 percent of their CCDF awards (“quality funds”) (for years 1-2, 3-4, and 5+ after the 2014 CCDBG enactment, respectively; see 128 STAT. 1987) on activities to improve the quality of child care services provided in the state, including:
    • 1B: Supporting the training and professional development of the child care workforce through…incorporating the effective use of data to guide program improvement (see 128 STAT. 1988)
    • 3: Developing, implementing, or enhancing a quality rating system for child care providers and services, which may support and assess the quality of child care providers in the State (A) and be designed to improve the quality of different types of child care providers (C) (see 128 STAT. 1988)
    • 7: Evaluating and assessing the quality and effectiveness of child care programs and services offered in the State, including evaluating how such programs positively impact children (see 128 STAT. 1990)
  • ACF requires all CCDF lead agencies to annually report on how their CCDF quality funds were expended, including the activities funded and the measures used by states and territories to evaluate progress in improving the quality of child care programs and services. ACF released a Program Instruction for state and territorial lead agencies to provide guidance on reporting the authorized activities for the use of quality funds.
  • ACF also provides evaluation technical assistance to:
    • support grantees who are conducting their own local evaluations (MIECHV grantees (in collaboration with HRSA), Tribal MIECHV grantees, and ACF staff directly supporting Section 1115 child support demonstration grantees); and
    • build the evaluation capacity of grantees (e.g., the TANF Data Innovation Project, the Tribal Early Childhood Research Center, and the Center for States).
Score
3
Administration for Community Living (HHS)
9.1 What were the agency’s five largest non-competitive programs and their appropriation amounts (and are city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in its largest five non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • Authorizing legislation for ACL’s largest non-competitive grant programs makes consideration of evidence-based programming a condition of funding. The Developmental Disabilities Assistance and Bill of Rights Act of 2000 allows for the withholding of funding if “(1) the Council or agency has failed to comply substantially with any of the provisions required by section 124 to be included in the State plan, particularly provisions required by paragraphs (4)(A) and (5)(B)(vii) of section 124(c), or with any of the provisions required by section 125(b)(3); or (2) the Council or agency has failed to comply substantially with any regulations of the Secretary that are applicable.” As a condition of funding, non-competitive grantees are required to “determine the extent to which each goal of the Council was achieved for that year” and report that information to ACL.
  • States that receive Older Americans Act Home and Community-Based Supportive Services Title III-D funds are required to spend those funds on evidence-based programs to improve health and well-being and reduce disease and injury. In order to receive funding, states must utilize programs that meet ACL’s definition of evidence-based or are defined as evidence-based by another HHS operating division. Under the Older Americans Act, caregiver support programs are required to track and report on their use of evidence-based caregiver support services.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • FY12 Congressional appropriations included an evidence-based requirement for the first time: OAA Title III-D funding may be used only for programs and activities demonstrated to be evidence-based. Consistent with the Administrator’s focus on identifying new ways to efficiently improve direct service programs, ACL is using its 1% Nutrition authority to fund $3.5 million for nutrition innovations and to test ways to modernize how meals are provided to a changing senior population. One promising demonstration that has drawn widespread attention, carried out by the Georgia State University Research Foundation (entitled Double Blind Randomized Control Trial on the Effect of Evidence-Based Suicide Intervention Training on the Home-Delivered and Congregate Nutrition Program through the Atlanta Regional Commission), trains volunteers who deliver home-delivered meals to recognize and report indicators of suicidal intent and other mental health issues so that they can be addressed.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs in FY20 (besides its five largest grant programs)?
  • The 2020 reauthorization of the Older Americans Act requires that Assistive technology programs are “aligned with evidence-based practice;” that person-centered, trauma informed programs “incorporate evidence-based practices based on knowledge about the role of trauma in trauma victims’ lives;” and that a newly authorized Research, Demonstration, and Evaluation Center for the Aging Network increases “the repository of information on evidence based programs and interventions available to the aging network, which information shall be applicable to existing programs and interventions, and help in the development of new evidence-based programs and interventions.”
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • Since 2017, ACL has awarded Innovations in Nutrition grants to 11 organizations to develop and expand evidence-based approaches to enhance the quality and effectiveness of nutrition programming. ACL is currently overseeing five grantees for innovative projects that will enhance the quality, effectiveness, and outcomes of nutrition services programs provided by the national aging services network. The grants total $1,197,205 for this year with a two-year project period. Through this grant program, ACL aims to identify innovative and promising practices that can be scaled across the country and to increase the use of evidence-informed practices within nutrition programs.
  • In 2020, ACL expects to award grants for demonstrations in Innovations in Nutrition Programs and Services to support the documentation of innovative projects that enhance the quality, effectiveness, and other proven outcomes of nutrition services programs within the aging services network.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • All funding opportunity announcements published by ACL include language about generating and reporting evidence of progress toward the specific goals set for the funds. Grantee manuals include information about the importance of and requirements for evaluation (see the Administration on Aging: Title VI Resource Manual). The National Ombudsman Resource Center, funded by ACL, provides self-evaluation materials for Long-Term Care Ombudsman Programs (LTCOP) funded under Title VII of the Older Americans Act.
Score
7
U.S. Agency for International Development
  • USAID does not administer non-competitive grant programs (the relative score for Criterion #8 is applied).
Score
3
AmeriCorps
9.1 What were the agency’s five largest non-competitive programs and their appropriation amounts (and are city, county, and/or state governments eligible to receive funds from these programs)?
  • In FY20, the five largest non-competitive grant programs were:
  1. AmeriCorps State formula grants program ($142,892,106; eligible grantees: states);
  2. AmeriCorps National Civilian Community Corps (NCCC) ($32.5 million; eligible grantees: nonprofit organizations);
  3. AmeriCorps VISTA ($93 million; eligible grantees: nonprofit organizations; state, tribal, and local governments; institutions of higher education);
  4. Senior Corps Foster Grandparents ($118 million; eligible grantees: nonprofit organizations, local governments);
  5. Senior Corps Senior Companion Program ($50 million; eligible grantees: nonprofit organizations, local governments).
9.2 Did the agency use evidence of effectiveness to allocate funds in its largest five non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • In FY18, Senior Corps Foster Grandparents and Senior Companion Program embedded evidence into their grant renewal processes by offering supplemental funding, “augmentation grants,” to grantees interested in deploying volunteers to serve in evidence-based programs. More than $3.3 million of Senior Corps program dollars were allocated, over three years, toward new evidence-based programming augmentations. Grantees will be operating with their augmentations through fiscal year 2021.
  • In a survey completed in FY20, Senior Corps grantees reported that 4,043 volunteer stations and 20,320 volunteers (10% of all volunteers) were engaged in evidence-based programming.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • In FY19, Senior Corps completed an evaluation with an independent firm to produce case studies and comparative analyses of select grantees that received an evidence-based programming augmentation to understand successes, challenges, and other issues. This report is being used to inform Senior Corps’ approach to replicating this augmentation initiative, as well as the training/technical assistance needs of grantees.
  • Senior Corps and the Administration for Community Living have continued a dialogue about how to build and broaden the evidence base for various programs designed for older adults, particularly for aging and disability evidence-based programs and practices. AmeriCorps previously utilized ACL’s list of evidence-based programs for its augmentation grants and is encouraging Senior Corps grantees to move toward more evidence-based programming.
  • For FY20, Senior Corps continued funding five demonstration grants, totaling $2,579,475, which authorize organizations to implement the Senior Corps program model with certain modifications to standard AmeriCorps policies. Demonstration grants allow Senior Corps to analyze potential policy changes.
  • AmeriCorps NCCC invested in a Service Project Database that provides staff access to data on all NCCC projects completed since 2012. The database thematically organizes projects, classifies project frameworks, and categorizes the outcomes of these service initiatives. NCCC is also investing in an evaluation of NCCC’s impact. This research project, initiated in FY18, focuses on evaluating member retention, studying how NCCC develops leadership skills in its members and teams, and assessing the program’s ability to strengthen communities. Finally, NCCC will continue to invest in research grants to better understand the outcomes of its disaster response efforts.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs in FY20 (besides its five largest grant programs)?
  • CNCS administers only five non-competitive grant programs, as described above.
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • Senior Corps and the Office of Research and Evaluation completed a longitudinal evaluation of the Foster Grandparents and Senior Companion Programs in FY19 that demonstrated the positive health outcomes associated with volunteering. A 50-year retrospective review of the research conducted on Senior Corps programs was completed at the end of FY19 and was posted on the Evidence Exchange in FY20.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • AmeriCorps does not prohibit the use of formula dollars for evaluation, but each State Commission may have its own guidelines. Further, formula grantees receiving over $500,000 must conduct evaluations using their grant funds.
Score
7
U.S. Department of Education
9.1 What were the agency’s five largest non-competitive programs and their appropriation amounts (and are city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in its five largest non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • ED worked with Congress in FY16 to ensure that evidence played a major role in ED’s large non-competitive grant programs in the reauthorized ESEA. As a result, section 1003 of ESSA requires states to set aside at least 7% of their Title I, Part A funds for a range of activities to help school districts improve low-performing schools. School districts and individual schools are required to create action plans that include “evidence-based” interventions that demonstrate strong, moderate, or promising levels of evidence.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • ESEA requires a National Assessment of Title I (Improving the Academic Achievement of the Disadvantaged). In addition, Title I grants require state education agencies to report on school performance, including for those schools identified for comprehensive or targeted support and improvement.
  • Federal law (ESEA) requires states receiving funds from 21st Century Community Learning Centers to “evaluate the effectiveness of programs and activities” that are carried out with federal funds (section 4203(a)(14)), and it requires local recipients of those funds to conduct periodic evaluations in conjunction with the state evaluation (section 4205(b)).
  • The Office of Special Education Programs (OSEP), the implementing office for IDEA grants to states, has revised its accountability system to shift the balance from a system focused primarily on compliance to one that puts more emphasis on results through the use of Results Driven Accountability.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs in FY20 (besides its five largest grant programs)?
  • Section 4108 of ESEA authorizes school districts to invest “safe and healthy students” funds in Pay for Success initiatives. Section 1424 of ESEA authorizes school districts to invest their Title I, Part D funds (Prevention and Intervention Programs for Children and Youth Who are Neglected, Delinquent, or At-Risk) in Pay for Success initiatives; under section 1415 of the same program, a State agency may use funds for Pay for Success initiatives.
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • States and school districts are beginning to implement the requirements in Title I of the ESEA regarding using evidence-based interventions in school improvement plans. Some States are providing training or practice guides to help schools and districts identify evidence-based practices.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • In 2016, ED released non-regulatory guidance to provide state educational agencies, local educational agencies (LEAs), schools, educators, and partner organizations with information to assist them in selecting and using “evidence-based” activities, strategies, and interventions, as defined by ESSA, including carrying out evaluations to “examine and reflect” on how interventions are working. However, the guidance does not specify that federal non-competitive funds can be used to conduct such evaluations.
Score
4
U.S. Dept. of Housing & Urban Development
9.1 What were the agency’s five largest non-competitive programs and their appropriation amounts (and are city, county, and/or state governments eligible to receive funds from these programs)?
  • In FY20, the five largest non-competitive grant programs were:
  1. Public Housing Operating Fund ($4.55 billion; eligible applicants: public housing authorities);
  2. Public Housing Capital Grants ($2.87 billion; eligible applicants: public housing authorities);
  3. Housing Choice Voucher (HCV) Administrative Fees ($1.98 billion; eligible applicants: public housing agencies that administer Housing Choice Vouchers);
  4. Community Development Block Grant Entitlement/Non-Entitlement ($3.43 billion; eligible applicants: entitlement cities and counties and state allocation agencies);
  5. HOME Investment Partnerships ($1.35 billion; eligible applicants: participating jurisdictions).
9.2 Did the agency use evidence of effectiveness to allocate funds in its five largest non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • Although the funding formulas are prescribed in statute, evidence-based interventions are central to each program. HUD used evidence from a 2015 Administrative Fee study of the costs that high-performing PHAs incur in administering an HCV program to propose a new FY17 approach for funding Administrative Fees while strengthening PHA incentives to improve HCV outcomes by providing tenant mobility counseling.
  • HUD’s funding of public housing is being radically shifted through the evidence-based Rental Assistance Demonstration (RAD), which enables accessing private capital to address the $26 billion backlog of capital needs funding. Based on demonstrated success of RAD, for FY20 HUD proposed to transfer $95 million from the Operating Fund and Capital Fund to the Tenant-Based Rental Assistance fund to support RAD conversions. For FY21 HUD is proposing to remove the cap on the number of public housing developments to be converted to Section 8 contracts. HUD is beginning to evaluate RAD’s impacts on children. HUD is also conducting a Rent Reform demonstration and a Moving To Work (MTW) demonstration to test efficiencies of changing rent rules and effects on tenant outcomes.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • Evidence-building is central to HUD’s funding approach through the use of prospective program demonstrations. These include the Public Housing Operating Fund’s Rental Assistance Demonstration (RAD), the Public Housing Capital Grants’ Rent Reform demonstration, and the Housing Choice Voucher program’s Moving To Work (MTW) demonstration grants. As Congress moved to expand MTW flexibilities to additional public housing authorities (PHAs), HUD sought authority to randomly assign cohorts of PHAs in order to rigorously test specific program innovations.
  • Program funds are provided to operate demonstrations through the HCV account, Tenant-Based Rental Assistance. These include the Tribal HUD-VA Supportive Housing (Tribal HUD-VASH) demonstration of providing permanent supportive housing to Native American veterans and the FSS-Family Unification Program demonstration that tests the effect of providing vouchers to at-risk young adults who are aging out of foster care.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs in FY20 (besides its five largest)?
  • HUD-Veterans Affairs Supportive Housing (HUD-VASH) vouchers are allocated in part on the administrative performance of housing agencies as measured by their past utilization of HUD-VASH vouchers in HUD’s Voucher Management System (Notice PIH-2019-15 (HA)). The performance information helps ensure that eligible recipients are actually able to lease units with the vouchers that HUD funds. The HUD-VASH Exit Study documented that 87,864 VASH vouchers were in circulation in April 2017, contributing substantially to the 47-percent decline in the number of homeless Veterans since 2010.
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • To address a severe backlog of capital needs funding for the nation’s public housing stock, the Rental Assistance Demonstration was authorized in 2011 to convert the properties to project-based Section 8 contracts to attract an infusion of private capital. The 2019 final report on the RAD evaluation showed that conversions successfully raised $12.6 billion of funding, an average of $121,747 per unit to improve physical quality and stabilize project finances. Based on the program’s successes, the limit on the number of public housing conversions was increased to 455,000 units in 2018, nearly half of the stock, and HUD has been proposing to eliminate the cap. Additionally, HUD extended the conversion opportunity to legacy multifamily programs through RAD 2.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • Communities receiving HUD block grant funding through Community Development Block Grants, HOME block grants, and other programs are required to consult local stakeholders, conduct housing needs assessments, and develop needs-driven Consolidated Plans to guide their activities. They then provide Consolidated Annual Performance and Evaluation Reports (CAPERs) to document progress toward their Consolidated Plan goals in a way that supports continued community involvement in evaluating program efforts.
  • HUD’s Community Development Block Grant program, which provides formula grants to entitlement jurisdictions, increases local evaluation capacity. Specifically, federal regulations (24 CFR 570.200) authorize CDBG recipients (including city and state governments) to use up to 20% of their CDBG allocations for administration and planning costs that may include evaluation-capacity building efforts and evaluations of their CDBG-funded interventions (as defined in 570.205 and 570.206).
Score
7
U.S. Department of Labor
9.1 What were the agency’s five largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
  • In FY20, the five largest non-competitive grant programs at DOL are in the Employment and Training Administration:
    1. Adult Employment and Training Activities ($845,000,000; eligible grantees: city, county, and/or state governments);
    2. Youth Activities ($903,416,000; eligible grantees: city, county, and/or state governments);
    3. Dislocated Worker Employment and Training activities ($1,040,860,000; eligible grantees: city, county, and/or state governments);
    4. UI State Administration ($2,137,945,000; eligible grantees: city, county, and/or state governments);
    5. Employment Security grants to States ($663,052,000; eligible grantees: city, county, and/or state governments).
9.2 Did the agency use evidence of effectiveness to allocate funds in its five largest non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • A signature feature of the Workforce Innovation and Opportunity Act (WIOA) (Pub. L. 113-128), is its focus on the use of data and evidence to improve services and outcomes, particularly in provisions related to states’ role in conducting evaluations and research, as well as in requirements regarding data collection, performance standards, and state planning. Conducting evaluations is a required statewide activity, but there are additional requirements regarding coordination (with other state agencies and federal evaluations under WIOA), dissemination, and provision of data and other information for Federal evaluations.
  • WIOA’s evidence and performance provisions: (1) increased the amount of WIOA funds states can set aside and distribute directly from 5-10% to 15% and authorized them to invest these funds in Pay for Performance initiatives; (2) authorized states to invest their own workforce development funds, as well as non-federal resources, in Pay for Performance initiatives; (3) authorized local workforce investment boards to invest up to 10% of their WIOA funds in Pay for Performance initiatives; and (4) authorized states and local workforce investment boards to award Pay for Performance contracts to intermediaries, community based organizations, and community colleges.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • Section 116(e) of WIOA describes how the state, in coordination with local workforce boards and state agencies that administer the programs, shall conduct ongoing evaluations of activities carried out in the state under these state programs. These evaluations are intended to promote, establish, implement, and utilize methods for continuously improving core program activities in order to achieve high-level performance within, and high-level outcomes from, the workforce development system.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs in FY20 (besides its five largest)?
  • Reemployment Services and Eligibility Assessments (RESEA) funds must be used for interventions or service delivery strategies demonstrated to reduce the average number of weeks of unemployment insurance a participant receives by improving employment outcomes. The law provides for a phased implementation of the new program requirements over several years. In FY19, DOL awarded $130 million to states to conduct RESEA programs that met these evidence of effectiveness requirements. Beginning in FY23, states must also use no less than 25% of RESEA grant funds for interventions with a high or moderate causal evidence rating that show a demonstrated capacity to improve outcomes for participants; this percentage increases in subsequent years until after FY26, when states must use no less than 50 percent of such grant funds for such interventions.
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • Institutional Analysis of American Job Centers: the goal of the evaluation was to understand and systematically document the institutional characteristics of American Job Centers (AJCs), and to identify variations in service delivery, organization structure, and administration across AJCs.
  • Career Pathways Descriptive and Analytical Study: WIOA requires DOL to “conduct a multistate study to develop, implement, and build upon career advancement models and practices for low-wage healthcare providers or providers of early education and child care.” In response, DOL conducted the Career Pathways Design Study to develop evaluation design options that could address critical gaps in knowledge related to the approach, implementation, and success of career pathways strategies generally, and in early care and education specifically. The Chief Evaluation Office (CEO) has recently begun the second iteration of this study. The purpose of this project is to build on the evaluation design work CEO completed in 2018, to build evidence about the implementation and effectiveness of career pathways approaches, and to meet the WIOA statutory requirement to conduct a career pathways study. It will include a meta-analysis of existing impact evaluation results and will examine how workers advance through multiple, progressively higher levels of education and training, and associated jobs, within a pathway over time, as well as the factors associated with their success.
  • Analysis of Employer Performance Measurement Approaches: the goal of the study was to examine the appropriateness, reliability, and validity of proposed measures of effectiveness in serving employers required under WIOA. It included knowledge development to understand and document the state of the field, an analysis and comparative assessment of measurement approaches and metrics, and the dissemination of findings through a report, as well as research and topical briefs.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • The Employment & Training Administration’s (ETA) RESEA grantees may use up to 10% of their grant funds for evaluations of their programs. ETA released specific evaluation guidance to help states understand how to conduct or cause to conduct evaluations of their RESEA interventions with these grant funds. The goal of the agency guidance, along with the evaluation technical assistance being provided to states and their partners, is to build states’ capacity to understand, use, and build evidence.
  • Section 116 of WIOA establishes performance accountability indicators and performance reporting requirements to assess the effectiveness of states and local areas in achieving positive outcomes for individuals served by the workforce development system’s core programs. Section 116(e) of WIOA requires states to “employ the most rigorous analytical and statistical methods that are reasonably feasible, such as the use of control groups” and requires that states evaluate the effectiveness of their WIOA programs in an annual progress report which includes updates on (1) current or planned evaluation and related research projects, including methodologies used; (2) efforts to coordinate the development of evaluation and research projects with WIOA core programs, other state agencies, and local boards; (3) a list of completed evaluation and related reports with publicly accessible links to such reports; (4) efforts to provide data, survey responses, and timely visits for Federal evaluations; and (5) any continuous improvement strategies utilizing results from studies and evidence-based practices evaluated. States are permitted to use WIOA grant funds to perform the necessary performance monitoring and evaluations to complete this report.
Score
10
Millennium Challenge Corporation
  • MCC does not administer non-competitive grant programs (relative score for criterion #8 applied).
Score
5
Substance Abuse and Mental Health Services Administration
9.1 What were the agency’s five largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in its five largest non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • In FY20, Congress maintained the 10 percent set-aside for evidence-based programs in SAMHSA’s Mental Health Block Grant (MHBG) to address the needs of individuals with early serious mental illness, including psychotic disorders, regardless of the age of the individual at onset (see p. 48 of the FY20-FY21 Block Grant Application). In the FY21 budget request (p. 348), SAMHSA proposed to cut the set-aside in half, to 5%.
  • The FY20-FY21 Block Grant Application requires states seeking Mental Health Block Grant (MHBG) and Substance Abuse Prevention and Treatment Block Grant (SABG) funds to identify specific priorities. For each priority, states must identify the relevant goals, measurable objectives, and at least one performance indicator for each objective, which must include strategies to deliver evidence-based individualized treatment plans (p. 21); evidence-based interventions for substance use or dependence (p. 21); building provider capacity to deliver evidence-based, trauma-specific interventions (p. 22); evidence-based programs, policies, and practices in prevention efforts (p. 22); and evidence-based models to prevent substance misuse (p. 23).
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • The FY20-FY21 Block Grant Application requires states applying for Substance Abuse Prevention and Treatment funds to create an evaluation plan, which must include at least five specified evaluation elements. Additionally, the application specifies that SAMHSA will work with the National Institute of Mental Health (NIMH) to plan for program evaluation and data collection related to demonstrating program effectiveness of the Mental Health Block Grant.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant program (besides its five largest grant programs)?
  • No examples available.
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • No examples available.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • The FY20-FY21 Block Grant Application clarified that “Section 1921 of the PHS [Public Health Services] Act (42 U.S.C.§ 300x-21) authorizes the States to obligate and expend SABG [Substance Abuse Prevention and Treatment Block Grant] funds to plan, carry out and evaluate activities and services designed to prevent and treat substance use disorders” (p. 16). The Application further clarifies that states “may utilize SABG funds to train personnel to conduct fidelity assessments of evidence-based practices” (p. 35).