2020 Federal Standard of Excellence


U.S. Department of Labor

Score
9
Leadership

Did the agency have senior staff members with the authority, staff, and budget to build and use evidence to inform the agency’s major policy and program decisions in FY20?

1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer (or equivalent)? (Example: Evidence Act 313)
  • The Chief Evaluation Officer serves as the U.S. Department of Labor's (DOL) Evaluation Officer. The Chief Evaluation Officer leads DOL's Chief Evaluation Office (CEO), housed within the Office of the Assistant Secretary for Policy (OASP), and coordinates Department-wide evaluations, drawing on office staff and leadership to interpret research and evaluation findings and identify their implications for programmatic and policy decisions.
  • CEO is directly appropriated $8.04 million and may receive up to 0.75% of funds from statutorily specified program accounts, at the discretion of the Secretary. In FY19, that set-aside was 0.03% of funds, or $3.3 million, bringing the spending total to $11.34 million. The FY20 amount is not yet known because the Secretary has not determined the set-aside.
  • CEO includes nine full-time staff plus a small number of contractors and one to two detailees at any given time. This staff level is augmented by research and evaluation units in other DOL agencies, such as the Employment and Training Administration (ETA), which has nine FTEs dedicated to research and evaluation activities and coordinates extensively with CEO on the development of a learning agenda, management of studies, and dissemination of results.
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s Chief Data Officer (or equivalent)? (Example: Evidence Act 202(e))
  • The Department has designated a Chief Data Officer. The Chief Data Officer chairs DOL's data governance body and leads data governance, open data, and related efforts to collect, manage, and use data in ways that support program administration and foster data-informed decision-making and policymaking.
  • DOL has arranged for temporary staffing to support governance and open data efforts as well as compliance with the Evidence Act, the Federal Data Strategy, and DOL’s data governance goals. DOL is in the process of hiring permanent staff to support the office through customized position descriptions.
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, and other related officials in order to inform policy decisions and evaluate the agency’s major programs?
  • DOL, through a Secretary's Order, has created a structure that coordinates and leverages the key roles within the organization to accomplish objectives such as those in the Evidence Act. The Secretary's Order mandates collaboration among the Chief Data Officer, Chief Performance Officer, Chief Evaluation Officer, Chief Information Officer, and Chief Statistical Officer.
  • The Secretary's Order mandates a collaborative approach to reviewing IT infrastructure and data asset accessibility; developing modern solutions for managing, disseminating, and generating data; coordinating statistical functions; supporting evaluation, research, and evidence generation; and supporting all aspects of performance management, including assurances that data are fit for purpose.
  • DOL continues to leverage existing governance structures: the Chief Evaluation Officer plays a role in the formation of the annual budget requests of DOL's agencies, makes recommendations on including evidence in grant competitions, and provides technical assistance to Department leadership to ensure that evidence informs policy design. Several mechanisms facilitate this work: the Chief Evaluation Officer traditionally participates in quarterly performance meetings with DOL leadership and the Performance Management Center (PMC); the Chief Evaluation Officer reviews agency operating plans and works with agencies and the PMC to coordinate performance targets, measures, and evaluation findings; quarterly meetings are held with agency leadership and staff as part of the Learning Agenda process; and meetings are held as needed to strategize around new priorities or legislative requirements.
Score
7
Evaluation & Research

Did the agency have an evaluation policy, evaluation plan, and learning agenda (evidence-building plan), and did it publicly release the findings of all completed program evaluations in FY20?

2.1 Did the agency have an agency-wide evaluation policy? (Example: Evidence Act 313(d))
  • DOL has an Evaluation Policy that formalizes the principles that govern all program evaluations in the Department, including methodological rigor, independence, transparency, ethics, and relevance. The policy represents a commitment to using evidence from evaluations to inform policy and practice.
2.2 Did the agency have an agency-wide evaluation plan? (Example: Evidence Act 312(b))
  • The Chief Evaluation Office (CEO) develops, implements, and publicly releases an annual DOL evaluation plan. The evaluation plan is based on the agency learning agendas as well as the Department's Strategic Plan priorities, statutory requirements for evaluations, and Secretarial and Administration priorities. The evaluation plan includes the studies CEO intends to undertake in the next year using set-aside dollars. Appropriations language requires the Chief Evaluation Officer to submit a plan to the U.S. Senate and House Committees on Appropriations outlining the evaluations that will be carried out by the office using dollars transferred to CEO; the DOL evaluation plan serves that purpose. The evaluation plan outlines evaluations that CEO will use its budget to undertake. CEO also works with agencies to undertake evaluations and evidence-building strategies to answer other questions of interest identified in learning agendas but not undertaken directly by CEO.
2.3 Did the agency have a learning agenda (evidence-building plan) and did the learning agenda describe the agency’s process for engaging stakeholders including, but not limited to the general public, state and local governments, and researchers/academics in the development of that agenda? (Example: Evidence Act 312)
  • In FY20, the Department is developing its annual evaluation plan, building from individual agencies' learning agendas to create a combined document. DOL has leveraged its existing practices and infrastructure to develop the broad, four-year prospective research agenda, or Evidence-Building Plan, per the Evidence Act requirement. Both documents will outline the process for internal and external stakeholder engagement.
2.4 Did the agency publicly release all completed program evaluations?
  • All DOL program evaluation reports and findings funded by the CEO are publicly released and posted on the complete reports section of the CEO website. DOL agencies, such as the Employment & Training Administration (ETA), also post and release their own research and evaluation reports. CEO is also in the process of ramping up additional methods of communicating and disseminating CEO-funded studies and findings, and published its first quarterly newsletter in September 2020.
2.5 What is the coverage, quality, methods, effectiveness, and independence of the agency’s evaluation, research, and analysis efforts? (Example: Evidence Act 315, subchapter II (c)(3)(9))
  • DOL's Evaluation Policy affirms the agency's commitment to high-quality, methodologically rigorous research through the funding of independent research activities. Further, CEO staff have expertise in research and evaluation methods as well as in DOL programs and policies and the populations they serve. CEO also convenes technical working groups, whose members have deep technical and subject-matter expertise, on the majority of evaluation projects. CEO has leveraged the FY20 learning agenda process to create an interim Capacity Assessment, per Evidence Act requirements, and will conduct a more detailed assessment of individual agencies' capacity, as well as DOL's overall capacity, in these areas for publication in 2022.
2.6 Did the agency use rigorous evaluation methods, including random assignment studies, for research and evaluation purposes?
  • DOL employs a full range of evaluation methods to answer key research questions of interest, including, when appropriate, impact evaluations. Among DOL's active portfolio of approximately 50 projects, study types range from rigorous evidence syntheses to implementation studies, quasi-experimental outcome studies, and impact studies. Examples of current DOL studies with a random assignment component include an evaluation of a Job Corps demonstration pilot, the Cascades Job Corps College and Career Academy. An example of a multi-arm randomized controlled trial is the Reemployment Eligibility Assessments evaluation, which assesses a range of strategies to reduce Unemployment Insurance duration and improve wage outcomes.
Score
6
Resources

Did the agency invest at least 1% of program funds in evaluations in FY20? (Examples: Impact studies; implementation studies; rapid cycle evaluations; evaluation technical assistance, rigorous evaluations, including random assignments)

3.1 ____ (Name of agency) invested $____ on evaluations, evaluation technical assistance, and evaluation capacity-building, representing __% of the agency’s $___ billion FY20 budget.
  • The Department of Labor invested $11.34 million on evaluations, evaluation technical assistance, and evaluation capacity-building, representing 0.1% of the agency's $10.9 billion discretionary budget in FY20.
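For context, the 0.1% figure follows directly from the two amounts reported above; a quick arithmetic check, using the dollar figures as given:

```latex
% Share of the FY20 discretionary budget invested in evaluation (figures as reported in 3.1)
\[
  \frac{\$11.34\ \text{million}}{\$10.9\ \text{billion}}
  \;=\;
  \frac{11.34 \times 10^{6}}{10.9 \times 10^{9}}
  \;\approx\; 0.00104
  \;\approx\; 0.1\%
\]
% i.e., roughly one-tenth of the 1% benchmark referenced in the question for this section.
```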
3.2 Did the agency have a budget for evaluation and how much was it? (Were there any changes in this budget from the previous fiscal year?)
  • CEO is directly appropriated $8.04 million and may receive up to 0.75% of funds from statutorily specified program accounts, at the discretion of the Secretary. In FY19, that set-aside was 0.03% of funds, or $3.3 million, bringing the spending total to $11.34 million. The FY20 amount is not yet known because the Secretary has not determined the set-aside. CEO also collaborates with DOL program offices and other federal agencies on additional evaluations carried out by other offices and/or supported by funds appropriated to other agencies or programs. In FY19, CEO oversaw approximately $9.94 million in evaluation and evidence-building activities; in FY18, approximately $21 million; and in FY17, an estimated $40 million in evaluation funding.
  • This amount represents only the dollars that are directly appropriated or transferred to CEO. Additionally, many DOL evaluations and research studies are supported by funds appropriated to DOL programs and/or are carried out by other offices within DOL. In some programs, such as the America's Promise grant evaluation and the Reentry Grant Evaluation, evaluation set-asides exceed 1% (2.9% and 2.8%, respectively, for these programs).
3.3 Did the agency provide financial and other resources to help city, county, and state governments or other grantees build their evaluation capacity (including technical assistance funds for data and evidence capacity building)?
  • Grantees and programs that participate in DOL evaluations receive technical assistance related to evaluation activities and implementation, such as through the Evaluation and Research Hub (EvalHub). DOL agencies, like ETA, are also making a concerted effort to help states and local areas build evaluation capacity to meet the program evaluation requirements of the Workforce Innovation and Opportunity Act and Reemployment Services and Eligibility Assessment (RESEA) through tools such as RESEA program evaluation technical assistance (RESEA EvalTA). A suite of evaluation technical assistance resources is being developed throughout FY20, including webinars and other tools and templates to help states understand, build, and use evidence. DOL's evaluation technical assistance webinar series for states has been posted online to the RESEA community of practice. This series will ultimately hold 11 webinars. To date, most of the posted webinars have been viewed by the field between 2,000 and 4,000 times. Additional RESEA EvalTA products are being developed and will be posted on the RESEA community of practice, the DOL Chief Evaluation Office's website, and in CLEAR, as appropriate.
Score
10
Performance Management / Continuous Improvement

Did the agency implement a performance management system with outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY20?
(Example: Performance stat systems, frequent outcomes-focused data-informed meetings)

4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • Using a performance stat reporting and dashboard system linked to component agencies' annual operating plans, PMC coordinates quarterly reviews of each agency's program performance by the Deputy Secretary to analyze progress and identify opportunities for performance improvements. Learning agendas, updated annually by DOL agencies in collaboration with DOL's CEO, include program performance themes and priorities for analysis needed to refine performance measures and identify strategies for improving performance. The annual Strategic Reviews with leadership include specific discussions about improving performance and findings from recent evaluations that suggest opportunities for improvement.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • DOL leverages a variety of continuous learning tools, including the learning agenda approach to conceptualize and make progress on substantive learning goals for the agency, as well as DOL’s Performance Management Center’s (PMC) Continuous Process Improvement (CPI) Program, which supports agencies in efforts to gain operational efficiencies and improve performance. The program directs customized process improvement projects throughout the department and grows the cadre of CPI practitioners through Lean Six Sigma training.
Score
5
Data

Did the agency collect, analyze, share, and use high-quality administrative and survey data – consistent with strong privacy protections – to improve (or help other entities improve) outcomes, cost-effectiveness, and/or the performance of federal, state, local, and other service providers programs in FY20?
(Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; open data policies; data-use policies)

5.1 Did the agency have a strategic data plan, including an open data policy? (Example: Evidence Act 202(c), Strategic Information Resources Plan)
  • DOL’s open government plan was last updated in 2016, and subsequent updates are being considered after the formal release of the Federal Data Strategy and the Evidence Act.
  • DOL also has open data assets aimed at developers and researchers who desire data-as-a-service through application programming interfaces hosted by both the Office of Public Affairs and the Bureau of Labor Statistics (BLS); an illustrative request against the BLS API appears after this list. Each of these has clear documentation, is consistent with the open data policy, and offers transparent, repeatable, machine-readable access to data on an as-needed basis. The Department is currently developing a new API v3 that will expand the open data offerings, extend the capabilities, and offer a suite of user-friendly tools.
  • The Department has consistently sought to make as much data about its activities available to the public as possible. Examples include DOL's Public Enforcement Database, which makes available records of activity from the worker protection agencies, and the Office of Labor Management Standards' online public disclosure room.
  • The Department also has multiple restricted-use access systems which go beyond what would be possible with simple open-data efforts. BLS has a confidential researcher access program, offering access under appropriate conditions to sensitive data. Similarly, the Chief Evaluation Office (CEO) has stood up a centralized research hub for evaluation study partners to leverage sensitive data in a consistent manner to help make evidence generation more efficient.
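As an illustration of the data-as-a-service access described in this list, the sketch below pulls one published series from the BLS public timeseries API. It is a minimal example, not DOL documentation: the v1 endpoint and the response layout reflect BLS's publicly documented API, the series ID (the seasonally adjusted unemployment rate) is just an example, and current API versions, registration keys, and rate limits should be checked at api.bls.gov.

```python
"""Minimal sketch: fetching one published series from the BLS public data API.

Assumptions to verify against BLS documentation: the v1 timeseries endpoint
(no registration key) and the Results -> series -> data response layout.
"""
import json
import urllib.request

SERIES_ID = "LNS14000000"  # example series: seasonally adjusted unemployment rate
URL = f"https://api.bls.gov/publicAPI/v1/timeseries/data/{SERIES_ID}"

with urllib.request.urlopen(URL, timeout=30) as resp:
    payload = json.load(resp)

# Walk the documented response structure and print the most recent observations.
for series in payload.get("Results", {}).get("series", []):
    for obs in series.get("data", [])[:6]:
        print(obs["year"], obs["periodName"], obs["value"])
```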
5.2 Did the agency have an updated comprehensive data inventory? (Example: Evidence Act 3511)
  • The Department has conducted extensive inventories over the last ten years, in part to support common activities such as IT modernization, White House Office of Management and Budget (OMB) data calls, and the general goal of transparency through data sharing. These form the current basis of DOL's planning and administration. Some sections of the Evidence Act have led to a different federal posture with respect to data, such as the requirement that data be open by default and considered shareable absent a legal requirement not to do so, or unless release of the data would pose a disclosure risk. The Department is currently re-evaluating its inventories and its public data offerings in light of this specific requirement and revisiting the issue across all its programs. Because this is a critical prerequisite to developing open data plans, as well as data governance and data strategy frameworks, the agency hopes to have a revised inventory completed by the end of FY19.
5.3 Did the agency promote data access or data linkage for evaluation, evidence-building, or program improvement? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c)) 
  • DOL's CEO, Employment & Training Administration (ETA), and Veterans Employment & Training Service (VETS) have worked with the U.S. Department of Health and Human Services (HHS) to develop a secure mechanism for obtaining and analyzing earnings data from the Directory of New Hires. In the past year, DOL has entered into interagency data-sharing agreements with HHS and obtained data to support 10 job training and employment program evaluations.
  • During FY20, the Department continued to expand efforts to improve the quality of and access to data for evaluation and performance analysis through the Data Analytics Unit in CEO, and through new pilots beginning in BLS to access and exchange state labor market and earnings data for statistical and evaluation purposes.
  • The Data Analytics unit also continued to leverage its Data Exchange and Analysis Platform (DEAP), with high processing capacity and privacy provisions, to share, link, and analyze program data for recently separated veterans, public workforce outcomes, and sensitive worker protection data such as complaint filings (a simplified illustration of this kind of linkage appears below). This work helps to identify trends and patterns in the data that become the foundation for future program improvements. The analysis also creates a feedback loop that can improve data quality and allows for inquiries to determine whether the data from the program are appropriate to support more rigorous performance and evaluation approaches.
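To make the linkage step concrete, here is a simplified, hypothetical sketch of the kind of privacy-conscious join described above: two de-identified administrative extracts are linked on a salted hash of a shared identifier rather than on the raw identifier itself. The column names, salt handling, and records are invented for illustration; this is not a description of DEAP's actual architecture or safeguards.

```python
"""Hypothetical sketch: linking two administrative extracts on a salted hash
of a shared identifier (illustrative only; not DEAP's actual design)."""
import hashlib
import pandas as pd

SALT = "agency-held-secret"  # in practice, held and rotated by the data steward

def pseudonymize(raw_id: str) -> str:
    """Replace a direct identifier with a salted SHA-256 digest."""
    return hashlib.sha256((SALT + raw_id).encode("utf-8")).hexdigest()

# Invented example records standing in for program and wage extracts.
program = pd.DataFrame({"person_id": ["A-001", "A-002"],
                        "program_exit_quarter": ["2019Q4", "2020Q1"]})
wages = pd.DataFrame({"person_id": ["A-001", "A-003"],
                      "quarterly_earnings": [8500, 9200]})

# Drop the direct identifier from each extract and keep only the hashed key.
for df in (program, wages):
    df["link_key"] = df.pop("person_id").map(pseudonymize)

linked = program.merge(wages, on="link_key", how="left")
print(linked)
```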
5.4 Did the agency have policies and procedures to secure data and protect personal, confidential information? (Example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
  • DOL's ETA has agreements with 50 states, the District of Columbia, and Puerto Rico for data sharing and exchange of interstate wage data for performance accountability purposes. Currently, ETA is finalizing an updated data-sharing agreement with states that will facilitate better access to quarterly wage data by states for purposes of performance accountability and the research and evaluation requirements under the Workforce Innovation and Opportunity Act (WIOA). This work aims to expand access to interstate wage data for the U.S. Department of Education's Adult Education and Family Literacy Act (AEFLA) programs and Vocational Rehabilitation programs, among others.
  • ETA continues to fund and provide technical assistance to states under the Workforce Data Quality Initiative to link earnings and workforce data with education data longitudinally to support state program administration and evaluation. ETA and VETS also have modified state workforce program reporting system requirements to include data items for a larger set of grant programs, which will improve access to administrative data for evaluation and performance management purposes.
Score
9
Common Evidence Standards / What Works Designations

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding purposes; did that framework prioritize rigorous research and evaluation methods; and did the agency disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY20?
(Example: What Works Clearinghouses)

6.1 Did the agency have a common evidence framework for research and evaluation purposes?
  • DOL's Clearinghouse for Labor Evaluation and Research (CLEAR) evidence guidelines, which describe quality standards for different types of studies, are applied to all independent evaluations, including all third-party evaluations of DOL programs determined eligible for CLEAR's evidence reviews across different topic areas. Requests for proposals also indicate that these CLEAR standards should be applied to all Chief Evaluation Office (CEO) evaluations when considering which designs are the most rigorous and appropriate to answer specific research questions.
  • In addition, the DOL Evaluation Policy sets principles and standards for evaluation planning and dissemination. DOL also collaborates with other agencies (the U.S. Department of Health and Human Services (HHS), the U.S. Department of Education's Institute of Education Sciences (IES), the National Science Foundation (NSF), and the Corporation for National and Community Service (CNCS)) to develop technological procedures to link and share reviews across clearinghouses.
6.2 Did the agency have a common evidence framework for funding decisions?
  • DOL uses the CLEAR evidence guidelines and standards to make decisions about discretionary program grants awarded using evidence-informed or evidence-based criteria. The published guidelines and standards are used to identify evidence-based programs and practices and to review studies to assess the strength of their causal evidence or to do a structured evidence review in a particular topic area or timeframe to help inform agencies what strategies appear promising and where gaps exist.
6.3 Did the agency have a user friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
  • DOL’s CLEAR is an online evidence clearinghouse. CLEAR’s goal is to make research on labor topics more accessible to practitioners, policymakers, researchers, and the public more broadly, so that it can inform their decisions about labor policies and programs. CLEAR identifies and summarizes many types of research, including descriptive statistical studies and outcome analyses, implementation studies, and causal impact studies. For causal impact studies, CLEAR assesses the strength of the design and methodology in studies that look at the effectiveness of particular policies and programs. CLEAR’s study summaries and icons, found in each topic area, can help users quickly and easily understand what studies found and how much confidence to have in the results.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
  • DOL promotes the utilization of evidence-based practices in a variety of ways. For example, the Employment & Training Administration (ETA) maintains a user-friendly technical assistance tool that promotes state and local service providers' use of evidence-based interventions through Workforce System Strategies, a comprehensive database of over 1,000 profiles that summarize a wide range of findings from reports, studies, technical assistance tools, and guides that support program administration and improvement. Additionally, recognizing that research over the past four decades has found that subsidized on-the-job training strategies like apprenticeship improve participants' employment and earnings outcomes, DOL has awarded or announced several apprenticeship grant opportunities this fiscal year in addition to the State Apprenticeship Expansion Grants awarded in 2018. These include ETA's Scaling Apprenticeship Through Sector-Based Strategies and Apprenticeships: Closing the Skills Gap opportunities and the Women's Bureau's Women in Apprenticeship and Nontraditional Occupations grant program.
Score
5
Innovation

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY20?
(Examples: Prizes and challenges; behavioral science trials; innovation labs/accelerators; performance partnership pilots; demonstration projects or waivers with rigorous evaluation requirements)

7.1 Did the agency engage leadership and staff in its innovation efforts to improve the impact of its programs?
  • DOL's Chief Data Officer and Chief Evaluation Office (CEO) Data Analytics team developed a secure data analysis platform accessible to all DOL staff, pre-loaded with common statistical packages and offering the capability to access and merge various administrative data for analysis. DOL also supports staff in executing unlimited web-based A/B testing and other behaviorally informed trials through the shared service of the Granicus platform's GovDelivery communications tool, including free technical support (a simplified analysis of such a two-arm test is sketched below). This tool enhances the Department's ability to communicate with the public, such as through targeted email campaigns, and to adjust these communications, informed by testing and data, to increase engagement on relevant topics. CEO also has developed toolkits and detailed resources to help staff design behaviorally informed tests effectively.
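As a concrete illustration of the A/B testing capability described above, the sketch below compares open rates between two hypothetical email arms using a standard two-proportion z-test. The counts are invented and the test choice is a generic one; an actual DOL trial would follow CEO's toolkits and its own analysis plan.

```python
"""Illustrative analysis of a two-arm (A/B) email test using a two-proportion
z-test (normal approximation). Counts are hypothetical."""
import math

def two_proportion_ztest(successes_a: int, n_a: int, successes_b: int, n_b: int):
    """Return open rates, z statistic, and two-sided p-value."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability
    return p_a, p_b, z, p_value

# Hypothetical results: arm A (current subject line) vs. arm B (revised wording).
p_a, p_b, z, p = two_proportion_ztest(successes_a=420, n_a=5000,
                                      successes_b=495, n_b=5000)
print(f"open rate A={p_a:.1%}, open rate B={p_b:.1%}, z={z:.2f}, p={p:.4f}")
```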
7.2 Did the agency have policies, processes, structures, or programs to promote innovation to improve the impact of its programs?
  • The CEO uses a variety of communication tools to share rigorous research results, lessons learned, promising practices, and other implications of its research. These include internal briefings from independent contractors and researchers, a brownbag series that features evidence-based promising practices and results shared by DOL staff, for DOL staff, and an external expert seminar series featuring new findings or innovations in relevant areas of work. CEO staff consistently use research findings in the development of new research, and DOL agencies use findings to design and guide new discretionary grant programs, to refine performance measures for grantees, and to make decisions on compliance and enforcement practices.
  • DOL is strongly committed to promoting innovation in its policies and practices. For example, the Employment & Training Administration's (ETA) competitive funding routinely supports innovative programming, since grantees typically bundle various program services and components to best meet the needs of the people they serve in their local contexts. A particularly good example of this innovation is the Administration's high-priority area of apprenticeships. DOL is funding $150 million to support sector-based innovations in apprenticeship. DOL has invested more than $95 million through the ApprenticeshipUSA initiative, a national campaign bringing together a broad range of stakeholders, including employers, labor, and states as well as education and workforce partners, to expand and diversify registered apprenticeships in the United States. This includes more than $60 million for state-led strategies to grow and diversify apprenticeship, and state Accelerator Grants to help integrate apprenticeship into education and workforce systems; engage industry and other partners to expand apprenticeship to new sectors and new populations at scale; conduct outreach and work with employers to start new programs; promote greater inclusion and diversity in apprenticeship; and develop statewide and regional strategies aimed at building state capacity to support new apprenticeship programs. All of these grants include funding for data collection; additionally, ETA and CEO are conducting an evaluation of the American Apprenticeship Initiative.
  • In addition, CEO’s Behavioral Insights team works with a number of DOL agencies on a continuous basis to identify and assess the feasibility of conducting studies where insights from behavioral science can be used to improve the performance and outcomes of DOL programs. The Wage and Hour Division’s (WHD) Transformation Team is one such example where continuous improvement efforts are driving innovation. Their work has identified potential areas where behavioral interventions and trials may inform program improvement. CEO is also working across agencies – including WHD, ETA, Women’s Bureau, Veterans Employment & Training Service (VETS), Office of Federal Contract Compliance Programs (OFCCP), and International Labor Affairs Bureau (ILAB) – to identify and assess the feasibility of other areas where insights from behavioral science can be used to improve the performance and outcomes of DOL programs.
  • DOL has also built capacity for staff innovation through the Performance Management Center’s Continuous Process Improvement (CPI) Program, an agency-wide opportunity which trains and certifies agency staff on Lean Six Sigma (LSS) methodologies through real-time execution of DOL process improvement projects. The program includes classroom sessions that prepare participants for LSS Black Belt certification examinations, including the American Society for Quality (ASQ) as well as DOL’s own certification.
7.3 Did the agency evaluate its innovation efforts, including using rigorous methods?
  • DOL, through the annual Learning Agenda process, systematically identifies gaps in the use of evidence. Innovation is about filling known gaps via dissemination, further research, or generation of quick turnaround assessments, like those offered to the Department by CEO’s Behavioral Insights Program.
  • DOL typically couples innovation with rigorous evaluation to learn from experiments. For example, DOL is participating in the Performance Partnership Pilots (P3) for innovative service delivery to disconnected youth, which includes not only waivers and the blending and braiding of federal funds but also bonus points in application reviews for proposing "high tier" evaluations. DOL is the lead agency for the evaluation of P3. A final report is available on CEO's completed studies website.
  • DOL routinely uses Job Corps' demonstration authority to test and evaluate innovative and promising models to improve outcomes for youth. Currently, CEO is sponsoring a rigorous impact evaluation to examine the effectiveness of one of these pilots, the Cascades Job Corps College and Career Academy, with results expected in FY21.
Score
6
Use of Evidence in Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its competitive grant programs in FY20? (Examples: Tiered-evidence frameworks; evidence-based funding set-asides; priority preference points or other preference scoring for evidence; Pay for Success provisions)

8.1 What were the agency’s 5 largest competitive programs and their appropriations amount (and were city, county, and/or state governments eligible to receive funds from these programs)?
  • In FY20, the five largest competitive programs and their appropriation amounts were:
    1. Scaling Apprenticeship Through Sector-Based Strategies ($183,800,000; eligible grantees: public/private partnerships where lead applicants are institutions of higher education (IHE) representing a consortium of IHEs, or a state system of higher education, such as a community college system office or single state higher education board);
    2. Expanding Opportunity Through Industry Recognized Apprenticeship Programs (IRAP) ($150,000,000; expected Funding Opportunity Announcement (FOA) release FY19);
    3. Apprenticeships: Closing the Skills Gap ($100,000,000; eligible grantees: public/private partnerships where lead applicants are IHEs or an IHE representing a consortium of IHEs, a state system of higher education, such as a community college system office or a single state higher educational board, a nonprofit trade, industry, or employer association, labor unions, or labor-management organizations);
    4. Reentry Projects ($82,000,000; eligible grantees: non-profit organizations, state or local governments, Indian and Native American entities eligible for grants under Section 166 of the Workforce Innovation and Opportunity Act (WIOA)); and
    5. YouthBuild ($80,000,000; eligible grantees: private non-profit or public agencies including community and faith-based organizations, local workforce development boards or one-stop center partner programs, educational institutions, community action agencies, state or local housing development agencies, any Indian and Native American entity eligible for grants under Section 166 of WIOA, community development corporations, state or local youth service conservation corps, and any other public or private non-profit entity that is eligible to provide education or employment training under a federal program and can meet the required elements of the grant).
8.2 Did the agency use evidence of effectiveness to allocate funds in its five largest competitive grant programs? (e.g., Were evidence-based interventions/practices required or suggested? Was evidence a significant requirement?)
  • The Reentry Projects funding opportunity provides up to eight points (out of 100) for past performance. Grant applicants must specifically provide information on their performance goals. The application states, “[a]ll applicants must specifically address the placement in education or employment and certificate/degree attainment outcomes.”
  • The Employment & Training Administration’s (ETA) YouthBuild applicants are also awarded points based on past performance (a possible 28 points out of 100), viewing these metrics as important to demonstrating successful career outcomes for youth.
8.3 Did the agency use its five largest competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • All five of DOL’s largest grant programs have been or will be involved in evaluations designed by the Chief Evaluation Office (CEO) and the relevant DOL agencies. In each case DOL required or encouraged (through language in the funding announcement and proposal review criteria) grantees to use evidence-based models or strategies in grant interventions and/or to participate in an evaluation, especially to test new interventions that theory or research suggest are promising.
  • For example, DOL will conduct an implementation evaluation of the Sector Based Apprenticeship Program. This evaluation will also include the development of an impact evaluation design options paper that identifies areas of opportunity for testing the impacts of apprenticeship strategies on employment or other outcomes. The objective of this study is to identify innovative and promising models, practices, and partnership strategies to expand apprenticeship opportunities in high-growth occupations and industries to build the evidence on apprenticeship. There are options for more rigorous evaluations in the contract as appropriate.
  • Additionally, DOL currently has an evaluation underway of the Reentry Projects grant program. The Reentry Projects grant program used a tiered evidence framework to require applicants to propose evidence-based and informed interventions, or new interventions that theory or research suggests are promising, (or a combination of both) that lead to increased employment outcomes for their target populations and must frame their goals and objectives to address this issue. The evaluation will identify and evaluate promising practices used in reentry employment programs through both an implementation and impact study among select grantees to understand their effectiveness in improving participant outcomes such as employment, earnings, and recidivism.
8.4 Did the agency use evidence of effectiveness to allocate funds in any other competitive grant programs (besides its five largest grant programs)?
  • DOL includes requirements for demonstrated effectiveness in the allocation of funds, as well as commitments to building new evidence as a condition of receiving funds; both carry equal weight because many DOL-funded programs lack a sufficient body of evidence to support funding only interventions that are already evidence-based. For example, among current Employment & Training Administration (ETA) competitive grant programs, this has involved requiring: (1) a demonstration that an approach is evidence-based or promising for receipt of funds (i.e., the Reentry Funding Opportunity Announcement) or for the potential to receive additional funds (i.e., TechHire); (2) an independent third-party local or grantee evaluation with priority incentives for rigorous designs (e.g., tiered funding, scoring priorities, bonus scoring for evidence-based interventions or multi-site rigorous tests); or (3) full participation in an evaluation as well as rigorous grantee (or local) evaluations. Additionally, applicants for the Bureau of International Labor Affairs' (ILAB) competitive funding opportunities are required to conduct and/or participate in evaluations as a condition of award.
8.5 What are the agency’s 1-2 strongest examples of how competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • In 2015, DOL funded an evaluation of the 36-month Linking Employment Activities Pre-Release (LEAP) program, which included an implementation study of LEAP pilot programs that provided jail-based American Job Centers (AJCs) to individuals preparing to re-enter society after time in jail. The findings of the evaluation identified many promising practices for offering both pre- and post-release services and were published in 2018 (see the Final Report and Issue Brief Compendium). In 2020, DOL funded a 42-month Pathway Home Pilot Project and accompanying evaluation that builds on lessons learned from the LEAP program by providing workforce services to incarcerated individuals pre- and post-release. For example, the requirement in the Pathway Home grant that participants maintain the same caseworker pre- and post-release was suggested as a promising practice in the LEAP Implementation Study.
  • DOL funded a national evaluation of the Trade Adjustment Assistance Community College and Career Training (TAACCCT) grant program, which was a $1.9 billion initiative consisting of four rounds of grants, from 2011 to 2018. The grants were awarded to institutions of higher education (mainly community colleges) to build their capacity to provide workforce education and training programs. The implementation study assessed the grantees’ implementation of strategies to better connect and integrate education and workforce systems, address employer needs, and transform training programs and services to adult learners. The synthesis identified key implementation and impact findings based on a review of evaluation reports completed by grantees’ third-party evaluators. The outcomes study examined the training, employment, earnings, and self-sufficiency outcomes of nearly 2,800 participants from nine grants in Round 4. Findings from these studies provide evidence-based practices and insights that are being applied to the new Strengthening Community College Initiative Funding Opportunity Announcement, as well as future DOL investments.
8.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • DOL has a formal Evaluation Policy. Guidance on using funds to conduct and/or participate in program evaluations and/or to strengthen evaluation capacity-building efforts can be found in each grant funding opportunity and is a condition of each grant. The "Special Program Requirements" section of the respective grant funding opportunity notifies grantees of this responsibility. Generally, this section states: "As a condition of grant award, grantees are required to participate in an evaluation, if undertaken by DOL. The evaluation may include an implementation assessment across grantees, an impact and/or outcomes analysis of all or selected sites within or across grantees, and a benefit/cost analysis or assessment of return on investment. Conducting an impact analysis could involve random assignment (which involves random assignment of eligible participants into a treatment group that would receive program services or enhanced program services, or into control group(s) that would receive no program services or program services that are not enhanced). We may require applicants to collect data elements to aid the evaluation. As a part of the evaluation, as a condition of award, grantees must agree to: (1) make records available to the evaluation contractor on participants, employers, and funding; (2) provide access to program operating personnel, participants, and operational and financial records, and any other pertaining documents to calculate program costs and benefits; (3) in the case of an impact analysis, facilitate the assignment by lottery of participants to program services (including the possible increased recruitment of potential participants); and (4) follow evaluation procedures as specified by the evaluation contractor under the direction of DOL, including after the period of operation. After award, grantees will receive detailed guidance on ETA's evaluation methodology, including requirements for data collection. Grantees will receive technical assistance to support their participation in these activities."
Score
7
Use of Evidence in Non-Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its non-competitive grant programs in FY20?
(Examples: Evidence-based funding set-asides; requirements to invest funds in evidence-based activities; Pay for Success provisions)

9.1 What were the agency's five largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
  • In FY20, the five largest non-competitive grant programs at DOL were in the Employment and Training Administration:
    1. Adult Employment and Training Activities ($845,000,000; eligible grantees: city, county, and/or state governments);
    2. Youth Activities ($903,416,000; eligible grantees: city, county, and/or state governments);
    3. Dislocated Worker Employment and Training activities ($1,040,860,000; eligible grantees: city, county, and/or state governments);
    4. UI State Administration ($2,137,945,000; eligible grantees: city, county, and/or state governments);
    5. Employment Security grants to States ($663,052,000; eligible grantees: city, county, and/or state governments).
9.2 Did the agency use evidence of effectiveness to allocate funds in its five largest non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • A signature feature of the Workforce Innovation and Opportunity Act (WIOA) (Pub. L. 113-128) is its focus on the use of data and evidence to improve services and outcomes, particularly in provisions related to states' role in conducting evaluations and research, as well as in requirements regarding data collection, performance standards, and state planning. Conducting evaluations is a required statewide activity, and there are additional requirements regarding coordination (with other state agencies and with federal evaluations under WIOA), dissemination, and provision of data and other information for federal evaluations.
  • WIOA’s evidence and performance provisions: (1) increased the amount of WIOA funds states can set aside and distribute directly from 5-10% to 15% and authorized them to invest these funds in Pay for Performance initiatives; (2) authorized states to invest their own workforce development funds, as well as non-federal resources, in Pay for Performance initiatives; (3) authorized local workforce investment boards to invest up to 10% of their WIOA funds in Pay for Performance initiatives; and (4) authorized states and local workforce investment boards to award Pay for Performance contracts to intermediaries, community based organizations, and community colleges.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • Section 116(e) of WIOA describes how the state, in coordination with local workforce boards and state agencies that administer the programs, shall conduct ongoing evaluations of activities carried out in the state under these state programs. These evaluations are intended to promote, establish, implement, and utilize methods for continuously improving core program activities in order to achieve high-level programs within, and high-level outcomes from, the workforce development system.
9.4 Did the agency use evidence of effectiveness to allocate funds in any non-competitive grant program?
  • Reemployment Services and Eligibility Assessments (RESEA) funds must be used for interventions or service delivery strategies demonstrated to reduce the average number of weeks of unemployment insurance a participant receives by improving employment outcomes. The law provides for a phased implementation of the new program requirements over several years. In FY19, DOL awarded $130 million to states to conduct RESEA programs that met these evidence of effectiveness requirements. Beginning in FY23, states must also use no less than 25% of RESEA grant funds for interventions with a high or moderate causal evidence rating that show a demonstrated capacity to improve outcomes for participants; this percentage increases in subsequent years until after FY26, when states must use no less than 50 percent of such grant funds for such interventions.
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • Institutional Analysis of American Job Centers: the goal of the evaluation was to understand and systematically document the institutional characteristics of American Job Centers (AJCs), and to identify variations in service delivery, organization structure, and administration across AJCs.
  • Career Pathways Descriptive and Analytical Study: WIOA requires DOL to “conduct a multistate study to develop, implement, and build upon career advancement models and practices for low-wage healthcare providers or providers of early education and child care.” In response, DOL conducted the Career Pathways Design Study to develop evaluation design options that could address critical gaps in knowledge related to the approach, implementation, and success of career pathways strategies generally, and in early care and education specifically. The Chief Evaluation Office (CEO) has recently begun the second iteration of this study. The purpose of this project is to build on the evaluation design work CEO completed in 2018 to build evidence about the implementation and effectiveness of career pathways approaches and meet the WIOA statutory requirement to conduct a career pathways study. It will include a meta-analysis of existing impact evaluation results as well as examine how workers advance through multiple, progressively higher levels of education and training, and associated jobs, within a pathway over time, and the factors associated with their success.
  • Analysis of Employer Performance Measurement Approaches: the goal of the study was to examine the appropriateness, reliability and validity of proposed measures of effectiveness in serving employers required under WIOA. It included knowledge development to understand and document the state of the field, an analysis and comparative assessment of measurement approaches and metrics, and the dissemination of findings through a report, as well as research and topical briefs.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • The Employment & Training Administration’s (ETA) RESEA grantees may use up to 10% of their grant funds for evaluations of their programs. ETA released specific evaluation guidance to help states understand how to conduct or cause to conduct evaluations of their RESEA interventions with these grant funds. The goal of the agency guidance, along with the evaluation technical assistance being provided to states and their partners, is to build states’ capacity to understand, use, and build evidence.
  • Section 116 of WIOA establishes performance accountability indicators and performance reporting requirements to assess the effectiveness of states and local areas in achieving positive outcomes for individuals served by the workforce development system's core programs. Section 116(e) of WIOA requires states to "employ the most rigorous analytical and statistical methods that are reasonably feasible, such as the use of control groups" and requires that states evaluate the effectiveness of their WIOA programs in an annual progress report, which includes updates on (1) current or planned evaluation and related research projects, including methodologies used; (2) efforts to coordinate the development of evaluation and research projects with WIOA core programs, other state agencies, and local boards; (3) a list of completed evaluation and related reports with publicly accessible links to such reports; (4) efforts to provide data, survey responses, and timely visits for federal evaluations; and (5) any continuous improvement strategies utilizing results from studies and evidence-based practices evaluated. States are permitted to use WIOA grant funds to perform the necessary performance monitoring and evaluations to complete this report.
Score
4
Repurpose for Results

In FY20, did the agency shift funds away from or within any practice, policy, or program that consistently failed to achieve desired outcomes?
(Examples: Requiring low-performing grantees to re-compete for funding; removing ineffective interventions from allowable use of grant funds; incentivizing or urging grant applicants to stop using ineffective practices in funding announcements; proposing the elimination of ineffective programs through annual budget requests; incentivizing well-designed trials to fill specific knowledge gaps; supporting low-performing grantees through mentoring, improvement plans, and other forms of assistance; using rigorous evaluation results to shift funds away from a program)

10.1 Did the agency have policy(ies) for determining when to shift funds away from grantees, practices, policies, interventions, and/or programs that consistently failed to achieve desired outcomes, and did the agency act on that policy?
  • The Employment & Training Administration's (ETA) prospective YouthBuild and Job Corps grant applicants are selected, in part, based on past performance. These programs consider an entity's past performance and demonstrated effectiveness in achieving critical outcomes for youth. For Job Corps reform, the Department's FY21 budget request also proposes new legislative flexibilities that would enable the Department to more expediently close low-performing centers, target the program to groups more likely to benefit, and make the capital investments necessary to ensure successful pilot programs. These reforms would save money and improve results by eliminating ineffective centers and finding better ways to educate and provide skills instruction to youth.
  • Reforming Job Corps provides an example of such efforts to repurpose resources based upon a rigorous analysis of available data. As part of this reform effort, DOL’s FY20 budget request ends the Department of Agriculture’s (USDA) involvement in the program, unifying responsibility in DOL. Workforce development is not a core USDA role, and the 25 centers it operates are overrepresented in the lowest performing cohort of centers.
  • A rigorous 2012 evaluation of the Trade Adjustment Assistance (TAA) program demonstrated that workers who participated in the program had lower earnings than the comparison group at the end of a four-year follow-up period, in part because they were more likely to participate in long-term job training programs rather than immediately reentering the workforce. However, this training was not targeted to in-demand industries and occupations, and, as found in Mathematica's evaluation of the TAA program, only 37% of participants became employed in the occupations for which they trained. In the FY21 budget request, the Department addresses these issues by continuing to propose a reauthorization of the TAA program that focuses on apprenticeship, on-the-job training, and other earn-as-you-learn strategies that ensure participants are training for relevant occupations.
  • DOL’s FY20 budget request eliminates funding for the Senior Community Service Employment Program (SCSEP). SCSEP has a goal of transitioning half of participants into unsubsidized employment within the first quarter after exiting the program, but has struggled to achieve even this modest goal.
10.2 Did the agency identify and provide support to agency programs or grantees that failed to achieve desired outcomes?
  • The Department's Employment and Training Administration sponsors WorkforceGPS, a community point of access that supports workforce development professionals in their use of evaluations to improve state and local workforce systems. Professionals can access a variety of resources and tools, including a learning cohort community to help leaders improve their research and evaluation capacities. WorkforceGPS includes links to resources on assessment readiness, evaluation design, and performance data, all focused on improving the public workforce system.