2020 Federal Standard of Excellence


Leadership

Did the agency have senior staff members with the authority, staff, and budget to build and use evidence to inform the agency’s major policy and program decisions in FY20?

Score
9
Administration for Children and Families (HHS)
1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer (or equivalent)? (Example: Evidence Act 313)
  • The Deputy Assistant Secretary for Planning, Research, and Evaluation serves as the Administration for Children and Families' (ACF) Chief Evaluation Officer. The Deputy Assistant Secretary oversees ACF's Office of Planning, Research, and Evaluation (OPRE), which supports evaluation and other learning activities across the agency, and a research and evaluation budget of approximately $180 million in FY20. OPRE has 68 federal staff positions; OPRE staff are experts in research and evaluation methods and data analysis as well as ACF programs, policies, and the populations they serve. In August 2019, the Department of Health and Human Services' (HHS) Assistant Secretary for Planning and Evaluation was named the Chief Evaluation Officer of HHS.
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s Chief Data Officer (or equivalent)? (Example: Evidence Act 202(e))
  • The HHS Chief Information Officer serves as the HHS Chief Data Officer, having been named acting Chief Data Officer of HHS in August 2019. In September 2019, the Assistant Secretary for Children and Families designated the Deputy Assistant Secretary for Planning, Research, and Evaluation as the primary ACF member to serve on the HHS Data Council, the body responsible for advising the HHS Chief Data Officer on implementation of Evidence Act activities across HHS.
  • Additionally, in 2016, ACF established a new Division of Data and Improvement (DDI), providing federal leadership and resources to improve the quality, use, and sharing of ACF data. The Director of DDI reports to the Deputy Assistant Secretary for Planning, Research, and Evaluation and oversees work to improve the quality, usefulness, interoperability, and availability of data and to address issues related to privacy, data security, and data sharing. DDI has 12 federal staff positions and an FY20 budget of approximately $6.4 million (not including salaries).
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, performance improvement officer, and other related officials in order to support, improve, and evaluate the agency’s major programs?
  • As of September 2019, ACF’s Deputy Assistant Secretary for Planning, Research, and Evaluation serves as the primary ACF representative to HHS’ Leadership Council, Data Council, and Evidence and Evaluation Council — the HHS bodies responsible for implementing Evidence Act activities across HHS. These cross-agency councils meet regularly to discuss agency-specific needs and experiences and to collaboratively develop guidance for department-wide action.
  • Within ACF, the 2016 reorganization that created the Division of Data and Improvement (DDI) endowed ACF’s Deputy Assistant Secretary for Planning, Research, and Evaluation with oversight of the agency’s strategic planning; performance measurement and management; research and evaluation; statistical policy and program analysis; synthesis and dissemination of research and evaluation findings; data quality, usefulness, and sharing; and application of emerging technologies to improve the effectiveness of programs and service delivery. ACF reviews program office performance measures and associated data three times per year in sync with the budget process; OPRE has traditionally worked with ACF program offices to develop research plans on an annual basis and has worked to integrate the development of program-specific learning agendas into this process. In addition, OPRE holds regular and ad hoc meetings with ACF program offices to discuss research and evaluation findings, as well as other data topics.
Score
9
Administration for Community Living (HHS)
1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer (or equivalent)? (Example: Evidence Act 313)
  • The Director of the Office of Performance and Evaluation (OPE) serves as the Administration for Community Living (ACL) evaluation officer. OPE, which oversees the agency's performance and evaluation work, has six full-time staff positions and three full-time onsite contractors; in FY20 it had a budget of approximately $10.1 million. The Director of OPE has the education, skill, and experience to meet the Evaluation Officer requirements listed in the Evidence Act. The Director routinely gauges the coverage, quality, methods, consistency, effectiveness, independence, and balance of the agency's portfolio of evaluations, policy research, and ongoing evaluation activities, and assesses agency capacity to support the development and use of evaluation. The Director is also the designated ACL Performance Officer.
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s Chief Data Officer (or equivalent)? (Example: Evidence Act 202(e))
  • The Director of the Office of Performance and Evaluation (OPE) serves as the Administration for Community Living's Chief Data Officer. OPE, which oversees the agency's performance and evaluation work, has six full-time staff positions and three full-time onsite contractors; in FY20 it had a budget of approximately $10.1 million. The Director of OPE leads ACL's Data Governance Body, including facilitating collaborative activities among the numerous actors with responsibilities and needs for data within the agency, and has demonstrated training and experience in data management, governance, collection, analysis, protection, use, and dissemination. The Director fulfills the aspects of this role that are relevant to ACL, which include coordinating with ACL's CIO and Chief Privacy Officer on the use, protection, dissemination, and generation of data to ensure that the data needs of the agency are met; ensuring that agency data conform with data management best practices; engaging agency employees, the public, and contractors in using public data assets; and encouraging collaborative approaches to improving data use. The Director of OPE also serves as the agency liaison to other federal entities, for example by serving as the ACL representative to the HHS Data Council and serving on the Federal Interagency Council on Evaluation Policy as well as the HHS Data Governance Board.
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, and other related officials in order to inform policy decisions and evaluate the agency’s major programs?
  • The Director of ACL’s Office of Performance and Evaluation serves the functions of evaluation officer, chief data officer, and performance officer. To coordinate activities relevant to these positions, the OPE Director and staff coordinate the support, improvement, and evaluation of agency programs through implementation of an agency performance strategy, a learning agenda, an annual agency-wide evaluation plan, and additional long-range and evaluation plans for the Administration on Aging (in development) and the National Institute on Disability, Independent Living, and Rehabilitation Research. The structure requires semi-annual meetings with ACL leadership and management staff and annual consultation with all program managers. In FY19, ACL instituted a council to improve ACL’s data governance and quality, including the development of improved processes and standards for defining, collecting, reviewing, certifying, analyzing, and presenting data that ACL collects through its evaluations, grant reporting, and other administrative data collections. Taken together, this robust governance structure ensures cohesive collection and use of evidence across ACL regarding program performance, evaluation, and improvement, and ensures that data are gathered, processed, and curated so as to produce evidence that program staff and agency leadership use for program and operational improvement. As an operating division without a statistical unit, ACL does not have a statistical officer.
Score
9
U.S. Agency for International Development
1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer (or equivalent)? (Example: Evidence Act 313)
  • The Director of the Office of Learning, Evaluation, and Research (LER) serves as the USAID evaluation officer. In compliance with the Foundations for Evidence-Based Policymaking Act, the Administrator of USAID designated the Agency’s Evaluation Officer (AEO) through an internal Executive Message that was shared with the Agency on June 4, 2019.
  • USAID’s AEO works in conjunction with the Office of Learning, Evaluation, and Research (LER) in the Bureau for Policy, Planning, and Learning (PPL) to help the Agency build a body of evidence from which to learn and adapt programs. The LER Director is a senior staff member with the authority, staff, and budget to ensure agency evaluation requirements are met, including that all projects are evaluated at some level, and that decision-making is informed by evaluation and evidence. The LER Director oversaw approximately 25 staff and an estimated $6.6 million budget in FY2019.
  • USAID has proposed creating a Bureau for Policy, Resources, and Performance (PRP), which will align policy, resources and evidence-based programming, and elevate the evaluation function by creating an Office for Learning and Evaluation that will manage the Agency’s Evaluation Policy. The office will also create and update the Agency Learning and Evaluation Plans, and commission or conduct cross-cutting evaluations. If approved by Congress, the estimated timeline for establishing the bureau is approximately a year and a half. In the meantime, working groups for each new office are developing work plans and focus areas for the new bureau to ensure PRP will be able to meet its mandate.
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s Chief Data Officer (or equivalent)? (Example: Evidence Act 202(e))
  • USAID has a designated Chief Data Officer (CDO), who reports to the Chief Information Officer in the Bureau for Management. In compliance with the Foundations for Evidence-Based Policymaking Act, the Administrator of USAID re-affirmed the designation of the Chief Data Officer through an internal Executive Message that was shared with the Agency on June 4, 2019. The CDO manages the USAID Data Services team, which focuses exclusively on improving the usage of data and information to ensure the Agency’s development outcomes are supported and enhanced by evidence. The CDO’s team includes several direct-hire data science and IT professionals along with a budget for contract professionals who provide a comprehensive portfolio of data services in support of the Agency’s mission. The CDO oversaw approximately 80 staff and an estimated $11.7 million budget in 2020. The CDO is a senior career civil servant, and the USAID Data Services team is regularly called upon to generate products and services to support the Agency’s highest priorities. USAID also invests in roles including the Chief Innovation Officer, Chief Geographer, Chief Economist, Chief Scientist, and other key roles that drive the use of evidence across the agency.
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, and other related officials in order to inform policy decisions and evaluate the agency’s major programs?
  • The Agency currently uses several governance structures and processes and will update these in accordance with OMB guidance related to the Foundations for Evidence-Based Policymaking Act. Notable examples include:
    1. Data Board: In September 2019, USAID established a Data Administration and Technical Advisory (DATA) Board, as mandated by the Foundations for Evidence-Based Policymaking Act of 2018 (Evidence Act) and subsequent guidance from the Office of Management and Budget (OMB) in Memoranda M-19-18 and M-19-23. The DATA Board acts as USAID’s data governance body. It serves as a central venue for seeking input from Agency stakeholders regarding data-related priorities and best practices to support Agency objectives, and it informs data-related policy, procedures, and standards for the Agency. The DATA Board supports the work of the Agency Evaluation Officer by directing data services to facilitate evaluations. In addition to the Agency Evaluation Officer, Chief Data Officer, and Statistical Official, its membership includes the Performance Improvement Officer, the Chief Financial Officer, the Chief Technology Officer, the Senior Accountable Official for Privacy, and the USAID Geographer, as well as representation from across the Agency. The USAID Chief Data Officer, Agency Evaluation Officer, and Statistical Official confer monthly to coordinate policy and activities.
    2. Management Operations Council: USAID also uses a Management Operations Council (MOC) as the platform for Agency leadership to assess progress toward achieving the strategic objectives in USAID’s Strategic Plan and cross-agency priority goals, as well as additional management issues. Established in 2014, the MOC provides Agency-wide leadership for initiatives and investments to reform USAID business systems and operations worldwide. The MOC also provides a platform for senior leaders to learn about and discuss improving organizational performance, efficiency, and effectiveness. The Assistant Administrator for the Bureau for Management and the Agency’s Chief Operating Officer co-chair the MOC. Membership includes, among others, all the Agency’s Chief Executive Officers (e.g., the Senior Procurement Executive, Chief Human Capital Officer, Chief Financial Officer, Chief Information Officer, Performance Improvement Officer, and Project Management Improvement Officer). Depending on the agenda, it also includes the Chief Data Officer, Agency Evaluation Officer, and the Agency Senior Statistical Official.
    3. Weekly/Monthly Meetings between the Chief Data Officer, Chief Evaluation Officer, and Statistical Official: USAID established a standing meeting between the three officials named in the Evidence Act to coordinate on mandatory actions and milestones, evaluate resource requirements, and reconcile any potential discrepancies. The meeting includes leadership from the Office of Learning, Evaluation, and Research, which manages Agency requirements on performance monitoring, evaluation, and organizational learning. Because this meeting pre-dated the first meetings of the Chief Data Officer and Chief Evaluation Officer councils, it was critical for information sharing and addressing priorities.
    4. Privacy Council Meetings: USAID holds monthly Privacy Council meetings to address necessary actions and raise any privacy and confidentiality concerns. Representation includes the Senior Agency Official for Privacy, the Agency Statistical Official, and the Chief Privacy Officer, among others.
Score
7
AmeriCorps
1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer (or equivalent)? (Example: Evidence Act 313)
  • The Director of the Office of Research and Evaluation (R&E) serves as the AmeriCorps evaluation officer. The Director oversees R&E’s FY20 budget of $4 million and a staff of eight. On average, the agency has invested approximately $1 million per year in Office of Research and Evaluation staff over the past eight years.
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s Chief Data Officer (or equivalent)? (Example: Evidence Act 202(e))
  • AmeriCorps hired a new Chief Information Officer (CIO) in FY19. The CIO was appointed by the agency’s CEO as the Acting Chief Data Officer (CDO) and remains the Acting CDO in FY20. The agency has a long-term plan for hiring a CDO and standing up a department overseen by a permanent Chief Data Officer. The plan will likely be formalized in FY21.
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, and other related officials in order to inform policy decisions and evaluate the agency’s major programs?
  • AmeriCorps established a Research & Evaluation Council in FY20; the Council has been approved and is undergoing internal clearance. The AmeriCorps Executive Team will appoint members from the agency’s departments to serve on this Council, and a charter will be developed. Members of the Council will likely include the Director of R&E, the CIO/Acting CDO, and the Chief of Staff, as well as representatives from the Chief of Program Operations and the Chief Operating Officer.
Score
9
U.S. Department of Education
1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer (or equivalent)? (Example: Evidence Act 313)
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s Chief Data Officer (or equivalent)? (Example: Evidence Act 202(e))
  • USED has a designated Chief Data Officer (CDO). The Office of Planning, Evaluation and Policy Development’s (OPEPD) Office of the Chief Data Officer (OCDO) has a staff of 18 and is actively hiring additional staff. The Evidence Act provides a framework for OCDO’s responsibilities, which include lifecycle data management and developing and enforcing data governance policies. The OCDO has oversight over ED’s information collections approval and associated OMB clearance process. It is responsible for developing and enforcing ED’s open data plan, including management of a centralized comprehensive data inventory accounting for all data assets across ED. The OCDO is also responsible for developing and maintaining a technological and analytical infrastructure that is responsive to ED’s strategic data needs, exploiting traditional and emerging analytical methods to improve decision-making, optimize outcomes, and create efficiencies. These activities are carried out by the Governance and Strategy Division, which focuses on data governance, lifecycle data management, and open data; and the Analytics and Support Division, which provides data analytics and infrastructure responsive to ED’s strategic data needs. The current OCDO budget reflects the importance of these activities to ED leadership, with S&E funding allocated for data governance, data analytics, open data, and information clearances.
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, and other related officials in order to inform policy decisions and evaluate the agency’s major programs?
  • The Evaluation Officer (EO), Chief Data Officer (CDO), and Statistical Official (SO) meet monthly to ensure ongoing coordination of Evidence Act work. Each leader, or their designee, also participates in the Performance Improvement Officer’s (PIO) Strategic Planning and Review process. In FY20, the CDO is the owner of Goal 3 in ED’s strategic plan: “Strengthen the quality, accessibility, and use of education data through better management, increased privacy protections and transparency.” Leaders of the three embedded objectives come from OCDO, OCIO, and NCES.
  • The Evidence Leadership Group (ELG) supports program staff who run evidence-based grant competitions and monitor evidence-based grant projects. It advises ED leadership and staff on how evidence can be used to improve ED programs and provides support to staff in the use of evidence. It is co-chaired by the Evaluation Officer and the OPEPD Director of Grants Policy. Both co-chairs sit on ED’s Policy Committee (described below). The SO, EO, CDO, and PIO are ex officio members of the ELG.
  • The ED Data Governance Board (DGB) sponsors agency-wide actions to develop an open data culture, and works to improve ED’s capacity to leverage data as a strategic asset for evidence building and operational decisions, including developing the capacity of data professionals in program offices. It is chaired by the CDO, with the SO, EO, and PIO as ex officio members.
  • The ED CDO sits in OPEPD, and the Evaluation Officer and Statistical Official sit in the Institute of Education Sciences (IES). Both OPEPD and IES participate in monthly Policy Committee meetings, which often address evidence-related topics. OPEPD manages the Secretary’s policy priorities including evidence, while IES is focused on (a) bringing extant evidence to policy conversations and (b) suggesting how evidence can be built as part of policy initiatives. OPEPD plays a leading role in forming ED’s policy positions as expressed through annual budget requests and grant competition priorities, including the use of evidence. Both OPEPD and IES provide technical assistance to Congress to ensure evidence appropriately informs policy design.
Score
9
U.S. Dept. of Housing & Urban Development
1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer (or equivalent)? (Example: Evidence Act 313)
  • The General Deputy Assistant Secretary of the Office of Policy Development & Research (PD&R) serves as the Department of Housing and Urban Development (HUD) evaluation officer. PD&R is led by an Assistant Secretary and the career General Deputy Assistant Secretary; it comprises six offices and 153 staff, including a team of field economists in HUD’s 10 regional offices, with a budget of $98 million in FY20. The Assistant Secretary and Evaluation Officer ensures that evidence informs policy development through frequent personal engagement with other principal staff, the Secretary, and external policy officials, including consultation with Congress, speeches to policy audiences, sponsorship of public research briefings, and policy implications memoranda.
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s Chief Data Officer (or equivalent)? (Example: Evidence Act 202(e))
  • The Chief Technology Officer in the Office of the Chief Information Officer serves as the acting Chief Data Officer for HUD. The FY21 Budget requested funding to stand up the CDO’s office with 13 staff. The PD&R General Deputy Assistant Secretary and Statistical Official are responsible for numerous data infrastructure functions, such as the collection and analysis of national housing market data (including survey collaborations with the Census Bureau); developing income limits and factors to support program operations; advising and assisting program offices with the development and analysis of administrative data collections; and supporting data linkages and developing open data products from administrative data, including geospatial data products that are crucial for addressing housing and urban development policy challenges.
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, and other related officials in order to inform policy decisions and evaluate the agency’s major programs?
  • HUD has engaged and coordinated its evidence-building efforts within the Department; in FY20 these included developing HUD’s learning agenda and conducting the first agency-wide assessment of evidence-building capacity. In FY21, HUD will focus on establishing an enterprise data governance model, including a data governance board of key decision-makers from across the agency, among them the Evaluation Officer, Chief Data Officer, Statistical Official, and Performance Improvement Officer. HUD’s enterprise data governance model will bring together evaluation, statistical, performance, and data activities and focus on growing the agency’s evidence-based practices in order to improve HUD’s organizational performance.
Score
9
U.S. Department of Labor
1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer (or equivalent)? (Example: Evidence Act 313)
  • The Chief Evaluation Officer serves as the U.S. Department of Labor (DOL) evaluation officer. The Chief Evaluation Officer oversees DOL’s Chief Evaluation Office (CEO), housed within the Office of the Assistant Secretary for Policy (OASP), and the coordination of Department-wide evaluations, including office staff and leadership to interpret research and evaluation findings and to identify their implications for programmatic and policy decisions.
  • CEO is directly appropriated $8.04 million and may then receive up to 0.75% from statutorily specified program accounts, at the discretion of the Secretary. In FY19, that set-aside was 0.03% of funds, or $3.3 million, bringing the spending total to $11.34 million. The FY20 figure is not yet known because the Secretary has not determined the set-aside amount.
  • CEO includes nine full-time staff plus a small number of contractors and one to two detailees at any given time. This staff level is augmented by staff from research and evaluation units in other DOL agencies, such as the Employment and Training Administration (ETA), which has nine FTEs dedicated to research and evaluation activities and with which CEO coordinates extensively on the development of a learning agenda, management of studies, and dissemination of results.
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s Chief Data Officer (or equivalent)? (Example: Evidence Act 202(e))
  • The Department has designated a Chief Data Officer. The Chief Data Officer chairs DOL’s data governance body, and leads data governance efforts, open data efforts, and associated efforts to collect, manage, and utilize data in a manner that best supports its use to inform program administration and foster data-informed decision-making and policymaking.
  • DOL has arranged for temporary staffing to support governance and open data efforts as well as compliance with the Evidence Act, the Federal Data Strategy, and DOL’s data governance goals. DOL is in the process of hiring permanent staff to support the office through customized position descriptions.
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, and other related officials in order to inform policy decisions and evaluate the agency’s major programs?
  • DOL, through a Secretary’s Order, has created a structure that coordinates and leverages the important roles within the organization to accomplish objectives like those in the Evidence Act. The Secretary’s Order mandates collaboration between the Chief Data Officer, the Chief Performance Officer, Chief Evaluation Officer, Chief Information Officer, and Chief Statistical Officer.
  • The Secretary’s Order mandates a collaborative approach to reviewing IT infrastructure and data asset accessibility, developing modern solutions for managing, disseminating and generating data, coordinating statistical functions, supporting evaluation, research and evidence generation, and supporting all aspects of performance management including assurances that data are fit for purpose.
  • DOL continues to leverage existing governance structures: the Chief Evaluation Officer continues to play a role in the formation of the annual budget requests of DOL’s agencies, makes recommendations about including evidence in grant competitions, and provides technical assistance to Department leadership to ensure that evidence informs policy design. A number of mechanisms facilitate this: the Chief Evaluation Officer traditionally participates in quarterly performance meetings with DOL leadership and the Performance Management Center (PMC); the Chief Evaluation Officer reviews agency operating plans and works with agencies and the PMC to coordinate performance targets, measures, and evaluation findings; quarterly meetings are held with agency leadership and staff as part of the Learning Agenda process; and meetings are held as needed to strategize around addressing new priorities or legislative requirements.
Score
9
Millennium Challenge Corporation
1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer (or equivalent)? (Example: Evidence Act 313)
  • The Monitoring and Evaluation (M&E) Managing Director serves as the Millennium Challenge Corporation’s (MCC) Evaluation Officer. The Managing Director is a career civil service position with the authority to execute M&E’s budget, an estimated $17.4 million in due diligence funds in FY20, with a staff of 30 people. In accordance with the Foundations for Evidence-Based Policymaking Act, MCC designated an Evaluation Officer.
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s Chief Data Officer (or equivalent)? (Example: Evidence Act 202(e))
  • The Director of Product Management in the Office of the Chief Information Officer is MCC’s Chief Data Officer. The Chief Data Officer manages a staff of eight and an estimated FY20 budget of $1.5 million in administrative funds. In accordance with the Foundations for Evidence-Based Policymaking Act, MCC designated a Chief Data Officer.
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, and other related officials in order to inform policy decisions and evaluate the agency’s major programs?
  • The MCC Evaluation Management Committee (EMC) oversees decision-making, integration, and quality control of the agency’s evaluation and programmatic decision-making in accordance with the Foundations for Evidence-Based Policymaking Act. The EMC integrates evaluation with program design and implementation to ensure that evaluations are designed and implemented in a manner that increases their utility to MCC, in-country stakeholders, and external stakeholders. The EMC includes the agency’s Evaluation Officer, Chief Data Officer, representatives from M&E, the project lead, sector specialists, the economist, and gender and environmental safeguards staff. For each evaluation, the EMC has between 11 and 16 meetings or touchpoints, from evaluation scope of work to final evaluation publication. The EMC plays a key role in coordinating MCC’s Evidence Act implementation.
Score
6
Substance Abuse and Mental Health Services Administration (HHS)
1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer (or equivalent)? (Example: Evidence Act 313)
  • The director of the Substance Abuse and Mental Health Services Administration’s (SAMHSA) Center for Behavioral Health Statistics and Quality (CBHSQ) Office of Evaluation serves as the agency’s evaluation lead with key evaluation staff housed in this division. According to the SAMHSA website: “The Office of Evaluation is responsible for providing centralized planning and management of program evaluation across SAMHSA in partnership with program originating Centers.” SAMHSA evaluations are funded from program funds that are used for service grants, technical assistance, and for evaluation activities. Evaluations have also been funded from recycled funds from grants or other contract activities, as described in the FY21 Congressional Justification.
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s Chief Data Officer (or equivalent)? (Example: Evidence Act 202(e))
  • CBHSQ, led by its Director, designs and carries out special data collection and analytic projects to examine issues for SAMHSA and other federal agencies and is the government’s lead agency for behavioral health statistics, as designated by the Office of Management and Budget.
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, and other related officials in order to inform policy decisions and evaluate the agency’s major programs?
  • The SAMHSA website states: “The Office of Evaluation is responsible for providing centralized planning and management of program evaluation across SAMHSA in partnership with program originating Centers, providing oversight and management of agency quality improvement and performance management activities and for advancing agency goals and objectives related to program evaluation, performance measurement, and quality improvement.” The Evaluation Office describes 10 areas of support it provides to the Centers, including:
    1. Develops evaluation language for Requests for Proposals (RFPs), Requests for Applications (RFAs), and other funding announcements to ensure a clear statement of evaluation expectations in the announcements;
    2. Develops and implements standard measures for evaluating program performance and improvement of services;
    3. Manages the design of SAMHSA program evaluations in collaboration with the relevant Center(s);
    4. Monitors evaluation contracts to ensure implementation of planned evaluation and provides early feedback regarding program start-up for use in agency decision-making;
    5. Works collaboratively with the National Mental Health and Substance Use Policy Laboratory to provide support for SAMHSA evaluations;
    6. Oversees the identification of a set of performance indicators to monitor each SAMHSA program in collaboration with program staff and the development of periodic program profiles for use in agency planning, program change, and reporting to departmental and external organizations;
    7. Provides collaboration, guidance, and systematic feedback on SAMHSA’s programmatic investments to support the agency’s policy and program decisions;
    8. Analyzes data in support of agency needs and develops evaluation and performance-related reports in response to internal and external requests;
    9. Utilizes SAMHSA’s Performance Accountability and Reporting System (SPARS), which serves as a mechanism for the collection of performance data from agency grantees; and
    10. Responds to agency and departmental requests for performance measurement data and information, and conducts a range of analytic and support activities to promote the use of performance data and information in the monitoring and management of agency programs and initiatives.
  • While evaluation authority, staff, and resources are decentralized and found throughout the agency, SAMHSA is composed of four Centers: the Center for Mental Health Services (CMHS), the Center for Substance Abuse Treatment (CSAT), the Center for Substance Abuse Prevention (CSAP), and the Center for Behavioral Health Statistics and Quality (CBHSQ).
  • As such, CMHS, CSAT, and CSAP oversee grantee portfolios and evaluations of those portfolios with the support of the Office of Evaluation. Evaluation decisions within SAMHSA are made within each Center specific to its program priorities and resources. Each of the three program Centers uses its program funds to conduct evaluations of varying types. CBHSQ, SAMHSA’s research arm, provides varying levels of oversight and guidance to the Centers for evaluation activities. CBHSQ also provides technical assistance related to data collection and analysis to assist in the development of evaluation tools and clearance packages.