2020 Federal Standard of Excellence


Performance Management / Continuous Improvement

Did the agency implement a performance management system with outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY20? (Example: Performance stat systems, frequent outcomes-focused data-informed meetings)

Score
6
Administration for Children and Families (HHS)
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • ACF was an active participant in the development of the FY 2018-2022 HHS Strategic Plan, which includes several ACF-specific objectives. ACF regularly reports on progress toward those objectives as part of the FY 2020 HHS Annual Performance Plan/Report, including the ten performance measures from ACF programs that support the Plan. ACF performance measures primarily support Goal Three: “Strengthen the Economic and Social Well-Being of Americans Across the Lifespan.” ACF supports Objective 3.1 (Encourage self-sufficiency and personal responsibility, and eliminate barriers to economic opportunity), Objective 3.2 (Safeguard the public against preventable injuries and violence or their results), and Objective 3.3 (Support strong families and healthy marriage, and prepare children and youth for healthy, productive lives) by reporting annual performance measures. ACF also participates in the HHS Strategic Review process, an annual assessment of progress on the ten performance measures that ACF reports under the HHS Strategic Plan.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • OPRE currently reviews all ACF funding opportunity announcements and advises program offices, in accordance with their respective legislative authorities, on how to best integrate evidence into program design. Similarly, program offices have applied ACF research to inform their program administration. For example, ACF developed the Learn Innovate Improve (LI2) model — a systematic, evidence-informed approach to program improvement — which has since informed targeted TA efforts for the TANF program and the evaluation requirement for the child support demonstration grants.
  • ACF programs also regularly analyze and use data to improve performance. For example, two ACF programs (Health Profession Opportunity Grants & Healthy Marriage and Responsible Fatherhood programs) have developed advanced web-based management information systems (PAGES and nFORM, respectively) that are used to track grantee progress, produce real-time reports so that grantees can use their data to adapt their programs, and record grantee and participant data for research and evaluation purposes. 
  • ACF also uses the nFORM data to conduct the HMRF Compliance Assessment and Performance (CAPstone) Grantee Review: a process by which federal staff and technical assistance providers assess grantee progress toward and achievement in meeting programmatic, data, evaluation, and implementation goals. The results of the CAPstone process guide federal directives and future technical assistance. 
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
Score
7
Administration for Community Living (HHS)
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • As part of the U.S. Department of Health and Human Services Annual Performance Plan and Report, ACL reports on the following two HHS Agency Priority Goals: (1) Increase the success rate of the Protection and Advocacy Program’s individual or systemic advocacy, thereby advancing the right of individuals with developmental disabilities to receive appropriate community-based services, resulting in community integration and independence, and to have other rights enforced, retained, restored, and/or expanded; and (2) Improve the dementia capability of long-term support systems to create dementia-friendly, livable communities (lead agency: ACL). ACL’s strategy focuses on five pillars: supporting families and caregivers, protecting rights and preventing abuse, connecting people to resources, expanding employment opportunities, and strengthening the aging and disability networks. These pillars provide structure and focus for ACL’s work. ACL’s outcome measures are available, by program, in its annual Congressional Budget Justification, and include measures of program efficiency.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • ACL employs a moderate approach to analyzing evidence for ways to improve return on investment, one that addresses multiple parts of the agency. In FY20, as part of its ongoing effort to ensure that agency funds are used effectively, ACL funded a contract, focused on ACL’s Administration on Aging, to identify approaches to measure how and to what extent parts of the Aging Network leverage Older Americans Act funds to increase their available resources, as well as how the Aging Network uses resources to measure and improve the quality of services available or provided. NIDILRR conducts research as part of its new employment research agenda to continue developing return-on-investment models that Vocational Rehabilitation agencies can use to optimize the services they provide. In addition, in March 2020 ACL launched a Challenge Competition to spur development of the interoperable, statewide referral and analytics platforms needed to enable the types of partnerships between health care and community-based social services organizations that have been shown to improve health outcomes and lower costs.
  • In June 2020, ACL launched the MENTAL Health Challenge to create an online tool that connects socially isolated people to resources. In November 2020, ACL launched two more competitions. The Inventive Solutions to Address the Direct Support Professional Crisis challenge aims to improve the overall quality of home- and community-based services (HCBS) for individuals with intellectual and developmental disabilities (ID/DD). The Disability Employment Challenge sought innovative models that can be shared to help businesses across the country reach a wider talent pool and to create more employment opportunities for people with disabilities. The goal of all these prize competitions is to encourage effective and efficient methods for meeting ACL’s mission and improving services to its target populations.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • As part of ACL’s performance strategy and learning agenda approach, staff from ACL’s Office of Performance and Evaluation (OPE) present performance data to ACL leadership several times a year. In addition, ACL leadership reviews performance data as part of the budget justification process that informs program funding decisions. OPE staff conduct annual meetings with ACL staff to report performance measure data and results and to discuss methods for incorporating performance and evaluation findings into funding and operational decision-making. As part of annual evaluation planning efforts, OPE staff consult with ACL center directors to identify evaluation priorities and review proposed evaluation approaches, ensuring that the evaluation questions identified will provide information useful for program improvement.
Score
10
U.S. Agency for International Development
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • Most of USAID’s innovation or co-created programs, and those done in partnership, reflect a data-driven “pay for results” model, in which milestones are agreed upon by all parties and payments are made when milestones are achieved. This means that, for some programs, if a milestone is unmet, funds may be reapplied to an innovation or intervention that is achieving results. This rapid and iterative performance model means that USAID more quickly understands what is not working and can move resources away from it and toward what is working.
  • Approaches such as prizes, Grand Challenges, and ventures can also be constructed to be “pay for results only,” with interventions such as “Development Impact Bonds” used so that USAID pays only for outcomes, not for inputs or attempts. The Agency believes this model will pave the way for much of USAID’s work to be aligned with a “pay for results” approach. USAID is also piloting the use of the impact per dollar of cash transfers as a minimum standard of cost-effectiveness for applicable program designs. Most innovations funded at USAID have a clear “cost per impact” ratio.
  • Additionally, USAID Missions develop Country Development Cooperation Strategies (CDCSs) with clear goals and objectives and a Performance Management Plan (PMP) that identifies expected results, performance indicators to measure those results, plans for data collection and analysis, and regular review of performance measures to use data and evidence to adapt programs for improved outcomes. USAID also promotes data-informed operations performance management to ensure that the Agency achieves its development objectives and aligns resources with priorities. USAID uses its Management Operations Council to conduct an annual Strategic Review of progress toward achieving the strategic objectives in the Agency’s strategic plan. 
  • To improve linkages and break down silos, USAID continues to develop and pilot the Development Information Solution (DIS)—an enterprise-wide management information system that will enable USAID to collect, manage, and visualize performance data across units, along with budget and procurement information, to more efficiently manage and execute programming. Several USAID field missions are testing the system prior to worldwide deployment: El Salvador, Peru, Rwanda, Ethiopia, South Africa, Vietnam, and Nepal.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • USAID’s Program Cycle policy (ADS 201.3.2.18) requires that Missions conduct at least one portfolio review per year that focuses on progress toward strategy-level results. Missions must also conduct a CDCS mid-course stocktaking at least once during the course of implementing their Country Development Cooperation Strategy, which typically spans five years.
  • USAID developed an approach to explicitly ensure adaptation through learning called Collaborating, Learning, and Adapting (CLA). It is incorporated into USAID’s Program Cycle guidance (ADS 201.3.5.19), which states: “Strategic collaboration, continuous learning, and adaptive management link together all components of the Program Cycle.” Through CLA, USAID ensures its programming is coordinated with others, grounded in a strong evidence base, and iteratively adapted to remain relevant throughout implementation.
  • In addition to this focus through its programming, USAID has two senior bodies that oversee Enterprise Risk Management and meet regularly to improve the accountability and effectiveness of USAID programs and operations through holistic risk management. USAID tracks progress toward strategic goals and annual performance goals during data-driven reviews at Management Operations Council meetings. Also, through input from the Management Operations Council, an annual Agency-wide customer service survey, and other analysis, USAID regularly identifies opportunities for operational improvements at all levels of the Agency as part of its operational learning agenda as well as the agency-wide learning agenda, the Self-Reliance Learning Agenda (SRLA). SRLA questions 8, 9, 12, and 13 focus on operational aspects of the agency’s work that influence internal policy, design and procurement processes, program measurement, and staff training.
Score
4
AmeriCorps
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • AmeriCorps continued to implement its Transformation and Sustainability Plan (TSP) in FY20. This plan aims to ensure AmeriCorps is maximizing its resources to achieve results. One of the key components of the TSP is creating a new regional office structure that relies on a new grant management model. The agency is conducting a process evaluation/rapid cycle assessment for each implementation phase of this aspect of the TSP. A staff survey was conducted and findings were used to improve onboarding, orientation, and training processes as well as training content. A focus group was conducted with leadership from the first phase of implementation and findings were used to inform and improve management practices. 
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • AmeriCorps started the fiscal year with internal acquisition planning and budget formulation meetings that asked each office to identify in their budget proposals how evidence-based activities and evidence-building activities would be prioritized. All program offices are using data/evidence to improve performance. For example:
    • AmeriCorps launched a grant management tool (the “Portfolio Navigator”) that allows Portfolio Managers to access data about grantee organizations in real time to facilitate improved oversight and support.
    • AmeriCorps NCCC invested in a Service Project Database that provides staff access to data on all NCCC projects completed since 2012. The database thematically organizes projects, classifies project frameworks, and categorizes the outcomes of these service initiatives. NCCC is also investing in an evaluation of its impact. This research project, initiated in FY18, focuses on member retention, how NCCC develops leadership skills in its members and teams, and the program’s ability to strengthen communities. Finally, NCCC will continue to invest in research grants to better understand the outcomes of its disaster response efforts.
    • NCCC made significant strides in improving the utility of the data it gathers regularly for continuous improvement.
      • Five years of service project data were digitized. A sample of projects was coded for outcomes achieved and these codes were applied to over 5,000 service projects completed to date to improve future project development and better assess community outcomes. The final analysis will be completed this fiscal year.
      • The Disaster Services Unit converted data collected from State Service Commissions into state-specific scorecards to illustrate readiness to respond to disasters and guide improvement efforts. 
      • The Disaster Services Unit is using a new data collection instrument to improve the agency’s situational awareness of how State Service Commissions and national service programs are responding to COVID-19, with the goal of using this information to improve internal and external (e.g., with FEMA) coordination and communication in responding to the pandemic. This unit provides weekly metrics on the number of national service members focused on various COVID-19-related activities (e.g., food distribution, wellness checks, online learning support), as well as the estimated numbers served, as part of a standing Senior Leadership Briefing document.
    • AmeriCorps and the State of Colorado, in partnership with local public health authorities, are building a statewide corps of contact tracers to contain the spread of the novel coronavirus as the state gradually reopens. Approximately 800 NCCC members, VISTA Summer Associates, and Senior Corps RSVP volunteers will be trained as contact tracers. The agency will assess the successes and challenges of this partnership and project with the goal of sharing lessons learned with other states as a promising role for national service in addressing the pandemic.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • The Office of the Chief Financial Officer meets quarterly to assess progress toward the goals of its performance plan. The plan includes strategic objectives, strategies for achieving the objectives, milestones, measures, and targets. Quarterly meetings are used to discuss actuals versus targets and identify promising practices used to achieve targets as well as areas for better optimizing the delivery of budget, procurement, grants, and financial management.
  • AmeriCorps Office of the Chief Risk Officer piloted a grantee risk assessment tool and is coordinating with the agency’s new Office of Monitoring to establish an improved, more data-driven business process that better supports the new grant management model (e.g., delineating the roles of ensuring grantee compliance and providing grantees with programmatic coaching between distinct offices and positions within those offices). This tool was used to select a pool of grantees for monitoring in FY20. These monitoring activities are underway and are expected to increase compliance and improve performance during the grant life cycle.
  • Performance management and continuous improvement take place primarily at the grantee management level and at the agency level. For the former, portfolio managers use various tools to ensure performance is on track and opportunities for continuous improvement are in place as needed. The Portfolio Navigator, federal financial reports (FFRs), and progress reports are used in conjunction with regular check-ins and, on occasion, site visits. At the agency level, evidence-building activities led by the Office of Research and Evaluation (ORE) are the primary mechanism for informing project development/innovation and improvement in grant making. The agency is in the process of strengthening its own systems and practices for establishing and managing office-level performance.
  • Over the past decade AmeriCorps and its grantees have invested significant resources in evaluating different agency programs and supported program models designed to improve a range of outcomes for national service members and volunteers, children, families, organizations, and communities. These investments have established the evidence base for both the effectiveness of the interventions implemented by its grantees, and more generally for the impact of national service. As this body of evidence has emerged, ORE has broadened its perspective to include innovative methodologies to assess the impact of its national service programs. One methodology to extend the measurement of impact involves rigorously assessing and estimating a return on agency investments. Initial FY20 findings are promising and positive. Final findings will be available at the end of the fiscal year. The agency is in the process of procuring a contract to support annual, targeted return on investment analyses for the next five years.
  • The agency’s emphasis on evidence is meant to support, not inhibit, innovation, improvement, and learning. The intent is to integrate the use of evidence and opportunities for further learning into all activities. Where an evidence base is lacking, evidence will be developed through systematic analysis. Where evidence exists, it will be used to encourage replication and expansion of effective solutions. As a learning organization, AmeriCorps uses many types of evidence and understands that a culture of continual improvement relies on multiple sources of information. AmeriCorps plans to procure a contract in FY20 to design and implement program evaluations with different purposes and uses, such as program development, improvement, accountability, or replicability. Furthermore, to leverage limited evaluation resources and build the evaluation capacity of the national service field, the contractor will plan for the collaborative development of multi-site evaluations that pool (or “bundle”) AmeriCorps grantees delivering the same or similar programs, targeting the same or similar outcomes, and sharing a program evaluation purpose.
Score
8
U.S. Department of Education
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • ED’s FY18-22 Strategic Plan includes two parallel objectives, one for P-12 and one for higher education (Strategic Objectives 1.4 and 2.2, respectively), that focus on supporting agencies and educational institutions in the identification and use of evidence-based strategies and practices. The OPEPD ELG co-chair is responsible for both strategic objectives.
  • The Department’s Annual Performance Reports (covering the most recent fiscal year) and Annual Performance Plans (covering the upcoming fiscal year) are available on ED’s website. These include the FY19 Annual Performance Report and the FY21 Annual Performance Plan, which contains FY19 performance results as well as planned targets for FY20, FY21, and FY22. In FY20, ED published new Agency Priority Goals on performance.gov emphasizing (1) education freedom, (2) multiple pathways to student success, (3) federal student aid customer service, (4) student privacy and cybersecurity, and (5) regulatory reform.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • The Grants Policy Office in the Office of Planning, Evaluation and Policy Development (OPEPD) works with offices across ED to ensure alignment with the Secretary’s priorities, including evidence-based practices. The Grants Policy Office looks at where ED and the field can continuously improve by building stronger evidence, making decisions based on a clear understanding of the available evidence, and disseminating evidence to decision makers. Specific activities include: strengthening the connection between the Secretary’s policies and grant implementation from design through evaluation; supporting a culture of evidence-based practices; providing guidance to grant-making offices on how to integrate evidence into program design; and identifying opportunities where ED and the field can improve by building, understanding, and using evidence. During the past year, the Grants Policy Office has collaborated with offices across the Department on a variety of activities, including reviews of efforts used to determine grantee performance.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • The Department conducted after-action reviews after the FY 2019 competition cycle to reflect on successes of the year as well as opportunities for improvement. The reviews resulted in process updates for FY 2020. In addition, the Department updated an optional internal tool to inform policy deliberations and progress on the Secretary’s policy priorities, including the use of evidence and data.
Score
9
U.S. Dept. of Housing & Urban Development
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • HUD uses data and evidence extensively to improve outcomes and return on investment. The primary means are investments by HUD’s Office of Policy Development and Research (PD&R) in data collection, program demonstrations and evaluations, and research guided by a multi-year learning agenda; HUD’s extensive use of outcome-oriented performance metrics in the Annual Performance Plan; and senior staff oversight and monitoring of key outcomes and initiatives through the Prescription for HUD, the Advancing Economic Opportunity Task Force, and the Agency-Wide Integrity Task Force, which bring together senior staff for quarterly performance management meetings.
  • In 2019, HUD expanded the Standards for Success data collection and reporting framework for discretionary grant programs to cover Resident Opportunities and Self-Sufficiency Service Coordinator (ROSS) grants, Multifamily Housing Service Coordinator grants, and Multifamily Housing Budget-Based Service Coordinator Sites. The framework supports better outcomes by providing a more standardized performance measurement framework, better alignment with Departmental strategies, and more granular reporting to support analytics.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
Score
10
U.S. Department of Labor
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • Using a performance and budget reporting and dashboard system linked to component agencies’ annual operating plans, DOL’s Performance Management Center (PMC) coordinates quarterly reviews, led by the Deputy Secretary, of each agency’s program performance to analyze progress and identify opportunities for performance improvements. Learning agendas, updated annually by DOL agencies in collaboration with DOL’s Chief Evaluation Office (CEO), include program performance themes and priorities for analysis needed to refine performance measures and identify strategies for improving performance. The annual Strategic Reviews with leadership include specific discussions about improving performance and findings from recent evaluations that suggest opportunities for improvement.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • DOL’s performance reporting and dashboard system supports quarterly reviews of each agency’s program performance by the Deputy Secretary to analyze progress and identify opportunities for performance improvements. These performance reviews connect to DOL’s broader performance and evaluation activities. DOL’s OCIO developed a new dashboard last year for agency leadership use only – the CXO Dashboard – which provides instant access to key administrative data so leaders can interactively assess progress on performance and make data-driven decisions.
  • DOL leverages a variety of continuous learning tools, including the learning agenda approach to conceptualize and make progress on substantive learning goals for the agency, as well as PMC’s Continuous Process Improvement (CPI) Program, which supports agencies in efforts to gain operational efficiencies and improve performance. The program directs customized process improvement projects throughout the department and grows the cadre of CPI practitioners through Lean Six Sigma training.
Score
5
Millennium Challenge Corporation
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • MCC is committed to using high-quality data and evidence to drive its strategic planning and program decisions. The Monitoring and Evaluation plans for all programs and tables of key performance indicators for all projects are available online by compact and threshold program and by sector, for use by both partner countries and the general public. Prior to investment, MCC performs a cost-benefit analysis to assess the potential impact of each project and estimates an Economic Rate of Return (ERR). MCC uses a 10% ERR hurdle to prioritize and fund the projects with the greatest opportunity for maximizing impact. MCC then recalculates ERRs at investment closeout, drawing on information from MCC’s monitoring data (among other data and evidence), to test original assumptions and assess the cost-effectiveness of MCC programs. This year, MCC has also pushed to undertake and publish evaluation-based ERRs: as part of the independent evaluation, the evaluators analyze the MCC-produced ERR five or more years after investment close to understand if and how benefits actually accrued. These evaluation-based ERRs add to the evidence base by illuminating the long-term effects and sustainable impact of MCC’s programs. (An illustrative sketch of the ERR hurdle arithmetic appears after these bullets.)
  • In addition, MCC produces periodic reports that capture the results of MCC’s learning efforts in specific sectors and translate that learning into actionable evidence for future programming. In FY20, MCC produced two new Principles into Practice reports. The first, Training Service Delivery for Jobs and Productivity, compiles evidence and learning from MCC’s technical and vocational education and training activities in the education sector. The second, Lessons from Evaluations of MCC Water, Sanitation, and Hygiene Programs, which MCC is finalizing, captures its learning in the water, sanitation, and hygiene sector.
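  • For illustration only: the ERR hurdle described above is, at its core, a discounted cash-flow calculation. The short Python sketch below uses hypothetical net-benefit figures (not MCC data), and its function names and bisection approach are assumptions for illustration rather than MCC’s actual economic model.

        def npv(rate, cash_flows):
            """Net present value of a stream of annual net benefits (year 0 first)."""
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

        def err(cash_flows, lo=0.0, hi=1.0, tol=1e-6):
            """Economic rate of return: the discount rate at which NPV is zero,
            found by bisection (assumes NPV is positive at lo and negative at hi)."""
            for _ in range(100):
                mid = (lo + hi) / 2
                if npv(mid, cash_flows) > 0:
                    lo = mid
                else:
                    hi = mid
                if hi - lo < tol:
                    break
            return (lo + hi) / 2

        # Hypothetical project: $100M cost up front, then rising annual net benefits.
        flows = [-100, 10, 20, 30, 40, 50]
        rate = err(flows)
        print(f"Estimated ERR: {rate:.1%}")
        print("Passes 10% hurdle" if rate >= 0.10 else "Below 10% hurdle")

    In this hypothetical case the ERR works out to roughly 12%, so the project would clear the 10% hurdle; a stream of smaller net benefits would fall below it and be deprioritized.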
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • MCC continues to implement and expand a new reporting system that enhances MCC’s credibility around results, transparency, learning, and accountability. The Star Report and its associated quarterly business process capture key information to provide a framework for results and improve the ability to promote and disseminate learning and evidence throughout the compact and threshold program lifecycle. For each compact and threshold program, evidence is collected on performance indicators, evaluation results, partnerships, sustainability efforts, and learning, among other elements. Critically, this information is available in a single report after each program ends: each country will have a Star Report published roughly seven months after completion.
  • Continual learning and improvement are key aspects of MCC’s operating model. MCC monitors progress toward compact and threshold program results on a quarterly basis using performance indicators specified in the Monitoring and Evaluation (M&E) Plan for each country’s investments. The M&E Plans specify indicators at all levels (process, output, and outcome) so that progress toward final results can be tracked. Every quarter, each partner country submits an Indicator Tracking Table that shows actual performance on each indicator relative to the baseline established before the activity began and the performance targets established in the M&E Plan. Key performance indicators and their accompanying data by country are updated every quarter and published online. MCC management and the relevant country team review these data in a formal Quarterly Performance Review meeting to assess whether results are being achieved and integrate this information into project management and implementation decisions.
  • Also in FY20, MCC is producing and publishing a new product called MCC Sector Packages. For each sector in which MCC works, MCC will have a one-stop, interactive repository of sector-level common indicators, research questions, evaluation findings, and applied learnings. These documents will also show how past evidence is being used in developing new investments.
Score
6
Substance Abuse and Mental Health Services Administration (HHS)
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • The SAMHSA Strategic Plan FY2019-FY2023 outlines five priority areas with goals and measurable objectives to carry out the vision and mission of SAMHSA. For each priority area, an overarching goal and series of measurable objectives are described followed by examples of key performance and outcome measures SAMHSA will use to track progress. 
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • According to the SAMHSA website, the Office of Evaluation “oversees the identification of a set of performance indicators to monitor each SAMHSA program in collaboration with program staff and the development of periodic program profiles for use in agency planning, program change, and reporting to departmental and external organizations” and “utilizes SAMHSA’s Performance Accountability and Reporting System (SPARS) which serves as a mechanism for the collection of performance data from agency grantees.”
  • According to the FY2019-FY2023 Strategic Plan (pp. 21-22), SAMHSA will modernize the Performance Accountability and Reporting System (SPARS) by 1) capturing real-time data for discretionary grant programs in order to monitor their progress, impact, and effectiveness, and 2) developing benchmarks and disseminating annual Performance Evaluation Reports for all SAMHSA discretionary grant programs.
  • The Centers have historically managed internal performance review boards to periodically review grantee performance and provide corrective actions as needed. The SAMHSA website states that the Office of Evaluation is charged with “providing oversight and management of agency quality improvement and performance management activities and for advancing agency goals and objectives related to program evaluation, performance measurement, and quality improvement.”
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • As described on the SAMHSA website, the Office of Evaluation supports continuous improvement and learning in several ways:
    • Analyzes data in support of agency needs and develops evaluation and performance-related reports in response to internal and external requests;
    • Oversees the identification of a set of performance indicators to monitor each SAMHSA program in collaboration with program staff and the development of periodic program profiles for use in agency planning, program change, and reporting to departmental and external organizations;
    • Utilizes SAMHSA’s Performance Accountability and Reporting System (SPARS) which serves as a mechanism for the collection of performance data from agency grantees; and
    • Responds to agency and departmental requests for performance measurement data and information; and conducts a range of analytic and support activities to promote the use of performance data and information in the monitoring and management of agency programs and initiatives.
  • In 2016, SAMHSA’s Office of Financial Resources (OFR) established a Program Integrity Review Team (PIRT) staffed by representatives from each of its four Centers and managed by OFR. However, information about PIRT is no longer publicly available as of November 2020.