Evaluation of the Certified Community Behavioral Health Clinic Demonstration in Accordance with the Bipartisan Safer Communities Act
Supporting Statement – Section A
Program Officials/Project Officers
Judith Dey, Ph.D. and Laura Jacobus-Kantor, Ph.D.
U.S. Department of Health and Human Services
Office of the Assistant Secretary for Planning and Evaluation
200 Independence Avenue SW, Washington DC 20201
The Office of the Assistant Secretary for Planning and Evaluation (ASPE) at the U.S. Department of Health and Human Services (HHS) is requesting Office of Management and Budget (OMB) approval for qualitative and survey data collection activities to support the evaluation of the Certified Community Behavioral Health Clinic (CCBHC) demonstration program in accordance with the Bipartisan Safer Communities Act. Quantitative data collection activities with their respective OMB approval numbers are described further in the final paragraph of this section.
Section 223 of the Protecting Access to Medicare Act (Public Law 113-93; PAMA) authorized the Certified Community Behavioral Health Clinic (CCBHC) demonstration to allow states to test a different strategy for delivering and reimbursing a comprehensive array of services provided in community behavioral health clinics. The demonstration aims to improve the availability, quality, and outcomes of outpatient services provided in these clinics by establishing a standard definition for CCBHCs and developing a new Medicaid prospective payment system (PPS) in each state that accounts for the total cost of providing nine types of services to all people who seek care. The PPS in each state is designed to provide CCBHCs with the financial support and stability necessary to deliver these required services. The demonstration also aims to incentivize quality through quality bonus payments to clinics and requires CCBHCs to report quality measures and costs.
In December 2016, the U.S. Department of Health and Human Services (HHS) selected eight states to participate in the demonstration based on the ability of their CCBHCs to (1) provide the complete scope of services described in the certification criteria and (2) improve the availability of, access to, and engagement of clients with a range of services. The demonstration was originally authorized for two years and scheduled to end in July 2019, but Congress has extended it multiple times. In 2020, the Coronavirus Aid, Relief, and Economic Security Act (CARES Act; Public Law 116-136) authorized HHS to add two states to the demonstration from among 24 original planning grant states. As a result, in August 2020, HHS announced that Kentucky and Michigan would begin participating in the demonstration. The demonstration is currently authorized through September 2025 for the six remaining original states and for six years from the beginning of demonstration participation for Kentucky and Michigan. The Bipartisan Safer Communities Act (BSCA), enacted in June 2022, authorizes all states to apply to participate in the demonstration beginning in 2024 (Public Law No: 117-159). HHS awarded planning grants to 15 states in March 2023 to develop proposals to participate in the demonstration. HHS plans to award additional planning grants to states in 2025. Beginning July 1, 2024, and every two years thereafter, HHS can select up to 10 additional states to participate in the demonstration so that all states eventually have an opportunity to participate.
PAMA mandates that HHS submit reports to Congress about the Section 223 demonstration that assess (1) access to community-based mental health services under Medicaid in the area or areas of a state targeted by a demonstration program as compared to other areas of the state, (2) the quality and scope of services provided by certified community behavioral health clinics as compared to community-based mental health services provided in states not participating in a demonstration program and in areas of a demonstration state that are not participating in the demonstration, and (3) the impact of the demonstration on the federal and state costs of a full range of mental health services (including inpatient, emergency, and ambulatory services). The ability of ASPE to provide this information to Congress requires a rigorously designed and independent evaluation of the CCBHC demonstration.
In 2016, ASPE began a five-year mixed-methods evaluation of the first two years of the demonstration to address the PAMA requirements and describe implementation successes and challenges to inform annual reports to Congress. Data collection activities for the original evaluation were covered under a separate OMB approval (0990-0461). As the demonstration continued in the original states and expanded to others, ASPE has further evaluated the implementation and outcomes of the demonstration in accordance with PAMA, with data collection for the continued evaluation activities covered under OMB approval 0990-0485. The present information collection request covers additional data collection activities beginning in 2025, continuing data collection in the six remaining original demonstration states; in Kentucky and Michigan; and in two new cohorts of states that will join the demonstration in 2024 and 2026, respectively. The primary goals of the evaluation are to provide information to inform future reports to Congress and to further assess implementation of the demonstration in each state.
To learn about the implementation of the CCBHC program and address the topics of access, quality and scope of services, and costs as required by PAMA, the study team will use a mixed-methods approach to assess:
The structures and processes states and CCBHCs implemented to increase access to care, and the impact of these investments on access to care relative to care provided elsewhere in demonstration states;
Changes in the quality and scope of services and care coordination provided by CCBHCs as a result of the demonstration and how the scope of services provided to CCBHC clients compares with the services available in other service settings; and
The impact of the CCBHC demonstration on Medicaid services and their associated state and federal costs.
The qualitative and survey components of the evaluation are designed to help ASPE understand the services offered by CCBHCs; state and clinic perceptions of the costs of these services; the extent to which CCBHCs experience challenges with demonstration implementation; their experience reporting and using quality measures and improving quality of care; and the perspectives of state officials, CCBHCs, and people receiving services on the overall successes and challenges associated with the CCBHC model. The evaluation will also include an impact study that examines changes in service use and costs of serving CCBHC clients relative to comparison groups and examines impacts of the CCBHC demonstration on access and quality. The evaluation will address specific questions regarding costs, quality, and impacts through analysis of Medicaid claims data obtained from CMS; cost reports that states will submit to CMS under a separate OMB approval (OMB 0938-1148, CMS 10398); quality measures that states are required to submit to SAMHSA under a separate OMB approval (OMB 0938-1148, CMS 10398); data from the National Substance Use and Mental Health Services Survey collected under a separate OMB approval (0930-0386); and CMS-64 forms states submit to CMS under a separate OMB approval (0938-1265). This submission therefore focuses solely on new primary qualitative and survey data collection efforts.
Section 223 of PAMA requires the Secretary of HHS to provide annual reports to Congress that include an assessment of access to community-based mental health services under Medicaid, the quality and scope of CCBHC services, and the impact of the demonstration on federal and state costs of a full range of mental health services. ASPE’s evaluations of the first six years of the CCBHC demonstration informed past reports to Congress. The data collected under this submission will help ASPE further address the PAMA topics and new questions that have emerged since the demonstration started to inform future reports to Congress. Each proposed data collection instrument is described below, along with how, by whom, and for what purpose the collected information will be used. Table A.1 provides additional detail about how the content areas in each data collection instrument will be used to address the PAMA requirements.
All of the primary data collection efforts for this evaluation will be virtual for several reasons. Virtual data collection is the most efficient method because resources that would otherwise cover travel expenses can be dedicated to data collection. It also provides additional flexibility in sampling, because there is no need to select CCBHCs that are located near one another to accommodate travel to a specific geographic region for site visits. Increased use of virtual platforms during the COVID-19 pandemic has improved respondents’ comfort with communicating virtually or by telephone and with navigating technological platforms. Finally, virtual participation may be more convenient for respondents, who can join interviews or focus groups from any location. The instruments ASPE is seeking approval for are as follows:
State official interview protocols. The evaluation team will use the interview protocol to conduct one interview per state per year with state officials to gather information about states’ progress at different stages of demonstration implementation. Each round of interviews will include state Medicaid and behavioral health officials and will have a slightly different focus, reflecting the stage of implementation at the time of the interviews.
To capitalize on the specific perspectives state leadership can offer, a key focus of these interviews in each year of data collection will be states’ demonstration plans, investments, and support as well as external policy and program context. Interviews conducted in late 2025 will include questions that align with the PAMA topic areas of access, quality, and scope of services. For example, for all states we will inquire about the types of care management and care coordination provided and changes in the quality of care over time.
Interviews conducted in late 2026 will also address questions from all PAMA areas (access, quality and scope of services, costs, and cross-cutting issues). The final round of interviews, conducted in late 2027, will focus largely on outcomes and final implementation experiences. For example, they will ask for state officials’ perceptions regarding CCBHCs’ impacts on the use of services, seek final reflections and updates on the steps CCBHCs took to increase access to care, and ask about CCBHCs’ ability to maintain the required services over time and about any challenges encountered.
CCBHC interview protocol. In evaluation years 3 and 5, the evaluation team will conduct virtual interviews with CCBHC leadership (the CCBHC project director/chief executive officer, the CCBHC medical director, and any other key leaders suggested by the CCBHC) at up to 15 CCBHCs per year. Interviews will include CCBHCs from a diverse group of states, encompassing the demonstration’s different payment models, organizational structures, and implementation contexts. Interviews will cover most evaluation questions, including those related to access, quality and scope of services, costs, and cross-cutting topics, and will gather reflections on the CCBHC model, including implementation successes and challenges, from those directly implementing the model and serving clients.
Client focus group protocol. The evaluation team will conduct up to four virtual focus groups with clients receiving CCBHC services toward the beginning of the fifth evaluation year (fall 2027). Each focus group will include up to eight clients for a total of approximately 32 clients. Recruitment efforts will target clients from the same CCBHCs that are included in the CCBHC leadership interviews. Recruiting clients from the same CCBHCs that participate in virtual interviews is an efficient way to help ensure this data collection activity also reflects a balance of CCBHC characteristics. The evaluation team will hear firsthand from clients regarding the PAMA topics—for example, respondents will reflect on whether they have perceived improvements in access to care and whether they are satisfied with their current access to care.
CCBHC survey. The CCBHC project director or other designated staff at each CCBHC will complete an online survey in the fourth and fifth evaluation years (fall 2026 and 2027). The survey will gather key information over time about clinics’ operations and how their structures, procedures, and services align with the CCBHC certification criteria and support the goals of the demonstration. For example, the survey will collect information about staffing, scope of services, and accessibility; use of health information technology; relationships with other providers for the purposes of service delivery and care coordination; and quality reporting and improvement activities. The survey will include structured fields to gather comparable information from each CCBHC, using prompts and preset response categories such as check boxes. The survey will include skip patterns to reduce burden on CCBHC respondents and improve consistency of data collection.
Table A.1. Data collection activities, by data source
| Data source | Mode, timing, and respondent | PAMA topic area (PTs) | Content | Analysis |
| Qualitative data sources | | | | |
| State official interviews | In Years 3, 4, and 5 of the evaluation, the evaluation team will conduct virtual interviews with state Medicaid and behavioral health officials. | PT1, PT2, PT3 | (1) Successes and challenges in improving access to care; (2) staff hiring and retention; (3) collaborations and relationships with other organizations; (4) scope of services and coordination of care; (5) quality of care; (6) reporting and use of data to improve quality; (7) cost, payment, and PPS; (8) community needs assessments; (9) sustainment/expansion activities | Descriptive analyses |
| CCBHC interviews | In Years 3 and 5 of the evaluation, the team will conduct virtual interviews with CCBHC leadership (CCBHC project director/chief executive officer, CCBHC medical director, and any other key leaders suggested by the CCBHC). | PT1, PT2, PT3 | (1) Access to care; (2) service use patterns; (3) scope of services; (4) collaborations and relationships with other organizations; (5) care management and care coordination; (6) quality of care; (7) reporting and use of data to improve quality; (8) cost, payment, and PPS; (9) state supports and policy initiatives; (10) implementation successes and challenges; (11) sustainability plans | Descriptive analyses |
| Focus groups with people receiving services | In Year 5 of the evaluation, the team will conduct focus groups with people receiving CCBHC services. | PT1, PT2 | (1) Access to care; (2) service use patterns; (3) scope of services; (4) care management and coordination; (5) quality of care | Descriptive analyses |
| CCBHC survey | In Years 4 and 5 of the evaluation, all CCBHCs will submit the CCBHC survey to the evaluation team. | PT1, PT2, PT3 | (1) CCBHC staffing; (2) accessibility; (3) care coordination; (4) scope of services; (5) data sharing, quality, and other reporting; (6) costs; (7) sustainability | Descriptive analyses |
The evaluation is expected to be completed in 2028. Table A.2 shows the schedule of data collection activities covered by this OMB request.
Table A.2. Timeline for the data collection
| Data source | Dates |
| State official interviews | Fall 2025, Fall 2026, and Fall 2027 |
| CCBHC interviews | Fall 2025 and Fall 2027 |
| CCBHC client focus groups | Fall 2027 |
| CCBHC survey | Fall 2026 and Fall 2027 |
CCBHCs will submit survey responses through Confirmit, a secure, online survey platform. Focus groups with people receiving CCBHC services will be held virtually using WebEx or Zoom in virtual meeting rooms. The evaluation team will monitor the attendance and participation of all individuals in the virtual meeting room during each focus group.
In formulating the evaluation design, ASPE has carefully considered ways to minimize burden by supplementing existing data sources with targeted primary data collection. To this end, the evaluation incorporates the following approach:
Using data from existing sources while conducting supplemental primary data collection: To the extent possible, information regarding demonstration implementation will be gathered through a review of available sources, including, for example, state demonstration and planning grant applications; Section 1115 demonstration waiver applications; the annual National Substance Use and Mental Health Services Survey; CMS-64 reports; cost reports and quality measures states will submit as part of demonstration requirements; and Medicaid claims. However, the level of detail and consistency of the information provided in these source documents and other data sources will likely vary from state to state and may not fully address the PAMA requirements. To supplement data gathered from these sources, ASPE is requesting OMB clearance to conduct virtual interviews with state officials and CCBHC staff, conduct an online survey, and hold virtual focus groups with CCBHC clients. The evaluation team will use the information gathered from virtual interviews to clarify and fill in gaps in the data gathered from the survey and document review. The team will conduct virtual interviews with clinic leadership in up to 30 CCBHCs to have in-depth discussions about access, quality and scope of services, costs, and cross-cutting topics. To minimize respondent burden, interview and survey questions for each data source will be tailored to reflect the expertise and insight offered by each respondent type.
The CCBHCs in the participating states vary in size, from small entities to large provider organizations. The qualitative data collection protocols have been designed to minimize burden on these entities and on people receiving CCBHC services who participate in focus groups. The evaluation team will make every effort to schedule virtual interviews and focus groups at the convenience of these respondents and participants. The evaluation team will request the minimum amount of information from CCBHCs that is required to evaluate the CCBHC demonstration effectively.
Each of the data sources provides information needed for the evaluation. If the data are not collected, the evaluation team will not have adequate information to address the PAMA requirements. The inclusion of all planned data sources is needed to obtain information about demonstration implementation and impacts on quality and costs.
CCBHC leadership interviews will take place only twice. If they are not conducted, the evaluation team will not have adequate information to evaluate whether implementation is consistent with PAMA or to ensure that the Secretary has the information necessary to report to Congress as mandated by PAMA. CCBHCs will submit surveys twice; repeated reports are needed to examine changes in access, scope of services, and other demonstration requirements over time. Similarly, the evaluation team will conduct interviews with state officials annually to understand the evolution of demonstration administration and investments across cohorts; implementation successes and challenges; and changes in access to care, costs, and quality of care over time. Lastly, it is essential to obtain information directly from the people receiving CCBHC services to understand how implementation of the model affects their access to care and experiences with care.
This information collection fully complies with 5 CFR 1320.5(d)(2).
This is a new data collection. The 60-day notice was published in the Federal Register on July 22, 2024 (Vol. 89, No. 140, pages 59121-59122). No comments were received.
If a CCBHC is willing to assist with the logistics of participant recruitment for client focus groups, the clinic will receive an honorarium of $1,000 for their assistance. Each client who participates in a focus group will receive a $50 gift card. State official and CCBHC staff participation in other data collection activities will be carried out in the course of their employment; no additional compensation will be provided outside of their normal pay.
The Privacy Act does not apply to this data collection. Participants will not be asked about, nor will they provide, individually identifiable health information.
Before the start of state official interviews, CCBHC interviews, and focus groups with people receiving services, the evaluation team will remind all respondents that the information gathered will be used for evaluation purposes only and will not be attributed to any individual. Responses should not contain private information, and they will be aggregated to the extent possible so that individual answers are not identifiable. Because of the limited number of respondents interviewed per state and CCBHC, however, it might be possible to infer individual responses from reports. (For example, there may be only one state Medicaid official participating per state. Similarly, for states with few CCBHCs, it may be possible to infer which CCBHCs were selected for interviews.) For each state official and CCBHC leader interviewed, the evaluation team will collect name, professional affiliation, and title, but not Social Security numbers, home contact information, or other information that could identify the respondent directly. The reports from the evaluation will not contain the names of respondents or the names of CCBHCs that participated in the interviews.
For people participating in the focus groups, the evaluation team will provide instructions for them to log into the technological platform in a way that ensures only their first name will be displayed. The team will only use first names during the focus group (or, in the case of groups with two members with the same first name, will only use the first initial of a participant’s last name). Prior to agreeing to participate in the focus group, clients will have reviewed and signed consent forms that explain and assure confidentiality. The evaluation team will revisit confidentiality when they cover the ground rules at the beginning of the focus group. Further, the team will emphasize that respondents should not share anything they hear during the discussion with anyone outside of the focus group. In order to provide gift cards and communicate with respondents, the evaluation team will gather email addresses. They will also obtain participants’ names and signatures as part of the consent process, but will not collect any other information that could identify the respondent directly.
Before each interview and focus group, the evaluation team will ask all respondents for permission to record the conversation, solely for the purpose of filling in any gaps in the research notes. Only the evaluation team will have access to the recording, which will be destroyed at the conclusion of the evaluation. If a respondent does not wish to have the interview recorded, the interviewer will take notes instead. The evaluation team will maintain recordings and interview and focus group notes in a secure electronic folder that only a minimal number of evaluation staff members may access.
Information will be kept private to the extent allowed by law.
The evaluation team will not ask state officials or CCBHC leaders any questions of a sensitive nature. However, they will ask them for their honest viewpoints on aspects of the demonstration that may or may not be working as planned. The evaluation team will assure them that answers to the interview questions will not be attributed to them in reports. All confidentiality and security procedures described in the previous section will apply to the information collected.
By definition, focus group respondents will be receiving care for mental health conditions and/or substance use disorders and will be asked to reflect on the care they receive to help manage these conditions. However, the evaluation team will not ask sensitive questions, such as questions about diagnoses or symptoms. Recruitment materials will give participants a sense of the topics to be covered and the steps the evaluation team will take to protect the confidentiality and security of the information shared, so respondents will be able to make an informed decision about participating based on their comfort levels at the outset. Additionally, many people receiving mental health and substance use disorder treatment participate in group therapies and are therefore accustomed to sharing information in group settings.
As part of an opening discussion of ground rules for the focus groups, the evaluation team will explain that 1) respondents do not need to answer any question they do not want to, 2) the evaluation team will not discuss the conversation with the organizations that provide care or services, so participation and responses will not have any effect on respondents’ care, 3) there are no “right” or “wrong” answers, and 4) while there are no formal breaks, respondents should feel free to get up any time they need to (among other ground rules). The ground rules should help ensure everyone feels comfortable and safe participating and help to reinforce the voluntary nature of participation throughout the entire conversation.
Table A.3 provides estimates of the average annual burden for collecting the proposed information. Below we provide details on the time and cost burdens for each of the separate data collection activities.
Interviews with state officials: The evaluation team will conduct semi-structured virtual interviews with state Medicaid and behavioral health officials in each demonstration state in three evaluation years:
Interview with state officials, each lasting ninety minutes (year three: 18 states x 3 officials x 90 minutes; year four: 28 states x 3 officials x 90 minutes; year five: 28 states x 3 officials x 90 minutes)
CCBHC interviews: The evaluation team will conduct virtual interviews with CCBHC staff in year three and five of the evaluation (fall 2025 and 2027) for up to 15 CCBHCs each year.
Interview with CEO/Medical Director and any other key leadership staff recommended by CCBHC, lasting ninety minutes (15 clinics x 2 respondents x 90 minutes x 2 interview years)
Focus groups with people receiving services: The evaluation team will conduct four virtual focus groups with up to eight people receiving CCBHC services in the fifth evaluation year (fall 2027).
Focus group with 8 clients receiving CCBHC services, lasting ninety minutes (4 focus groups x 8 clients x 90 minutes x 1 interview year)
CCBHC survey: The evaluation team will ask all CCBHCs that have participated in the demonstration for at least one year at the time of each survey’s fielding to participate in a brief survey in the fourth and fifth evaluation years.
Survey of CCBHCs, with one respondent per site, lasting approximately four hours (year four: 1 survey x 281 CCBHCs x 4 hours; year five: 1 survey x 411 CCBHCs x 4 hours)
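To illustrate how the per-year counts above relate to the annualized figures in Table A.3, the following is a minimal worked calculation. It assumes the annualized figures are simple three-year averages of the per-year counts; this averaging convention is our reading of the table rather than a statement drawn from it.

\[
\text{State official interviews: } \frac{18 + 28 + 28}{3} \approx 25 \text{ states} \times 3 \text{ officials} = 75 \text{ responses per year}; \quad 75 \times 1.5 \text{ hours} \approx 113 \text{ burden hours}
\]

\[
\text{CCBHC survey: } \frac{0 + 281 + 411}{3} \approx 231 \text{ clinics} \times 1 \text{ response} \times 4 \text{ hours} = 924 \text{ burden hours}
\]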
Table A.3. Estimated annualized burden hours
| Respondents/activity | Number of sites | Number of respondents per site | Responses per respondent | Total responses | Hours per response | Total hour burden | Average hourly wage | Total hour cost burden ($) |
| State official interviews | 25 | 3 | 1 | 75 | 1.5 | 113 | $69.25a | $7,825.25 |
| CCBHC interviews | 10 | 2 | 1 | 20 | 1.5 | 30 | $50.40b | $1,512.00 |
| CCBHC survey | 231e | 1 | 1 | 231 | 4 | 924 | $51.16c | $47,271.84 |
| CCBHC client focus groups | 1 | 8 | 1 | 8 | 1.5 | 12 | $34.55d | $414.60 |
| Total | | | | | | 1,079 | | $57,023.69 |
a. State government, professional and related category (https://www.bls.gov/news.release/ecec.t03.htm)
b. Occupational Outlook Handbook: Medical and Health Services Managers (https://www.bls.gov/ooh/management/medical-and-health-services-managers.htm#tab-1)
c. BLS category of clinical and counseling psychologists at outpatient care centers (https://www.bls.gov/oes/current/oes193033.htm)
d. Average hourly and weekly earnings of all employees on private nonfarm payrolls by industry sector, seasonally adjusted (https://www.bls.gov/news.release/empsit.t19.htm)
e. The estimated number of CCBHCs for states not yet selected for the demonstration is based on the average number of CCBHCs per state among states currently participating in the demonstration, multiplied by the number of states expected to join the demonstration. We also included the number of clinics that currently participating states project they will add to the demonstration in coming years.
There will be no capital, start-up, operation, maintenance, or purchase costs incurred by the respondents participating in data collection for the evaluation.
We estimate that two ASPE employees will each devote 10 percent of their time to this effort. Annual costs of ASPE staff time are estimated to be $24,000. Additional costs include the contract ASPE awarded for these evaluation activities ($547,419 over three years, or an annualized cost of $124,209.50). The total estimated average cost to the government per year is $182,473.
This is a new data collection.
The evaluation team will incorporate aggregate results from the evaluation in text and charts in the following documents which will serve as the basis for annual reports to Congress developed by ASPE:
An annual report, due in August 2024, that will cover: 1) impacts of the CCBHC demonstration on measures of access, quality, and costs; and 2) complementary qualitative findings related to access, scope of services, quality, and cross-cutting issues.
An annual report, due in August 2025, that will cover clinic-level findings related to access, scope of services, quality, and costs.
An annual report in 2026 that will cover: 1) performance on quality measures and changes in performance over time; 2) implementation findings related to access, scope of services, and quality; 3) costs; and 4) complementary qualitative and survey findings related to access, scope of services, quality, costs, and cross-cutting issues.
An annual report in 2027 that will cover: 1) changes in caseload characteristics and Medicaid costs over time; and 2) new state experiences related to access, scope of services, quality, and costs.
A final summative report in 2028.
ASPE may also incorporate the aggregate results from the cross-site evaluation into journal articles, scholarly presentations, and congressional testimony related to the outcomes of the CCBHC demonstration.
We are requesting no exemption.
There are no exceptions to the certification. These activities comply with the requirements in 5 CFR 1320.9.
List of attachments
A. CCBHC Demonstration Evaluation State Official Interview Protocol
B. CCBHC Demonstration Evaluation Clinic Interview Protocol
C. CCBHC Demonstration Evaluation Client Focus Group Protocol
D. CCBHC Demonstration Evaluation Client Focus Group Consent Form
E. CCBHC Demonstration Evaluation Clinic Survey Template