Pathways to Partnerships Program Evaluation

Tracking and OMB Number: 1820-NEW

Revised: 2/5/25


SUPPORTING STATEMENT (PART B)

FOR PAPERWORK REDUCTION ACT SUBMISSION

B. Collection of Information Employing Statistical Methods

The U.S. Department of Education’s Rehabilitation Services Administration (RSA) requests clearance for new data collection activities to support the evaluation of the 84.421E Federal fiscal year 2023 Disability Innovation Fund (DIF), Pathways to Partnerships Innovative Model Demonstration Project. The purpose of the DIF, as provided by the Consolidated Appropriations Act, 2022 (Pub. L. 117-103), is to support innovative activities aimed at increasing competitive integrated employment, as defined in section 7 of the Rehabilitation Act of 1973 (Rehabilitation Act) (29 U.S.C. 705(5)), for children, youth, and other individuals with disabilities. The program aims to create systematic and seamless approaches to offering services to children with disabilities ages 10 to 13 and youth with disabilities ages 14 to 24 through collaborations among State vocational rehabilitation (VR) agencies, State education agencies (SEAs), local education agencies (LEAs), Federally funded Centers for Independent Living (CILs), and other organizations offering services to this population. RSA is investing a total of $198,975,322 in grant funding in the 20 grantee states through the Federal fiscal year 2023 DIF program.

This request covers primary data collection activities for the National Evaluation of the Pathways to Partnerships Program. These activities include the following:

  • Surveys and interviews with program participants or their parent or guardian

  • Surveys and interviews with project staff

  • Surveys with State VR, SEA, and CIL directors

  • Collecting project administrative data (staff rosters, cost worksheets, and website use data) from project directors

B.1 Respondent Universe and Sampling Methods

Describe the potential respondent universe (including a numerical estimate) and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, state and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

For each administrative and primary data source proposed, Exhibit B.1 summarizes the respondent universe, sampling method, sample size, and expected response rate.

Exhibit B.1. Respondent universe, sample, and expected response rate for study data sources

Data source and respondent | Respondent universe (estimated) | Type of sample | Sample size per round | Expected response rate

Interviews
Interviews with project and partner staff | 1,000 | Purposive sample | 200a | 100%
Interviews with youth participants with disabilities | 121,000 | Purposive sample | 100 | 100%
Interviews with parents or guardians of child participants with disabilities | 121,000 | Purposive sample | 100 | 100%

Survey data
Survey of State VR, SEA, and CIL directors | 570 | Census | 570a | 80%
Survey of project and partner staff | 1,000 | Census | 1,000a | 80%
Survey of child and youth participants with disabilities | 121,000 | Purposive sample | 48,000 | 100%

Administrative data
Project staff rosters | 20 | Census | 20a | 100%
Cost data | 20 | Census | 20 | 100%
Website use data | 20 | Census | 20b | 100%

a This data collection occurs twice over the course of the evaluation.

b This data collection occurs twelve times over the course of the evaluation.

CIL = Center for Independent Living; SEA = State education agency; VR = vocational rehabilitation.

B.2 Statistical Methods for Sample Selection and Degree of Accuracy Needed

Describe the procedures for the collection of information, including:

  • Statistical methodology for stratification and sample selection.

  • Estimation procedure.

  • Degree of accuracy needed for the purpose described in the justification.

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

B.2.1 Sample Selection

Each data collection effort will include all projects. The sampling approach, which will vary by data collection effort, is described in more detail below.

Interviews

All projects will be included in qualitative data collection efforts, which involve two site visits: one in Year 2 of the study and one in Year 4. During each site visit, the study team will interview project leaders and staff of partner organizations and observe project activities, such as staff training sessions or communities of practice. The activity the team will observe will be selected based on the project schedule and activities underway when the site visit is scheduled. Interviews with child and youth participants with disabilities and their parents or guardians will occur only in Year 4 and will be scheduled outside of the site visit. The study team will purposively select interview sample members. The study team will work with the project director to select project leader and staff participants based on their role, the population they serve, and their length of time involved in the project. The team will select child and youth participants with disabilities based on information they provide via the survey of child and youth participants with disabilities. Participants will be asked to opt into the qualitative data collection effort. The team will seek to ensure selected participants include a range of people with different characteristics. The team expects to interview up to 30 people in each of the 20 projects during or outside the site visits.

Survey Data Collection

Survey of State VR, SEA, and CIL directors. The study team will not use sampling methods for the State VR, SEA, and CIL director survey. It will survey directors of State VR agencies (including combined, general, blind, and tribal VR agencies), SEA special education divisions, and Federally funded CILs in all 50 states and the District of Columbia during Project Years 2 and 4 (570 staff in total per survey administration). The purpose of the survey is to capture data about the service environment in grantee and non-grantee states, including the extent of collaboration across organizations, features of the service system available to children and youth with disabilities, and indicators of a seamless delivery system. The survey will allow the study team to compare the service environment in grantee and non-grantee locations and changes in those environments over time. Some of the questions in this survey will overlap with those of the project and partner staff survey discussed below.

Survey of project and partner staff. The study team will survey all staff who are implementing the 20 Pathways to Partnerships projects. Project and partner staff include project staff whose salaries are paid by the grant, as well as staff in SEAs, CILs, VR agencies, and LEAs who are implementing or coordinating with project activities but whose salaries are not paid by the grant. Such staff may include special education teachers and specialists in LEAs, VR counselors who serve children and youth with disabilities, and direct service and management staff of partner CILs. The study team anticipates surveying up to 50 staff in each of the 20 projects during Project Years 2 and 4 (100 staff in total per project). The study team will use staff rosters collected from the projects to identify the sample universe; at this time, the universe for this survey is unknown but estimated at 1,000. The project and partner staff surveys will help capture the nature, frequency, and quality of cross-agency collaborations, staff members’ perceptions of the projects’ operations, the service environment, and changes in these factors over time.

Survey of child and youth participants with disabilities. At the time of project enrollment in Project Years 2 to 4, project staff will ask enrolling adult-age youth or the parents and guardians of enrolling minor children to complete a survey for the study to capture information about enrollees’ characteristics and experiences before they begin participating in project interventions. Such information will include demographic characteristics, awareness and use of available services, and expectations about children and youth with disabilities’ education and employment. The information will help characterize children and youth participants for the participation analysis and enable comparisons between early and late enrollees in their pre-enrollment experiences to identify systems-level changes over time. Child and youth participants with disabilities will also receive an opportunity to opt into the Year 4 interview data collection effort. If they choose to opt into the interview sample, the survey will also collect their contact information.

All 20 projects will have procedures for enrolling children and youth with disabilities into project services. The 20 projects collectively expect to serve more than 100,000 children and youth with disabilities during Project Years 2 to 5. The survey will be administered to the first 800 participants who enroll in each project year during Project Years 2 to 4, resulting in a maximum sample of 48,000. If projects enroll fewer than 800 participants in a study year, the survey will be administered to all participants in that year.
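For concreteness, the sketch below illustrates this selection rule (the first 800 enrollees per project per year, or all enrollees if fewer). It is an illustrative sketch only; the file and column names are hypothetical placeholders, not the study team's actual data structures.

```python
# Sketch of the enrollment-based sampling rule described above: within each
# project and project year, take the first 800 enrollees by enrollment date
# (or all enrollees, if fewer). File and column names are hypothetical.
import pandas as pd

enroll = pd.read_csv("enrollment_records.csv")  # hypothetical enrollment file
enroll = enroll.sort_values("enrollment_date")
survey_sample = enroll.groupby(["project_id", "project_year"]).head(800)
print(survey_sample.groupby(["project_id", "project_year"]).size())
```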

Administrative Data Collection

The study team will not use sampling methods for administrative data; the team will collect the following administrative data from all projects.

Project staff rosters. Contact information, including the roles, affiliations, and geographic locations of project and partner staff, is necessary to describe projects’ organizational structures and to create the samples for data collection activities that involve project staff and partners (project and partner staff surveys and site visit interviews and observations). The study team will ask project directors or their designee to complete a template with the roster information twice: at the beginning of Year 2 and at the beginning of Year 4 (before administering the project staff surveys and scheduling the site visits). Collection at two points will ensure that the sample frame includes the correct people and accommodate staff turnover between the two data collection years.

Cost information. The study team will develop estimates of each project’s costs to understand where projects invested their resources and to estimate the average costs of key activities. The study team will ask project directors or their designee to complete a template to report cost information in Project Year 4. The form will focus on project cost data representing annual costs for each State’s Project Years 1 to 3 (October 2023 to September 2026), the period spanning project start-up through a steady state when projects are neither ramping up nor winding down. In addition, the form will request information to help understand how the costs are allocated across specific key activities.

Website use data. The study team will collect website use data to measure the extent to which the project websites were used. This information will help in understanding the reach of the projects’ websites and the information needs of children and youth with disabilities, their families, and the youth service professionals who serve them. The study team will ask project directors or their designee to complete a template to report website use data quarterly in Project Years 2 to 4. The form will focus on website metrics from the prior quarter, such as total users, total unique users, and time spent on the site.

B.2.2 Estimation Procedures

Exhibit B.2 indicates the analysis types to be used for each research question.

Qualitative analyses. The study team will code qualitative data using NVivo or a similar software package, applying emergent thematic codes to identify trends and themes.

Descriptive analyses. The study team will calculate the average values of continuous variables. For categorical variables, the study team will calculate the percentage of sample members in each category. The team will also report measures of the precision of these estimates, such as confidence intervals.
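To illustrate these descriptive calculations, a minimal sketch follows. It is not the study team's production code, and the file and column names are hypothetical placeholders.

```python
# Sketch of the descriptive estimates described above: the mean of a
# continuous variable and category percentages, each with a 95 percent
# confidence interval. File and column names are hypothetical.
import numpy as np
import pandas as pd
from scipy import stats

df = pd.read_csv("staff_survey.csv")  # hypothetical analysis file

# Mean and 95% confidence interval for a continuous variable
x = df["collaboration_score"].dropna()
half_width = stats.t.ppf(0.975, len(x) - 1) * x.sem()
print(f"mean = {x.mean():.2f}, "
      f"95% CI = [{x.mean() - half_width:.2f}, {x.mean() + half_width:.2f}]")

# Percentage of sample members in each category, with normal-approximation CIs
p = df["role"].value_counts(normalize=True)
n = df["role"].notna().sum()
se = np.sqrt(p * (1 - p) / n)
print(pd.DataFrame({
    "percent": 100 * p,
    "ci_low": 100 * (p - 1.96 * se),
    "ci_high": 100 * (p + 1.96 * se),
}))
```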

Regression analyses. The study team will use regression analyses to estimate the effect of the project innovations on staff members’ experiences with interagency collaboration and children and youth with disabilities’ uptake of services. For analyses that examine service use by children and youth with disabilities, the study team plans to control for covariates that represent baseline characteristics of enrollees and their families, including demographic characteristics and disability type. For analyses that examine staff members’ collaboration experiences, the study team plans to control for covariates that represent baseline characteristics of staff and their working environment, such as demographic characteristics, organization, role, and whether the staff’s agency previously collaborated with other organizations.
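As a hedged illustration of this approach, the sketch below regresses a service-use indicator on an innovation-exposure indicator plus baseline covariates. The variable names, file, and model form (a logit) are assumptions for illustration, not the study's actual specification.

```python
# Sketch of a regression of service uptake on exposure to a project
# innovation plus baseline covariates. All names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("participant_analysis_file.csv")  # hypothetical file

model = smf.logit(
    "used_services ~ innovation_exposed + age + C(disability_type)"
    " + C(race_ethnicity) + C(project_id)",
    data=df,
).fit()
print(model.summary())  # coefficient on innovation_exposed is of interest
```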

Comparative analyses. To compare groups, such as comparing the 20 project states with one another, the study team will report the magnitudes of differences between groups and assess their statistical significance.

Exhibit B.2. Estimation methods for each study research question

Topic area and research question | Qualitative analyses | Descriptive analyses | Regression analyses | Comparison group analyses

1. Implementation questions
1.1. What are the primary innovation models that projects implemented? | | X | |
1.2. Who are the key project partners, what are their roles, and in what ways are they collaborating? | | X | |
1.3. What new services or resources are the projects offering to children, youth, and parents as part of the innovation models? | | X | |
1.4. What new training or resources are the projects offering to youth service professionals who interact with children, youth, and parents? | | X | |
1.5. What are the most significant facilitators and challenges the projects experienced when developing and implementing the new partnerships and services? | X | X | |

2. Participation
2.1. What has been the projects’ experience with uptake of the new services and resources for children, youth, and parents? | X | X | |
2.2. What facilitators and challenges have the projects experienced when connecting with families and helping them use the projects’ services and resources? How have the projects addressed the challenges? | X | X | |
2.3. What has been the projects’ experience with uptake of the new training and resources for youth service professionals? | X | X | |
2.4. What facilitators and challenges have the projects experienced when connecting youth service professionals with trainings and other project resources? How have the projects addressed the challenges? | X | X | |

3. Outcomes and impacts
3.1. What impacts have the projects had on support staff and agency partnerships? | | X | X | X
3.2. How have the projects changed youth service professionals’ knowledge and skills as they interact with youth and parents? | X | X | |
3.3. To what extent do youth and parents know where to go to receive education and employment services and resources in their communities? | X | X | X | X
3.4. To what extent have children and youth used key education and employment services and resources in their communities? | X | X | |
3.5. To what extent do children and youth have unmet education or employment service needs? How has this changed over time? | X | X | |
3.6. Have the education and employment outcomes of youth with disabilities improved over the course of the project? Have they improved relative to comparable youth in other states or in non-pilot parts of the grantee states? | X | X | X | X

4. Costs
4.1. How were funds allocated across specific activities during Year 3 (a steady state year after project implementation and before close-out)? | | X | |
4.2. What was the average cost of providing services, resources, and training to participants? | | X | |

5. Systems change
5.1. To what extent have the projects achieved a seamless transition system for children and youth with disabilities? | X | X | |
5.2. Are the partnerships developed under the project likely to persist after the grant period? | X | X | |
5.3. After the grant ends, are projects likely to sustain any of the new services or resources they developed? | X | X | |
5.4. What lessons or advice would the projects offer to other states or local areas that want to achieve a seamless transition system for children and youth with disabilities? | X | X | |
B.2.3 Degree of Accuracy Needed for the Purpose Described

The degree of precision will vary across analyses depending on the topic and data source used. Exhibit B.3 presents the estimated minimum differences the study will be able to detect in analyses that use the data sources described in this package under different sample size assumptions. These analyses include those in which the state is the unit of analysis (grantee states versus non-grantee states) and those that compare individuals, such as children and youth enrolled early versus late in the project or staff surveyed in Project Year 2 versus staff surveyed in Project Year 4. The minimum detectable differences (MDDs) are expressed in percentage points of the outcomes.

The relevant sample size will differ depending on the data source used in the analysis and whether data are pooled or analyzed separately by project. Estimated sample sizes for key sources of data include the following:

  • Children and youth with disabilities enrolled in the projects and completing the survey of child and youth participants with disabilities: up to 48,000 in total and ranging from 500 to 2,400 per project

  • State VR, SEA, and CIL directors surveyed in Project Years 2 and 4 in 20 project and 20 comparison (non-project) states: 1,140 (570 per survey round)

  • Project and partner staff surveyed in Project Years 2 and 4: 1,600 (800 per survey round)

The MDDs will be largest for outcomes derived from the State VR, SEA, and CIL director survey. Nonetheless, differences in the outcomes relevant to this group (State policy and service system features) would need to be large to generate policy-relevant impacts on the ultimate outcomes of children and youth with disabilities.

The MDDs will also be large for subgroup analyses. If sample sizes permit, the study team will examine variation in selected outcomes of children and youth with disabilities across four subgroups: (1) age, (2) race and ethnicity, (3) disability, and (4) geographic location (Exhibit B.4). These characteristics are correlated with the employment of people with disabilities.1,2



Exhibit B.3. MDDs for selected combined treatment and comparison group sample sizes

Minimum detectable differences (percentage points) for outcomes with the indicated prevalence:

Total number of people included in analysis | 50 percent prevalence | 30 percent prevalence | 10 percent prevalence | 5 percent prevalence

Comparison of 20 project states with 20 comparison states
1,000 | 10.0 | 9.6 | 6.8 | 5.2
300 | 17.0 | 17.3 | 13.8 | 11.5

Comparison of a subgroup of five project states with another subgroup of five project states
2,500 | 16.3 | 16.0 | 12.1 | 9.8
250 | 20.5 | 20.6 | 16.1 | 13.3
100 | 25.7 | 26.3 | 21.4 | 18.5

Comparison of an early cohort with a later cohort
10,000 | 2.1 | 2.0 | 1.3 | 0.9
1,000 | 6.7 | 6.3 | 4.4 | 3.3
500 | 9.4 | 9.0 | 6.3 | 4.9

Note: The minimum detectable differences were calculated assuming (a) a one-tailed test; (b) 10-percent significance (α) level; (c) an 80-percent level of power; and, for State-level comparisons, (d) an intraclass correlation of 0.05. The outcome prevalence rates for each column pertain to the group with the lowest prevalence rate.
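For readers who want to trace the assumptions in the note, the sketch below approximately reproduces several Exhibit B.3 entries using a standard normal-approximation formula for the difference between two proportions. It is an illustrative reconstruction, not the study team's code; the exact software and rounding used for the exhibit are not specified here, so small discrepancies are expected.

```python
# Approximate MDD calculation under the Exhibit B.3 note's assumptions:
# one-tailed test, alpha = 0.10, 80 percent power, equal group sizes, and,
# for state-level comparisons, an intraclass correlation of 0.05 applied
# as a design effect.
from scipy.stats import norm

def mdd_percentage_points(n_total, prevalence, n_clusters=None, icc=0.05,
                          alpha=0.10, power=0.80):
    z = norm.ppf(1 - alpha) + norm.ppf(power)  # about 1.28 + 0.84
    se = (2 * prevalence * (1 - prevalence) / (n_total / 2)) ** 0.5
    if n_clusters:                              # inflate for clustering within states
        m = n_total / n_clusters                # average respondents per state
        se *= (1 + (m - 1) * icc) ** 0.5        # design-effect adjustment
    return 100 * z * se

# Early versus later cohort, 50 percent prevalence (Exhibit B.3: 2.1, 6.7, 9.4)
for n in (10_000, 1_000, 500):
    print(n, round(mdd_percentage_points(n, 0.50), 1))

# 20 project states versus 20 comparison states, n = 1,000 (Exhibit B.3: 10.0)
print(round(mdd_percentage_points(1_000, 0.50, n_clusters=40), 1))
```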


Exhibit B.4. Potential analytic subgroups for the National Evaluation of the Pathways to Partnerships Program

Subgroup

Age

  • Ages 10 to 13

  • Ages 14 to 24

Race and ethnicity

  • Hispanic or Latino

  • White, non-Hispanic

  • Black, non-Hispanic

  • Asian, non-Hispanic

  • More than one race

  • Other, non-Hispanic

Disability status

  • Physical

  • Learning

  • Intellectual or developmental

  • Behavioral or emotional

Geographic location

  • Rural

  • Suburban

  • Urban


B.2.4 Unusual Problems Requiring Specialized Sampling Procedures

The study team does not anticipate any unusual problems that require specialized sampling procedures.

B.2.5 Use of Periodic (Less Frequent than Annual) Data Collection Cycles to Reduce Burden

To minimize burden, the study team will collect the study’s data as infrequently as possible while fulfilling the study’s analytic requirements. The team will complete the following data collection activities only once:

  • Administer the survey of child and youth participants with disabilities (approximately March 2025 to December 2027).

  • Collect cost records (June 2027).

By necessity, the study team will collect other data, including the following, more frequently:

  • To inform survey and qualitative data collection, the team will collect staff rosters from Pathways to Partnerships projects once in 2024 and once in 2026.

  • The study team will conduct the State VR, SEA, and CIL director and project staff and partner surveys, as well as qualitative data collection (interviews with youth, parents or guardians, and project staff), twice: once in late 2024/early 2025 and once in late 2026. The baseline measures of experiences and practices captured in these efforts are critical to the analyses because they will allow the study to characterize projects’ baseline practices and participants’ experiences. The follow-up data collected two years later will allow the study to examine changes in practices and experiences over time.

B.3 Methods to Maximize Response Rates and Deal with Nonresponse

Describe methods to maximize response and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.

B.3.1 Maximizing Response Rates

Across all aspects of data collection, the study team will use several strategies that have proven successful on other studies.

Instrument development. Before developing the instrument, the study team will clearly define research objectives so that the survey does not impose undue burden with questions that do not inform these objectives. The study team will keep surveys short and targeted. Because longer survey instruments increase respondent burden, anxiety, and fatigue, they can lead to lower completion rates and reduce data integrity and validity. For all questions, the study team will use plain language and avoid complexity.

Specific methods for maximizing response rates in the collection of data in each study component are as follows.

Survey programming. The study team will program skip patterns in all surveys so that respondents need not read or respond to questions that do not apply to them. The study team will also program the survey to provide respondents with information about the survey length and a progress bar that indicates completion percentage. Finally, the study team will program the survey to keep text instructions minimal, using visual cues to signal to the respondent how to continue progressing through the survey.

The team will use Voxco survey software to design and administer surveys. Voxco Online’s standard question types have been tested and are compliant with Section 508 of the Rehabilitation Act and the Web Content Accessibility Guidelines (WCAG) 2.0; the system works with screen readers, making it accessible to people with vision impairments. The self-administered surveys will be online and can be completed at the respondent’s convenience on multiple devices, including a smartphone, tablet, or desktop PC. Voxco optimizes the user experience so respondents do not have to resize their screens for maximum visibility: the software detects the type of device being used and reorganizes and reformats survey elements to provide an intuitive experience tailored to that device. The software supports ease of use and accessibility by adhering to the WCAG principles of being perceivable, operable, understandable, and robust. The study team will design the surveys with a high degree of visual appeal and an intuitive flow that requires few text instructions. Respondents will have the option to save their progress and continue later. The surveys will use drop-down response categories or radio button choice lists whenever appropriate so respondents can quickly select from a list, and they will use dynamic questions, automated skip patterns, and choice restriction logic so respondents see only the questions that apply to them (including those based on answers provided previously in the survey) and their answers are restricted to only those intended by the question.

Outreach. The study team will consistently brand mail and email outreach to assure recipients of the legitimacy of the data collection effort. The study team will personalize outreach to the desired respondent using their name. All outreach materials to State directors and project staff (Appendix H) and children and youth with disabilities and their parents (Appendix I) will stress the importance of the potential respondent’s participation and the confidentiality of their response. They will include a toll-free number to address any concerns or questions about the survey. The outreach materials for State directors will include endorsement letters from key Federal stakeholders (including staff at the Office of Special Education Programs, RSA, and the Administration for Community Living).

Incentives. The study team will not offer incentives for completing the survey of child and youth participants with disabilities because the survey will be administered as part of the projects’ enrollment processes. The team will provide a $30 incentive to eligible CIL staff who participate in the State director and project staff surveys. Children and youth with disabilities and their parents or guardians who participate in interviews will also receive a $30 gift card in appreciation of their time. All incentives will be delivered using Tango Cards, which allow respondents to select the vendor gift card of their choice. The study team will create a personalized, project-specific email template that includes a thank-you message, instructions and a link for redeeming the e-gift card, a Tango help desk phone number, and an email address and phone number for respondents who need help or have not received their gift cards in a timely manner. After choosing how they will redeem the e-gift card, the respondent will receive a second email from the chosen vendor (or vendors). This email contains the actual gift card, which might include a PIN, a printable bar-coded gift card image, or both. For respondents who lose or cannot access the gift card redemption links, the team can retrieve and forward the links.

B.3.2 Dealing with Non-Response

Monitoring nonresponse. The study team will closely monitor completion rates by Pathways to Partnerships project and respondent characteristics through weekly reports. At the respondent level, the team will follow up with nonrespondents via email and phone to encourage survey completion. If the study team finds that some projects have lower response rates, it will reach out to the relevant Pathways to Partnerships project staff to understand potential reasons for nonresponse. When needed, the study team will ask project directors to promote the surveys. As the team tracks response rates, it will explore the need for a nonresponse bias analysis and development of nonresponse weights.
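If nonresponse weights prove necessary, one standard construction is to model each frame member's probability of responding and weight respondents by the inverse of that probability. The sketch below illustrates this idea under hypothetical frame variables; it is an assumption-laden illustration, not a committed design.

```python
# Sketch of inverse-propensity nonresponse weights built from the sample
# frame (the staff roster). Column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression

frame = pd.read_csv("staff_roster_frame.csv")  # hypothetical sample frame
X = pd.get_dummies(frame[["organization_type", "role", "state"]])
y = frame["responded"]                         # 1 = completed survey, 0 = did not

propensity_model = LogisticRegression(max_iter=1000).fit(X, y)
frame["p_respond"] = propensity_model.predict_proba(X)[:, 1]

# Respondents receive weight 1 / p_respond; nonrespondents receive no weight
frame.loc[y == 1, "nr_weight"] = 1.0 / frame.loc[y == 1, "p_respond"]
```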

Addressing nonresponse, missing data, and attrition. Data might be missing from quantitative or administrative data sources; the study team will carefully analyze variables with missing data. By accounting for missing data, the team will avoid biasing impact estimates and outcome measures for the National Evaluation of the Pathways to Partnerships Program.

The study team will use various strategies to account for missing data across analyses. If an outcome measure is missing at random, the team will omit that observation from the analysis of that outcome. If an outcome is missing but not at random (for example, when survey respondents skip certain questions based on their previous answers), the study cannot omit the observation from the analysis without biasing the findings. In these situations, the study team will use more advanced techniques, such as multiple imputation, to account for patterns of missing data, or will reassess whether those outcomes can be meaningfully analyzed.
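As an illustration of the multiple-imputation technique named above, the sketch below uses the MICE (multiple imputation by chained equations) implementation in statsmodels. The variables, file, analysis model, and settings are hypothetical, and the study team may use different software or methods.

```python
# Sketch of multiple imputation with chained equations: impute missing
# values, fit the analysis model across 20 imputed datasets, and pool the
# results. Names are hypothetical.
import pandas as pd
import statsmodels.api as sm
from statsmodels.imputation.mice import MICE, MICEData

analysis_df = pd.read_csv("participant_outcomes.csv")  # hypothetical file
mice_data = MICEData(analysis_df[["used_services", "age", "disability_code"]])
mice = MICE("used_services ~ age + disability_code", sm.OLS, mice_data)
results = mice.fit(n_burnin=10, n_imputations=20)
print(results.summary())  # coefficients pooled across imputations
```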

For the outcomes and impact analysis, the study team will carefully consider how to analyze unrealized outcomes among participants. Many participants’ education and employment outcomes will be realized after the timeline of the evaluation. The team plans to focus on service use outcomes that apply to all participants and analyze the employment and education outcomes for the subgroup of youth with disabilities who were old enough to attain such outcomes.

B.4 Tests of Procedures or Methods to be Undertaken

Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

During the 60-day public comment period, the study team pretested each survey instrument with fewer than nine people who represent each respondent population (staff or children and youth participants with disabilities).

Survey of child and youth participants. The child and youth participant survey was pretested with five participants: one individual with disabilities who was over age 18 and completed the survey on their own, and four parents of children with disabilities who completed the survey on behalf of their children. During each pretest, the participant completed the online survey via a link shared during the meeting. Afterward, the study team debriefed with the participant to review any issues they encountered and gather additional feedback on the survey. Interviewers followed a protocol to probe certain items to ensure they were phrased clearly and collected accurate information. After three pretests, the child and youth participant survey was revised and pretested again with two interviewees. The expected respondent burden for the baseline survey was 15 minutes; actual burden averaged 7 minutes (7 minutes before revisions; 8 minutes after revisions), about 8 minutes shorter than expected. The study team revised the survey in response to the pretest results. The revisions include providing more detailed instructions for certain questions, providing additional definitions for certain data elements, and adding supplemental prompts or response options to increase the clarity of existing questions.

State director and project staff survey. The state director and project staff survey was pretested with seven participants, using a process similar to that used for the child and youth participant survey. The expected respondent burden for the follow-up survey was 30 minutes; actual burden averaged 17 minutes (19 minutes before revisions; 14.33 minutes after revisions), about 13 minutes shorter than expected. After four pretests, the state director and project staff survey was revised and then pretested again with two interviewees. The study team revised the survey in response to the pretest results. The revisions include providing more detailed instructions and rationale for the data collection and providing additional definitions for certain data elements.

Administrative data collection forms. All administrative data collection forms – the staff roster, cost worksheet, and website use form – were tested with three project staff members. Each pretester reviewed all three forms with a member of the study team to provide feedback on each form’s anticipated ease of use, points of confusion, suggested changes, and the estimated length of time it would take to complete the forms.

  • Staff roster. Project staff pretesters estimated that it would take between 10 and 90 minutes to complete the staff roster, depending on the size of their project’s workforce and their internal roster system. This estimated time-to-complete was significantly lower than the eight hours projected by the study team. The study team revised the burden estimate to reflect this feedback, reducing it to three hours. The study team also revised the instrument to incorporate other feedback collected during pretesting, including clarifying that respondents should provide information about where staff members provide services rather than where they live.

  • Cost worksheet. Project staff pretesters estimated that it would take between two and three hours to complete the cost worksheet. This estimated time-to-complete was significantly lower than the six hours projected by the study team. The study team revised the burden estimate to reflect this feedback, reducing it to three hours. The study team also revised the instrument to incorporate other feedback collected during pretesting, including providing additional examples for the types of expenses that would qualify for each section, collapsing some categories of expenses, and adding headers to split costs into relevant categories.

  • Website use form. Project staff pretesters estimated that it would take between 10 and 20 minutes to complete the website use form. To be conservative, the study team maintained the one-hour burden estimate originally proposed. Pretesters did not recommend any changes to this instrument.

Having made these instrument revisions, the study team will program the survey instruments for administration via computer-assisted web interviewing methods and the forms via an online platform. Before deployment, the team will test the survey instruments and forms to ensure they function as designed. This process will include extensive manual testing for survey skip patterns, fills, and other logic, as well as many configurations of form entries. To reduce data entry errors, numerical entries will be checked against an acceptable range, and, when appropriate, prompts will be presented for valid but unlikely values. This testing will increase the accuracy of data collected while minimizing respondent burden.
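To make the range-check idea concrete, here is a minimal sketch of the kind of validation logic described above. The thresholds are hypothetical examples for illustration, not values from the actual forms.

```python
# Sketch of range checks for a numeric form entry: hard limits reject the
# entry, and a softer "valid but unlikely" range triggers a confirmation
# prompt. All thresholds are hypothetical.
def check_entry(value, hard_min=0, hard_max=1_000_000,
                likely_min=100, likely_max=50_000):
    if not hard_min <= value <= hard_max:
        return "reject: outside acceptable range"
    if not likely_min <= value <= likely_max:
        return "prompt: valid but unlikely value; please confirm"
    return "accept"

print(check_entry(2_000_000))  # reject
print(check_entry(75_000))     # prompt to confirm
print(check_entry(12_500))     # accept
```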

The study team did not pretest the site visit interview protocols because they are closely modeled on protocols that have been used effectively in other studies and because they will serve as general guides for semistructured conversations.

No public comments were received during the 60-day notice period.

B.5 Individuals Consulted on Statistical Aspects of the Design

Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other persons who will actually collect and/or analyze the information for the agency.

Exhibit B.5 lists the people who were consulted on the statistical aspects of the design of the National Evaluation of the Pathways to Partnerships Program.

Exhibit B.5. People consulted on statistical aspects of the evaluation design

Name | Title | Telephone number

Mathematica
John Deke | Senior Research Fellow | (609) 275-2230
Lisbeth Goble | Principal Survey Researcher | (312) 994-1016
Barbara Harris | Senior Researcher | (202) 554-7568
Gina Livermore | Senior Research Fellow | (202) 264-3462
Purvi Sevak | Disability Area Director | (609) 945-6596

M. Davis and Company
Kim Dorazio | Vice President | (215) 790-8903

RSA
Diandrea Bailey, PhD | Project Officer, Contracting Officer Representative | (202) 245-6244
Sheryl Fenwick | Budget Analyst, Alternate Contracting Officer Representative | (202) 245-6345
Cassandra Shoffler | Project Officer, DIF Program Manager | (202) 245-7827
Dr. Ashley Brizzo | Director, Training and Service Programs Division | (202) 245-6379
Douglas Zhu | Chief, Training Programs Unit | (202) 987-0127



1 Carter, E. W., Austin, D., & Trainor, A. A. (2012). Predictors of Postschool Employment Outcomes for Young Adults With Severe Disabilities. Journal of Disability Policy Studies, 23(1), 50-63.

2 Sevak, P., O’Neill, J., Houtenville, A., & Brucker, D. (2018). State and Local Determinants of Employment Outcomes Among Individuals With Disabilities. Journal of Disability Policy Studies, 29(2), 119-128.
