
SUPPORTING STATEMENT FOR
NATIONAL RECREATIONAL BOATING SAFETY SURVEY
OMB Control No.: 1625-0089
SUPPORTING STATEMENT PART B
COLLECTION INSTRUMENT(S): Survey Questionnaires


  1. Collection of Information Employing Statistical Methods


The agency should be prepared to justify its decision not to use statistical methods in any case where such methods might reduce burden or improve accuracy of results. When Item 17 on the Form OMB 83-I is checked "Yes", the following documentation should be included in the Supporting Statement to the extent that it applies to the methods proposed:


1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

The U.S. Coast Guard has partnered with the National Opinion Research Center (NORC) at the University of Chicago, a nonprofit organization, through a cooperative agreement to administer the next National Recreational Boating Safety Survey (NRBSS). A cooperative agreement is distinguished from a grant by the awarding agency’s substantial involvement. NORC’s substantial involvement includes helping the Coast Guard develop the survey questionnaires and sampling plan, conducting data collection and quality control, writing final reports, and preparing final datasets.


We have divided this section into two parts to describe two universes: the universe of boats and the universe of boating participants. In each section, we provide (a) a definition of the universe, (b) sample sources, (c) sampling and respondent selection methods, (d) population and sample sizes, and (e) expected response rates.

The Universe of Recreational Boats

A Definition of the Universe

The universe for the Exposure Survey is all recreational boats in the U.S. on January 1 of the survey year, where a “recreational boat” is a boat not used for any commercial purpose.

The universe of recreational boats includes:

    • Privately owned boats registered with a State Department of Motor Vehicles or equivalent agency;

    • Privately owned, unregistered boats (e.g., canoes, kayaks, and standup paddleboards);

    • Rented boats captained by private citizens (people who are not licensed commercial boat captains for hire) for the purpose of recreational boating; and

    • Boats accessed through a boat club and captained by private citizens for the purpose of recreational boating.


The universe does not include:

  • Boats owned or captained for commercial purposes;

  • Commercial ships, such as cruise ships, containerships, tankers, or USCG-inspected passenger boats;

  • Uninspected passenger boats or towboats;

  • Foreign-flagged commercial or recreational boats not registered in the United States; and

  • Pool toys such as inner tubes or pool floats.

Sample Sources

For the Exposure Survey, the universe is all recreational boats in the United States. There are three sources for selecting a sample of recreational boats:

  1. State boat registration databases that will cover registered boats in states where lists are available for the National Recreational Boating Safety Survey (NRBSS).

  2. An address-based sample (ABS) that will cover registered and documented boats, registered boats in states where lists are not available, unregistered boats, and rented boats or boats available through a boat club.

  3. A nonprobability sample from either online panels or lists of likely boaters from cooperating organizations that will cover registered and documented boats, registered boats in states where lists are not available, unregistered boats, and rented boats or boats available through a boat club.


Most states make their registration databases available to the public. These lists will be used for state sampling in the states in which they are available. ABS and nonprobability online panels will be used for data collection in all states, regardless of whether a boat registration database is available, to identify unregistered boat owners in states where registration lists are available and to cover all boats in states where registration lists are not available.

Sampling and Respondent Selection Methods

The Exposure Survey will be conducted throughout 2026 and will collect information on registered and unregistered, owned and rented recreational boats. There are three sampling sources for the Exposure Survey: Boat registration lists from states, ABS sample, and nonprobability sample from opt-in survey panels or lists of boat owners from boating organizations.

A sample of boats will be selected from each state that grants the project access to its boating registration list. InfoLink, a vendor that helped provide sample for the last two NRBSS administrations, is working with NORC and the United States Coast Guard (USCG) to gain access to each state’s boating registration list for the sample. Each state sample will be stratified by boat type and designed to be representative of all main boat types identified by the National Association of State Boating Law Administrators (NASBLA) Vessel Type Classification Guide, which aligns with official recreational boat types defined in 33 CFR 174.3. Mail invitations will be directly addressed to the registered boat owner.

An ABS sample will be selected for each state from the United States Postal Service’s Computerized Delivery Sequence file (CDS), which the literature has found to contain essentially all U.S. households that receive mail (Harter et al. 2016). NORC currently holds a license for a complete copy of the USPS CDS file. Unlike the second-generation Delivery Sequence File (DSF2, another standard USPS data product), CDS provides both regular address updates and access to supplemental information important for addressing under-coverage in rural areas. The ABS sample will be geocoded and stratified by ZIP code and propensity to own a boat. ZIP codes will be classified into two to five groups based on the number of registered boats per capita, and ZIP codes with a higher number of registered boats per capita will be oversampled. Addresses will also be matched to commercial data files that identify households likely to own boats. Addresses flagged as likely boat owners will be oversampled to improve precision and reduce the number of people invited to the survey. Mail invitations will be directly addressed to the address resident.
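To illustrate the stratification logic described above, the following sketch classifies ZIP codes by registered boats per address and applies higher sampling rates to higher-density strata. The frame records, cutpoints, and sampling rates are hypothetical illustrations, not the production design.

```python
# Minimal sketch of ZIP-code stratification by registered boats per address;
# all inputs, cutpoints, and rates below are hypothetical.
zip_frame = [
    {"zip": "99501", "addresses": 12000, "registered_boats": 2400},
    {"zip": "99502", "addresses": 18000, "registered_boats": 720},
    {"zip": "99503", "addresses": 25000, "registered_boats": 200},
]

def boats_per_capita_stratum(rec, cuts=(0.01, 0.05)):
    """Classify a ZIP code into low/mid/high strata by boats per address."""
    rate = rec["registered_boats"] / rec["addresses"]
    if rate >= cuts[1]:
        return "high"
    if rate >= cuts[0]:
        return "mid"
    return "low"

# Higher-density strata receive higher sampling rates (oversampling).
sampling_rates = {"high": 0.010, "mid": 0.004, "low": 0.001}

for rec in zip_frame:
    stratum = boats_per_capita_stratum(rec)
    n_sampled = round(rec["addresses"] * sampling_rates[stratum])
    print(rec["zip"], stratum, "sampled addresses:", n_sampled)
```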

NORC will work with both cooperating boating organizations and online nonprobability panel vendors to invite people in each state to complete the survey. NORC will provide the online nonprobability vendors with demographic targets for the sample composition in each state, such as the proportion of interviews from certain age, sex, race/ethnicity, and education groups. The vendors will invite panelists who are eligible for the survey to participate via email or panel portal. NORC will coordinate with cooperating boating organizations to invite boat owners to participate in the survey.

Population and Sample Sizes

The population for the Exposure Survey is all registered and unregistered boats intended for recreational boating. Because state-level estimates are required for the Exposure Survey, state sample sizes are provided in Exhibit 1.

State sample sizes were calculated with the goal of a median Coefficient of Variation (CV) of 0.11 across all states. Sample was first allocated proportionally across the 50 states and Washington, DC. Then, state allocations were adjusted to achieve the target CV of 0.11, with each state expected to have a minimum of 1,000 and a maximum of 5,500 interviews.
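As a simplified illustration of this allocation, the sketch below proportionally allocates a total sample across hypothetical states and then clamps each allocation to the stated floor of 1,000 and ceiling of 5,500 interviews. The actual design goes further, redistributing sample until the median-CV target of 0.11 is met; the household counts here are placeholders, not Exhibit 1 values.

```python
# Simplified allocation sketch: proportional-to-households allocation,
# then clamping to the 1,000 minimum and 5,500 maximum per state.
# In practice, clamped sample is redistributed until the median CV
# across states reaches the 0.11 target.
def allocate(households, total_n, floor=1000, ceiling=5500):
    total_hh = sum(households.values())
    proportional = {s: total_n * hh / total_hh for s, hh in households.items()}
    return {s: min(max(round(n), floor), ceiling) for s, n in proportional.items()}

states = {"Small": 300_000, "Medium": 2_000_000, "Large": 13_000_000}
print(allocate(states, total_n=20_000))
# {'Small': 1000, 'Medium': 2614, 'Large': 5500}
```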

The sample sizes in the table below include the number of owners contacted and the number of addresses sampled for the Exposure Survey along with the number of interviews anticipated through the nonprobability panels.

Exhibit 1: Anticipated Sample Sizes for the Exposure Survey

State | Total Number of Households | Registered Vessels | State Registration List Sample | ABS Sample | Nonprobability Interviews
Alaska | 274,574 | 54,246 | 1,172 | 1,412 | 586
Alabama | 2,016,448 | 246,243 | 2,183 | 2,629 | 1,091
Arkansas | 1,216,207 | 186,549 | 1,598 | 1,925 | 798
Arizona | 2,850,377 | 115,210 | 2,579 | 3,107 | 1,289
California | 13,550,586 | 704,633 | 5,860 | 7,059 | 2,929
Colorado | 2,384,584 | 77,301 | 1,730 | 2,085 | 865
Connecticut | 1,433,635 | 96,585 | 2,162 | 2,605 | 1,080
District of Columbia | 326,970 | 2,534 | 1,172 | 1,412 | 586
Delaware | 402,334 | 35,819 | 1,172 | 1,412 | 586
Florida | 8,826,394 | 978,389 | 5,860 | 7,059 | 2,929
Georgia | 4,092,467 | 336,766 | 3,125 | 3,765 | 1,562
Hawaii | 494,827 | 15,804 | 1,172 | 1,412 | 586
Iowa | 1,330,995 | 217,588 | 1,800 | 2,169 | 900
Idaho | 717,151 | 76,258 | 1,707 | 2,056 | 853
Illinois | 5,056,360 | 225,225 | 5,042 | 6,074 | 2,520
Indiana | 2,726,489 | 219,020 | 2,571 | 3,097 | 1,285
Kansas | 1,175,294 | 90,223 | 2,020 | 2,433 | 1,009
Kentucky | 1,828,680 | 170,432 | 2,044 | 2,463 | 1,022
Louisiana | 1,816,902 | 288,293 | 2,185 | 2,632 | 1,092
Massachusetts | 2,797,776 | 171,556 | 3,840 | 4,626 | 1,919
Maryland | 2,375,984 | 160,886 | 3,602 | 4,339 | 1,800
Maine | 605,338 | 105,270 | 1,258 | 1,515 | 628
Michigan | 4,089,794 | 744,313 | 4,780 | 5,758 | 2,389
Minnesota | 2,322,190 | 770,742 | 4,590 | 5,529 | 2,294
Missouri | 2,521,832 | 247,279 | 2,417 | 2,911 | 1,208
Mississippi | 1,148,340 | 125,572 | 1,607 | 1,935 | 803
Montana | 464,072 | 44,189 | 1,172 | 1,412 | 586
North Carolina | 4,299,266 | 355,246 | 3,199 | 3,854 | 1,599
North Dakota | 331,481 | 75,768 | 1,172 | 1,412 | 586
Nebraska | 803,157 | 78,942 | 1,767 | 2,129 | 883
New Hampshire | 557,220 | 74,395 | 1,665 | 2,006 | 832
New Jersey | 3,516,978 | 148,861 | 3,332 | 4,014 | 1,665
New Mexico | 848,218 | 27,535 | 1,172 | 1,412 | 586
Nevada | 1,198,356 | 40,836 | 1,172 | 1,412 | 586
New York | 7,774,308 | 391,755 | 4,872 | 5,869 | 2,435
Ohio | 4,878,206 | 642,037 | 4,494 | 5,414 | 2,246
Oklahoma | 1,573,180 | 184,330 | 1,896 | 2,284 | 948
Oregon | 1,726,340 | 146,883 | 2,183 | 2,630 | 1,091
Pennsylvania | 5,294,065 | 315,204 | 3,746 | 4,512 | 1,872
Rhode Island | 446,688 | 35,383 | 1,172 | 1,412 | 586
South Carolina | 2,136,080 | 327,342 | 2,457 | 2,960 | 1,228
South Dakota | 368,300 | 58,945 | 1,172 | 1,412 | 586
Tennessee | 2,846,684 | 235,934 | 2,561 | 3,085 | 1,280
Texas | 11,087,708 | 569,117 | 5,106 | 6,151 | 2,552
Utah | 1,129,660 | 67,936 | 1,521 | 1,832 | 760
Virginia | 3,380,607 | 237,740 | 2,956 | 3,561 | 1,477
Vermont | 277,090 | 29,190 | 1,172 | 1,412 | 586
Washington | 3,079,953 | 243,722 | 3,015 | 3,632 | 1,507
Wisconsin | 2,491,121 | 546,322 | 3,500 | 4,216 | 1,749
West Virginia | 736,341 | 45,503 | 1,172 | 1,412 | 586
Wyoming | 243,321 | 24,354 | 1,172 | 1,412 | 586
Total U.S. | 129,870,928 | 11,410,205 | 128,066 | 154,276 | 64,000

Expected Response Rates

The Exposure Survey will be conducted with the registration list, ABS, and nonprobability samples, with the survey approach designed to maximize response rates for each sample. Response rates for surveys continue to decline, and the Bureau of Labor Statistics has highlighted how the decrease in response rates for government surveys has accelerated since the COVID-19 pandemic.1 With this in mind, we expect a response rate of about 27% nationally for the list frame and about 12.5% for the ABS sample, for a combined national response rate of about 18.5% (AAPOR RR3). In comparison, the 2018 NRBSS had response rates of 32.4% for the list frame, 15.1% for the ABS sample, and 21.9% overall. The response rate is not a widely accepted metric for nonprobability samples given that people are not randomly sampled to complete a survey. Thus, a response rate will not be calculated for the nonprobability sample.
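The combined figure can be approximated by weighting each frame’s expected rate by its national sample size from Exhibit 1; the sketch below shows the arithmetic. This simple weighted average comes out near 19%, slightly above the stated 18.5%, which presumably reflects AAPOR RR3 eligibility assumptions not modeled here.

```python
# Back-of-envelope check: sample-size-weighted combination of the expected
# list-frame and ABS response rates, using the Exhibit 1 national totals.
list_n, abs_n = 128_066, 154_276
list_rr, abs_rr = 0.27, 0.125

combined = (list_n * list_rr + abs_n * abs_rr) / (list_n + abs_n)
print(f"{combined:.1%}")  # 19.1% under this simple average
```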

The Universe of Boating Participants

A Definition of the Universe

The second universe of interest to the NRBSS is boating participants. The base for the universe of boating participants is the U.S. household population. It is a goal of the survey to determine the proportion of Americans who have participated in recreational boating during the reference year. A boating participant is defined as someone who has spent time on a recreational boat, docked or on the water, during the reference period.

Sample Source

The sample source for the Participant Survey is the AmeriSpeak Panel, NORC’s probability-based panel that is representative of households in all 50 states and the District of Columbia. During the initial recruitment phase of the panel, randomly selected U.S. households were sampled with a known, non-zero probability of selection from the NORC National Sample Frame and then contacted by U.S. mail, email, telephone, and field interviewers (face-to-face). The panel provides sample coverage of approximately 97 percent of the U.S. household population. Those excluded from the sample include people with P.O. Box-only addresses, some addresses not listed in the USPS Delivery Sequence File, and some newly constructed dwellings.

AmeriSpeak is used by several federal agencies, including the Centers for Disease Control and Prevention, the Department of Defense, the Internal Revenue Service, the U.S. Fish and Wildlife Service, and the Office of the Assistant Secretary for Public Affairs in the Department of Health and Human Services to provide nationally representative data to inform their work.

Sampling and Respondent Selection Methods

Approximately 10,700 panel members will be randomly drawn from the AmeriSpeak Panel and invited to answer for their household. The sample will be stratified by age, sex, race/ethnicity, and education. Historical panelist completion rates will be used to select a sample within each stratum such that the selected sample is representative of the population. For households with two or more panelists, one panelist will be randomly selected for the survey.
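A minimal sketch of the completion-rate-adjusted selection described above: invitations per stratum are scaled by historical completion rates so that expected completes match population shares. The strata, shares, and rates shown are hypothetical.

```python
# Invite target_completes * share / completion_rate panelists per stratum
# so that expected completes mirror the population distribution.
# Strata, shares, and completion rates are hypothetical.
target_completes = 8000
strata = {
    # stratum: (population share, historical completion rate)
    "18-29":   (0.20, 0.55),
    "30-59":   (0.50, 0.70),
    "60 plus": (0.30, 0.80),
}

for name, (share, rate) in strata.items():
    invites = round(target_completes * share / rate)
    expected = round(target_completes * share)
    print(f"{name}: invite {invites} for ~{expected} expected completes")
```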

Population and Sample Sizes

The population for the boating Participant Survey is all children and adults in the United States. To survey the population, we will conduct household interviews with AmeriSpeak panelists.

Expected Response Rates

Due to the extensive panel recruitment process, AmeriSpeak surveys achieve the highest AAPOR response rate (AAPOR RR3) of any multi-client panel solution on the market (Dennis 2019). Recruitment is a two-stage process: initial recruitment using less expensive methods and then nonresponse follow-up using personal interviewers. For the initial recruitment, sample units are invited to join AmeriSpeak online by visiting the panel website AmeriSpeak.org or by telephone (in-bound/outbound supported). The second-stage nonresponse follow-up targets a stratified random subsample of the non-responders from the initial recruitment. Units sampled for the nonresponse follow-up are sent a new recruitment package by FedEx with an enhanced incentive offer.

Field interviewers then make personal, face-to-face visits to the respondents’ homes to encourage participation. Field interviewers administer the recruitment survey in-person or encourage the respondents to register at AmeriSpeak.org or call the toll-free AmeriSpeak telephone number to register. This face-to-face nonresponse follow-up increases the recruitment rate and measurably reduces biases in the AmeriSpeak Panel compared to its peers.

Panel retention also contributes to the overall response rate. NORC maintains strict rules to limit respondent burden and reduce the risk of panel fatigue. On average, panelists participate in AmeriSpeak web-based or phone-based studies two to four times a month. Because the risk of panel attrition increases with the fielding of poorly constructed survey questionnaires, the AmeriSpeak team works to create surveys that provide an appropriate user experience for panelists.

A properly calculated cumulative response rate for panel-based research considers all sources of nonresponse at each stage of the panel recruitment, management, and survey administration process. Using this approach, we estimate an overall national response rate of about 14% (AAPOR RR3) for the Participation Survey, compared to the 2018 response rate of 14.9%. The overall response rate assumes a panel recruitment rate of about 25%, a panel retention rate of 75%, and a survey completion rate of 75%.
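The stated figure is the product of the stage-level rates given above, as this short sketch verifies.

```python
# Cumulative response rate = recruitment rate x retention rate x completion rate
recruitment, retention, completion = 0.25, 0.75, 0.75
print(f"{recruitment * retention * completion:.1%}")  # 14.1%, i.e., about 14%
```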

2. Describe the procedures for the collection of information including:


      • Statistical methodology for stratification and sample selection,


      • Estimation procedure,


      • Degree of accuracy needed for the purpose described in the justification,


      • Unusual problems requiring specialized sampling procedures, and


      • Any use of periodic (less frequently than annual) data collection cycles to reduce burden.


Statistical methodology for stratification and sample selection

Exposure Survey Sample Processing and Management

The Exposure Survey will use three sample sources: (1) state registration lists, (2) ABS sample, and (3) nonprobability sample from online panels and lists of boat owners from boating organizations. The state registration list and ABS samples will use the same mail survey methodology described below. Nonprobability panels will use their respective panel’s contact methods, and boat owners identified through boating organization lists will be contacted as the respective organization allows.

NORC will obtain registration databases from states willing to share this information. To facilitate mailing, sample records obtained from the vendor will be provided in a standardized format containing the owner’s mailing address and vessel type, plus the owner’s name if allowed by the state. The lists will reflect the most accurate contact information available at the time and will have been updated via the National Change of Address (NCOA) database.

The ABS sample will be developed from the USPS Computerized Delivery Sequence file (CDS), including only city-style residential addresses and P.O. Box addresses flagged as Only Way to Get Mail (OWGM). Drop-delivery points and vacant addresses will be removed.

Once the samples are obtained, each record will be assigned a unique PIN that will be used to track all survey mailings and the record’s disposition during fielding.

Exposure Survey Document Preparation

NORC will format all survey documents, including initial mailings, text messages, survey instruments, cover letters, and reminders. The survey text will be carefully designed with clear, user-friendly instructions that encourage respondent cooperation and increase response rates. The paper survey instrument will be designed to accurately capture the data reported by respondents. Skip patterns will be clearly marked with explanatory text to guide the respondent to the next appropriate point in the survey. Questions will be numbered and sections marked to provide an intuitive path for the respondent.

Exposure Survey Data Collection Protocol

The Exposure Survey will employ a sequential, mixed-mode (web, phone, and paper questionnaire) data collection design in which list and ABS respondents will be invited to participate by web or inbound telephone first, then mailed self-administered paper questionnaires if they do not respond online or via phone. Survey materials will include an 800 number, which respondents can dial to complete the survey by telephone if they wish.

Respondents who choose to complete via the web will be directed to the study website housed on NORC’s secure servers. The website will include information about the study, how to access their PIN in case it is misplaced, and how to contact the study team with questions via a dedicated email inbox or dedicated toll-free project phone line. Respondents will be able to toggle the study website between English and Spanish depending on their preference. Usability testing will be done prior to fielding.

Potential Exposure Survey respondents in the list and ABS samples will receive up to four mailings and one text message inviting them to complete the survey. The five contacts will be: (1) letter with $1 bill; (2) text message; (3) reminder postcard; (4) paper survey or second letter; and (5) second reminder postcard.

The first contact will be the initial survey invitation mailed to all sampled addresses. This letter will encourage completion of the questionnaire via the web instrument, provide the web URL and unique access code, and offer a toll-free phone number to complete the survey or to have questions answered. We will use a #10 envelope with a visible window on the back to display the enclosed cash incentive ($1 bill), a technique shown to improve response rates (Dutwin et al., 2023; Sherr & Wells, 2021). The letter will also mention the $10 promised incentive for completing the survey, include a study-specific email address, and provide the URL for the study’s site.

The text message will be sent to sampled addresses about four days after the survey invitation letter is mailed. The message will include information about the study and a link to the web survey.

The reminder postcard will be sent about seven days after the invitation letter to all sampled addresses. It is mailed when response to the initial invitation begins to fall off, and it encourages non-responders to complete the questionnaire via the web or phone. The postcard will provide the PIN under a scratch-off so respondents can easily complete the survey online. It will also contain the toll-free number if they wish to complete the survey via telephone or if they need assistance.

The third mailing will be either a paper questionnaire or a second letter sent 18 days after the first reminder postcard. The paper questionnaire will be targeted to a subsample of non-responders identified with Big Data Classifier models as either members of hard-to-reach populations or those identified as most likely to complete the survey with a paper questionnaire. The paper questionnaire mailing will include a cover letter, the paper questionnaire, and a postage-paid business reply envelope for respondents to use when returning the completed questionnaire to NORC. Respondents who do not receive the paper survey will instead receive a second letter with language targeted toward encouraging them to complete the survey. This mailing will be about four weeks after the initial invitation letter.

Seven days after the third mailing, NORC will mail a second reminder postcard to all non-responding addresses. The postcard is mailed to encourage non-responders to complete the survey online, via phone, or with the paper questionnaire. It will contain their PIN, the toll-free number, and a project-specific email address so that respondents can contact NORC if they have questions, need assistance, or if they wish to complete the survey via telephone.
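The contact timeline described above can be summarized as a simple schedule. The sketch below encodes the five contacts with day offsets consistent with the stated intervals (text at day 4, first postcard at day 7, third mailing 18 days later, second postcard 7 days after that); the start date and structure are illustrative, not a specification of NORC’s case management system.

```python
# Sketch of the five-contact schedule as a data structure; offsets in days
# follow the timing described in the text above.
from datetime import date, timedelta

CONTACTS = [
    (0,  "invitation letter with $1 bill"),
    (4,  "text message with web survey link"),
    (7,  "first reminder postcard"),
    (25, "paper questionnaire or second letter"),  # 18 days after the postcard
    (32, "second reminder postcard"),              # 7 days after the third mailing
]

def schedule(start: date):
    """Return (date, contact) pairs for a given field start date."""
    return [(start + timedelta(days=offset), contact) for offset, contact in CONTACTS]

for when, contact in schedule(date(2026, 3, 2)):  # hypothetical start date
    print(when.isoformat(), "-", contact)
```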

Exposure Survey Return Mail Processing and Data Entry

NORC clerks will promptly receipt returned mail questionnaires to ensure that the case management information is updated in a timely manner. This will ensure that no additional follow-up attempts take place for respondents who have already completed the survey for that round. Once receipted, completed paper questionnaires will be securely transported to our Computer Assisted Data Entry (CADE) vendor, Data Shop Incorporated (DSI), to be data entered. NORC has a longstanding relationship with DSI as the primary CADE vendor for many large data collection projects, and their double-entry technique has proven to be highly accurate. Data files containing the entered survey data will be transmitted back to NORC via a secure file transfer protocol, and the paper questionnaires will be transported back to NORC for secure storage and eventual destruction.

NORC clerks will also receipt all project mail that is returned undeliverable and enter this information into the case management system to prevent continued mail attempts to undeliverable addresses.

Exposure Survey Data Collection Tracking System

NORC will store and track sample information, dispositions, and survey data allowing for study progress reports throughout data collection.

Participation Survey Data Collection

As described in Section 1, approximately 10,700 panel members will be randomly drawn from the AmeriSpeak Panel, stratified by age, sex, race/ethnicity, and education, and invited to answer for their household. Historical panelist completion rates will be used to select a representative sample within each stratum, and for households with two or more panelists, one panelist will be randomly selected for the survey.

Information regarding boating participation will be collected via the AmeriSpeak Panel. Panelists can complete each survey either online or over the phone with an interviewer and are invited to complete a new survey via email and their member portal. Recruited panelists will be contacted on a four-month schedule for a period of 12 months.

The approach for the Participation Survey will be designed to maximize cooperation and boost the overall response rate. Similar to other AmeriSpeak studies, panelists will receive up to seven emails, texts, and/or phone reminders to complete the survey and respondents will be offered an incentive to complete the survey.

Estimation procedure

The Exposure Survey will be used to estimate boat ownership incidence and boat exposure at the state level. The estimates of boat ownership incidence for registered boats will be weighted to match state lists for registration by type of boat. Lists of unregistered boats are not available, and NORC will weight estimates of unregistered boat ownership incidence to benchmarks from statistical models that leverage 2026 NRBSS and prior rounds of NRBSS data, 2026 Participation Survey data, and commercial data. To estimate annual boating exposure at the state level, NORC will develop statistical models that incorporate respondents’ reports about both boating exposure for a particular month and number of months they boat. The models will be designed to account for seasonal variation in boating exposure in some states.

The Participation Survey will be used to estimate national boating incidence and demographic characteristics of boaters and boat owners. NORC will weight the data so the demographic makeup of respondents matches population benchmarks from the American Community Survey for characteristics such as age, sex, race, ethnicity, education, Census Division, and urbanicity. The data will also be weighted so the proportion of registered boat owners matches the benchmark from state registration lists and proportion of unregistered boat ownership matches the previously described model-based benchmarks.
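Weighting to multiple margins of this kind is commonly done by raking (iterative proportional fitting). The sketch below rakes toy respondent weights to two hypothetical benchmark distributions; it illustrates the mechanics only, and the actual variables, benchmarks, and software are those described in the text.

```python
# Minimal raking (iterative proportional fitting) sketch: adjust weights
# until weighted margins match benchmark shares. Toy data and benchmarks.
respondents = [
    {"age": "18-44",  "owner": "yes", "w": 1.0},
    {"age": "18-44",  "owner": "no",  "w": 1.0},
    {"age": "45plus", "owner": "yes", "w": 1.0},
    {"age": "45plus", "owner": "no",  "w": 1.0},
]
benchmarks = {
    "age":   {"18-44": 0.45, "45plus": 0.55},
    "owner": {"yes": 0.12, "no": 0.88},
}

for _ in range(50):  # iterate until the margins stabilize
    for var, targets in benchmarks.items():
        for level, share in targets.items():
            total = sum(r["w"] for r in respondents)
            current = sum(r["w"] for r in respondents if r[var] == level)
            factor = share * total / current
            for r in respondents:
                if r[var] == level:
                    r["w"] *= factor

for r in respondents:
    print(r)
```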

Degree of accuracy needed for purpose described in justification

The Exposure Survey will be completed by approximately 30,000 registered boat owners via the state registration lists, 4,600 via the ABS sample, and 7,800 via nonprobability panels. This design will allow for boat ownership estimates by boat type and exposure estimates at the state level. As described in Section 1, state sample sizes were calculated with the goal of a median Coefficient of Variation (CV) of 0.11 across all states: sample was first allocated proportionally across the 50 states and Washington, DC, and then state allocations were adjusted to achieve the target CV, with a minimum of 1,000 and a maximum of 5,500 interviews expected per state. This design has an overall margin of error of 2.4 percentage points at the 95 percent confidence level, including the design effect.


The Participation Survey will collect data using the AmeriSpeak Panel and ask questions of the same panelists three times over the course of a year. It is expected that approximately 8,000 AmeriSpeak panelists will complete at least two of the three waves of the Participation Survey. This design has an overall margin of error of 1.1 percentage points at the 95 percent confidence level, including the design effect.
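Margins of error of this form follow the standard formula MOE = z * sqrt(deff * p(1-p)/n), widest at p = 0.5. The sketch below shows the calculation; the design effects are not stated in the text, so the value used here is an assumption for illustration.

```python
import math

# MOE = z * sqrt(deff * p * (1 - p) / n), evaluated at p = 0.5.
def moe(n, deff=1.0, p=0.5, z=1.96):
    return z * math.sqrt(deff * p * (1 - p) / n)

# Participation Survey: ~8,000 completes. A design effect near 1.0 (assumed,
# not stated in the text) reproduces the stated 1.1-percentage-point margin.
print(f"{moe(8000, deff=1.0):.2%}")  # 1.10%
```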


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


We have planned the data collection frequency to keep respondent burden at the lowest possible level while mitigating potential nonresponse bias. To maximize response rates, we have designed survey systems to obtain every possible response. To address the problem of non-response, we will weight the data to be representative of the target population and undertake a nonresponse bias analysis.

Increases in the number of survey noncontacts and refusals have led to a decline in response rates, regardless of survey mode. However, survey nonresponse in and of itself is not necessarily a problem; it becomes problematic when the propensity to respond to either the survey in general (unit nonresponse) or particular survey questions (item nonresponse) is correlated with the variables of interest. NORC will implement a comprehensive adaptive survey design (ASD) strategy aimed at reducing potential nonresponse bias by improving the balance of the respondent sample.

For the Exposure Survey, the design will minimize nonresponse bias in several ways. We want to hear from everyone, not just boat owners, to get an accurate understanding of how many people own, rent, and borrow a boat. However, because people are more likely to respond to a survey about topics in which they are interested, people who do not own a boat are less likely to respond to a boating survey. The Exposure Survey will therefore be branded to ABS respondents as a survey about public waterways to cast the widest net of interest, reducing nonresponse bias. Similarly, nonprobability respondents will receive their respective panels’ standard contact language. Doing so will make the request look like all others they receive, reducing nonresponse bias due to disinterest in boating.

Commercial data and Census block group data will also be appended to the ABS sample to assess whether likely boat ownership and Census block group characteristics are predictive of survey response. Nonresponse-adjusted weights will incorporate commercial data along with Census tract-level data to adjust for survey nonresponse.
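One common way to use such appended data is an inverse-propensity nonresponse adjustment: fit a response model on frame covariates, then divide design weights by the predicted response propensities. The sketch below illustrates this with simulated data; the covariates, model choice, and simulated response process are assumptions for illustration, not the study’s specification.

```python
# Sketch of inverse-propensity nonresponse adjustment with simulated data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 5000
X = np.column_stack([
    rng.integers(0, 2, n),   # flagged likely boat owner (commercial data, hypothetical)
    rng.normal(0, 1, n),     # block-group characteristic (standardized, hypothetical)
])
# Toy response process: likely boat owners respond at a higher rate.
responded = rng.random(n) < (0.10 + 0.10 * X[:, 0])

model = LogisticRegression().fit(X, responded)
propensity = model.predict_proba(X)[:, 1]

base_weight = 1.0  # placeholder for the design weight
nr_weight = base_weight / propensity[responded]
print("mean adjustment factor:", nr_weight.mean().round(2))
```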

The Exposure Survey is designed to understand boat ownership of registered and unregistered boats, but owners of kayaks, canoes and other unregistered/nonmotorized craft may not view themselves as boat owners. Because lists of unregistered craft are not available, NORC will build statistical models to predict nonmotorized boat prevalence using previously collected data, participation data, and commercial data to adjust via weighting for potential nonresponse bias on unregistered boat ownership measures.

For the Participation Survey, the AmeriSpeak panel is an ideal sample to help mitigate nonresponse error because it is less susceptible to avidity bias than an address-based sample when measuring participation in boating activities. AmeriSpeak panelists do surveys on a variety of topics, and interest in boating is less likely to affect their willingness to do the survey compared to respondents recruited specifically for a survey on boating. When respondents are recruited specifically for a survey on boating, people interested in boating could be more likely and willing to respond than people not interested in boating, and this nonresponse error could reduce the reliability of the estimates. To further reduce avidity bias, panelists will receive the standard AmeriSpeak invitation language, making the request look like all others rather than identifying it as a boating survey for the USCG.

The AmeriSpeak panel also provides significant coverage of hard-to-reach populations, such as people in rural areas, younger adults, adults with less formal education, and people of color, who are often underrepresented in samples.

Sampled boat owners and ABS addresses will receive a $1 incentive in the first letter inviting them to participate in the Exposure Survey. They will also receive a $10 completion incentive. AmeriSpeak panelists will be offered a $5 incentive for completing each of the three Participation Survey requests. Research and best practices consistently show that incentives significantly increase response rates, which in turn can help increase data quality and reduce concerns about bias (Dillman et al., 2014; Stanley et al., 2020).

To provide respondents with maximum flexibility for participation, the Exposure Survey instrument will be available to respondents in web, phone, and paper formats. Incorporating an option to take the survey via phone, which was not available in the 2018 NRBSS, will boost response rates by allowing households who either cannot or prefer not to take the survey online to participate. NORC has found that offering respondents the chance to do a survey on the phone is important, and about 5%-10% of completes from similar ABS studies have come via the phone. Also, in NORC’s experience, offering inbound phone helps increase participation among ethnic minorities as well as older and less-educated adults who may not be willing to complete the survey via web or paper, thus decreasing nonresponse bias. Paper surveys will also be sent to hard-to-reach groups and those most likely to respond to a mail survey (identified through Big Data Classifiers) to maximize response and again reduce nonresponse bias. The Participation Survey will be offered to AmeriSpeak panelists on both the web and phone, the modes in which they complete all AmeriSpeak surveys.

Multiple contacts will be made with each respondent of both surveys to increase survey participation. Four mailings will be sent to each list and ABS Exposure Survey respondent, along with a text message, while AmeriSpeak panelists will receive up to seven contacts for the Participation Survey. NORC and other organizations have found that three or four mailings provide an optimal combination of a strong response rate with minimal potential response bias and a large sample size with low cost per interview. Sending a text message to list and ABS Exposure Survey respondents can help boost response rates and minimize nonresponse bias by providing another mode to invite people to complete the survey.

Exposure Survey invitation letters will also be enhanced to increase response through the use of envelopes with peekaboo windows in the back that make the $1 pre-incentive visible to respondents. Having the incentive visible increases the likelihood of people opening the letter and has been shown to lead to a higher response rate than standard envelopes (Dutwin et al., 2023; Sherr & Wells, 2021).

4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


NORC will conduct a small pilot study during the summer of 2025. The pilot will test both the Participation and Exposure instruments. For the Participation Survey, respondents will be asked to recall their boating activities for the past four months. For the Exposure Survey, respondents will be asked to recall their activities in the previous month. This timing mimics the realistic timing of both surveys.

The Participation Survey pilot will feature about 500 interviews with AmeriSpeak panelists. The Exposure Survey pilot will feature 1,500 interviews, with 500 interviews coming from the registered boater list sample, 500 from the ABS sample, and 500 from the nonprobability sample. The list and ABS pilot samples for the Exposure Survey will receive a postcard, letter, or text message inviting them to complete the survey online or by calling an 800 number to do the survey with a live interviewer. The large sample size of the pilot test will allow us to assess the effectiveness of the instrument across modes and with all types of respondents and demographic groups. Given the low incidence of some activities relevant to the study, a large sample size will ensure that a sufficient number of respondents will be eligible to complete the full interview.

A primary goal of the Pilot Study is to assess the functionality of the questionnaires prior to fielding the main Participation and Exposure Surveys in 2026. The Pilot Study also provides an opportunity to do a first test and further refinement of our protocols related to sampling and design, data collection, respondent and data protection, disclosure review, data monitoring, and data delivery. Following the conclusion of the Pilot Study, we will review each of these protocols, as well as the results of the survey itself, to evaluate how well they worked and what final adjustments need to be made prior to the main data collection.

5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Nadarajasundaram Ganesh, Ph.D.

NORC at the University of Chicago

55 East Monroe Street

30th Floor

Chicago, IL 60603

ganesh-nada@norc.org

(301) 775-0289

David Sterrett, Ph.D.

NORC at the University of Chicago

55 East Monroe Street

30th Floor

Chicago, IL 60603

sterrett-david@norc.org

(312) 350-2848


References

Dennis, M. (2019). Technical overview of the AmeriSpeak panel—NORC’s probability-based research panel. NORC at the University of Chicago White Paper.

Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, phone, mail, and mixed-mode surveys: The tailored design method. John Wiley & Sons.

Dutwin, D., Bilgen, I., Hendarwan, E., & Roopam, S. (2023). Peekaboo! The effect of different visible cash display and amount options during mail contact when recruiting to a probability-based panel. Journal of Survey Statistics and Methodology.

Harter, R., Battaglia, M. P., Buskirk, T. D., Dillman, D. A., English, N., Fahimi, M., Frankel, M. R., Kennel, T., McMichael, J. P., McPhee, C. B., Montaquila, J., Yancey, T., & Zukerberg, A. L. (2016). Address-based sampling. Prepared for AAPOR Council by the Task Force on Address-Based Sampling, operating under the auspices of the AAPOR Standards Committee. Oakbrook Terrace, IL. Available at http://www.aapor.org/getattachment/Education-Resources/Reports/AAPOR_Report_1_7_16_CLEAN-COPY-FINAL-(2).pdf.aspx.

Sherr, S., & Wells, B. (2021). What you see is what you get: Evaluating the use of visible incentives in the California Health Interview Survey. AAPOR Conference.

Stanley, M., Roycroft, J., Amaya, A., Dever, J. A., & Srivastav, A. (2020). The effectiveness of incentives on completion rates, data quality, and nonresponse bias in a probability-based internet panel survey. Field Methods, 32(2), 159-179.


1 https://www.bls.gov/blog/2023/what-is-bls-doing-to-maintain-data-quality-as-response-rates-decline.htm



