
SUPPORTING STATEMENT B FOR

NATIONAL HOUSEHOLD SURVEY ON DISASTER PREPAREDNESS

OMB Control No.: 1660-0105

COLLECTION INSTRUMENT(S): FF-008-FY-21-103 (formerly 008-0-15);
FF-008-FY-21-104


B. Collection of Information Employing Statistical Methods


The agency should be prepared to justify its decision not to use statistical methods in any case where such methods might reduce burden or improve accuracy of results. When Item 17 on the Form OMB 83-I is checked "Yes", the following documentation should be included in the Supporting Statement to the extent that it applies to the methods proposed:


  1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


The target population for the total survey sample is the non-institutionalized U.S. adult population (aged 18 years and older). According to the U.S. Census Bureau’s 2022 American Community Survey 5-Year Estimates, the U.S. population aged 18 years and over is about 260 million people. However, the survey sample will not include adults in penal or mental institutions, or other institutionalized adults. Additionally, the sample will be limited to adults who communicate in English or Spanish well enough to complete the survey.


The sample will consist of at least 7,500 completed surveys each time it is fielded.


The sample will be composed of geographic strata based on the U.S. Census divisions. Response quotas for each stratum will be proportional to the adult population of that stratum. However, certain demographic groups may be oversampled to ensure that sufficient responses are collected from minority populations.


The survey will be conducted primarily using web-based/online surveys. However, telephone surveys may be used for respondents who are unable to complete a web-based/online survey. The mix of survey respondents by survey modality may vary from year to year.


Up through 2020, the survey was implemented by telephone only (both landlines and cell phones), and the 2020 survey response rate was less than 1%. In 2021 and 2022, the survey was implemented using a mixed-mode approach of telephone-based (landlines and cell phones) and web-based surveys. Starting in 2023, the survey was implemented using a web-based survey only and achieved a survey completion rate of about 18%. However, this completion rate compares the number of completed surveys to the number of survey panelists contacted to take the survey; it does not account for the number of individuals contacted to join the survey panel, which, if accounted for, would yield an effective response rate much lower than 18%.


2. Describe the procedures for the collection of information including:


  a. Statistical methodology for stratification and sample selection,


Sampling Frame


Potential survey participants will be selected from a sampling frame made up of probability-based online panels, using a web-based application programming interface. Panelists will have been recruited using a probability-based method (randomly selected using Address-Based Sampling and/or Random Digit Dialing from a nationwide sampling frame with near universal coverage of the U.S. population). When selected to complete specific surveys, most panelists will complete surveys online using their personal device (computer, tablet, mobile phone, etc.). Panelists who do not have a personal device or lack internet connectivity will either be given an internet-connected device to use for completing the surveys or will be given the option to complete the surveys by telephone.


Sample Selection


Invitations to complete this survey will be sent to panelists who have previously indicated they are residents of the United States. Residency will be reconfirmed to ensure they are still residents of the United States, along with location information, so the data can be properly weighted based on proportional population breakouts.


Geographic Stratification


To ensure sufficient national representation, a proportional geographic stratification approach will be implemented based on U.S. Census Bureau divisions. The sample allocation across the strata will be proportional to the population size of each stratum. Using proportional sample allocation, a target will be set for the number of surveys to be completed in each stratum. In addition, proportional representation will be sought within each stratum, when possible, for the following demographic variables: age, gender, ethnicity, disability, education, income, race, homeownership, and geographic division. A random sampling approach will be implemented within each stratum to select participants. This may include simple random sampling, where each individual in the stratum has an equal chance of being selected, or systematic sampling, where every nth individual is selected from a list.
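
For illustration, the following sketch shows how proportional allocation across Census division strata might be computed, together with a simple systematic selection helper. The division population figures are placeholders for illustration only, not actual American Community Survey estimates.

```python
# Sketch: proportional sample allocation across U.S. Census divisions.
# The adult population figures are placeholders (illustrative only),
# not actual American Community Survey estimates.

TOTAL_COMPLETES = 7_500

division_adults = {            # division -> adult population (placeholder)
    "New England":        11_800_000,
    "Middle Atlantic":    32_500_000,
    "East North Central": 36_300_000,
    "West North Central": 16_500_000,
    "South Atlantic":     52_000_000,
    "East South Central": 15_100_000,
    "West South Central": 31_000_000,
    "Mountain":           19_300_000,
    "Pacific":            41_500_000,
}

total_adults = sum(division_adults.values())

# Proportional allocation: each stratum's quota equals its share of the adult
# population. Rounding can leave the total a few completes off target, so a
# final largest-remainder adjustment may be needed.
quotas = {
    div: round(TOTAL_COMPLETES * pop / total_adults)
    for div, pop in division_adults.items()
}

def systematic_sample(frame, n, start=0):
    """Select roughly every k-th panelist from a stratum's frame (systematic sampling)."""
    if n <= 0 or not frame:
        return []
    k = max(len(frame) // n, 1)
    return frame[start::k][:n]

if __name__ == "__main__":
    for div, q in quotas.items():
        print(f"{div:>20}: {q} completes")
    print(f"{'total':>20}: {sum(quotas.values())}")
```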


  b. Estimation procedure,


Post-Stratification Weighting


We will apply post-stratification weighting to adjust for the following demographic variables when possible: age, gender, ethnicity, disability, education, income, race, homeownership, and geographic division. The U.S. Census Bureau division proportions for these demographics serve as the targets for the raking procedure that generates the post-stratification weights.
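
As an illustration of the raking step, the sketch below implements a minimal iterative proportional fitting routine. The variables and target proportions are placeholders rather than actual Census benchmarks, and production weighting would be performed with established survey software.

```python
import numpy as np

def rake_weights(sample, margins, max_iter=50, tol=1e-6):
    """Raking (iterative proportional fitting) of survey weights.

    sample  : list of dicts, one per respondent, holding categorical variables,
              e.g. {"gender": "F", "age_group": "18-34"}.
    margins : dict mapping variable -> {category: target proportion}, i.e. the
              population benchmarks the weights should reproduce.
    Returns weights scaled to sum to the number of respondents.
    """
    w = np.ones(len(sample))
    for _ in range(max_iter):
        max_change = 0.0
        for var, targets in margins.items():
            w_total = w.sum()
            for cat, target_prop in targets.items():
                mask = np.array([r[var] == cat for r in sample])
                current = w[mask].sum() / w_total
                if current > 0:
                    factor = target_prop / current
                    w[mask] *= factor
                    max_change = max(max_change, abs(factor - 1.0))
        if max_change < tol:      # stop once all margins are (nearly) matched
            break
    return w * len(sample) / w.sum()

# Placeholder benchmarks and a tiny illustrative sample.
margins = {
    "gender": {"F": 0.51, "M": 0.49},
    "age_group": {"18-34": 0.30, "35-54": 0.33, "55+": 0.37},
}
sample = [
    {"gender": "F", "age_group": "18-34"},
    {"gender": "M", "age_group": "35-54"},
    {"gender": "F", "age_group": "55+"},
    {"gender": "M", "age_group": "55+"},
]
print(rake_weights(sample, margins))
```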


  c. Degree of accuracy needed for the purpose described in the justification,


We plan to complete about 7,500 surveys per administration. Survey estimates of unknown population parameters (for example, population proportions) based on a sample size of 7,500 will have a precision (margin of error) of about ±1.1 percentage points at the 95 percent confidence level. This assumes no design effect and uses the most conservative assumption that the unknown population proportion is around 50 percent. The margin of error (MOE) for estimating an unknown population proportion “P” at the 95 percent confidence level can be derived from the following formula:


MOE = 1.96 × √(P(1 − P) / n), where “n” is the sample size (i.e., the number of completed surveys).
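
As a quick check of the stated precision, the short calculation below reproduces the roughly ±1.1 percentage point margin of error for n = 7,500 under the conservative assumption P = 0.5 and no design effect (a sketch only).

```python
from math import sqrt

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of error for an estimated proportion p with sample size n
    at the 95 percent confidence level (no design effect)."""
    return z * sqrt(p * (1 - p) / n)

print(f"{margin_of_error(7_500):.4f}")  # ~0.0113, i.e. about ±1.1 percentage points
```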


The sampling error of estimates for this survey will be computed using statistical software (e.g., SPSS, SAS) that calculates standard errors of estimates by accounting for any complexity in the sample design and the resulting set of unequal sample weights.
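
To illustrate how unequal weights enter the sampling error calculation, the sketch below computes a weighted proportion and an approximate standard error using Kish's effective sample size; this is a simplified stand-in for the complex-design routines in packages such as SPSS or SAS, and the simulated data are purely illustrative.

```python
import numpy as np

def weighted_proportion_se(y, w):
    """Weighted proportion and approximate standard error, using Kish's
    effective sample size to account for unequal weights.

    y : array of 0/1 indicators for the characteristic of interest
    w : array of post-stratification weights
    """
    p_hat = np.sum(w * y) / np.sum(w)          # weighted proportion estimate
    n_eff = np.sum(w) ** 2 / np.sum(w ** 2)    # Kish effective sample size
    se = np.sqrt(p_hat * (1 - p_hat) / n_eff)  # approximate standard error
    return p_hat, se

# Illustrative data: 7,500 simulated responses with mildly unequal weights.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=7_500).astype(float)
w = rng.uniform(0.5, 2.0, size=7_500)
p_hat, se = weighted_proportion_se(y, w)
print(f"estimate: {p_hat:.3f}   SE: {se:.4f}   95% MOE: {1.96 * se:.4f}")
```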


  d. Unusual problems requiring specialized sampling procedures, and


Unusual problems requiring specialized sampling procedures are not anticipated at this time. If response rates fall below expected levels, additional sample will be released to reach the targeted number of completed surveys. However, because all necessary steps to maximize response rates will be taken throughout the data collection period, such situations are not anticipated.


  e. Any use of periodic (less frequently than annual) data collection cycles to reduce burden.


In the past, the data collection has occurred on an annual basis. Going forward, the data collection will occur every other year to reduce the burden and annualized costs.

3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


Maximize Response Rates. The survey will utilize standard techniques to maximize response rates. Potential respondents will receive a pre-notification letter detailing the purpose of the survey, followed by an email survey notification containing an embedded hyperlink to the survey. The survey questionnaire will provide clear instructions for completing the survey and reiterate the purpose of the survey and how results will be used. The survey will be open for a period of six to eight weeks to allow sufficient time for response. Reminder email notifications will be sent to non-respondents each week, providing the survey hyperlink and encouraging response. The survey will allow respondents to pause and resume where they left off. A web-based graphical user interface will be used to facilitate navigation and completion of the survey across multiple devices. Web-based participants may receive gift cards, or points toward a gift card, for participation in the online panels to which they are recruited.


Issues of Non-Response. Survey-based estimates for this study will be weighted to minimize any potential bias, including any bias that may be associated with unit-level non-response. All estimates will be weighted to reduce bias, and it will be possible to calculate the sampling error associated with any subgroup estimate in order to ensure that the accuracy and reliability are adequate for the intended uses of any such estimate. In addition, because web-based surveys can mandate responses to every question, item-level non-response can be largely eliminated.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


This survey has been conducted annually since 2013 and continually revised to enhance the user-friendliness, reliability, and utility of the questionnaires. Pilot tests were conducted previously to check for correct skip patterns and procedures, as well as comprehensibility of questions. For each new round of data collection, the newly programmed survey undergoes quality checks for skip logic adherence, confirmation of proper data capture, etc. No additional tests are planned for this survey aside from the programming tests/quality checks immediately prior to data collection.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Deloitte Consulting LLP has been contracted by FEMA's Individual and Community Preparedness Division to conduct this survey data collection and analyze the results. The individuals consulted on the statistical aspects of the design are:


Andrew Burrows

Preparedness Behavior Change Branch Chief

Federal Emergency Management Agency

(202) 716-0527

andrew.burrows@fema.dhs.gov


Kelly Walter

Program Manager

Deloitte Consulting

1919 N Lynn St.

Arlington, VA 22209

(571) 777-4970

kewalter@deloitte.com


Marc Penz

System Administrator

Zogby Analytics

901 Broad Street Suite 307

Utica, NY 13501

marc@zogbyanalytics.com

