Supporting Statement


Improving Customer Experience (OMB Circular A-11, Section 280 Implementation) for the Department of Labor (DOL)


OMB: 1225-0093


Rev. 4/4/2025

Request for Approval under the “Generic Clearance for Improving Customer Experience: OMB Circular A-11, Section 280 Implementation”

(OMB Control Number: 1225-0093)


TITLE OF INFORMATION COLLECTION: OWCP FECA Program Customer Experience Survey



TYPE OF ACTIVITY


Select all that apply.


Screener (for example, distributed before or during a usability testing session or other kind of session)

Question script for interview, focus group, discussion group, etc. Scripts for usability testing sessions are exempt from PRA review.

Survey to obtain feedback immediately following a transaction - limited to 15 questions and a maximum of 5 minutes of burden.

Other survey


TYPE OF SUBMISSION


New collection: Select this if you do not already have an approved collection.

Change to an already approved collection: Select this if you have an already approved collection but are making changes to the questions, the number of respondents, etc.


Please attach your instrument as a separate document using the Word template that pertains to your particular engagement, i.e., either a survey/screener or script. OMB has prepared special templates that must be used for submitting your instrument. Please ask your agency’s PRA officer for these templates. Do not paste the question list within this document. As a reminder, usability testing session scripts are exempt from PRA review, but screeners for those sessions must be submitted for PRA review.


PURPOSE OF COLLECTION:

What are you hoping to learn / improve? How do you plan to use what you learn to support service delivery improvement activities? Are there artifacts (user personas, journey maps, digital roadmaps, summary of customer insights to inform service improvements, performance dashboards) the data from this collection will inform?

We are hoping to learn more about and improve the experience of filing a claim using our electronic claim system (ECOMP). The survey asks claimants about their experience directly after using this system to file a claim. Survey results are analyzed on a monthly basis, and periodic recommendations are made for programmatic improvements. Satisfaction scores are reported agency-wide on a quarterly basis, and this data is also used for quarterly OMB reporting.


ACTIVITY DETAILS

  1. If this is a survey for a High Impact Service Provider, will the results of this survey be reported to OMB as part of quarterly reporting obligations?

Yes

No

Not applicable


  2. How will you collect the information? (Check all that apply)

Social Media

Website

Telephone

In-person

Video conference (e.g., Zoom, WebEx, Teams, etc.)

Mail

Other, Explain: Electronic portal


  3. If this is a survey OR a screener, what platform will be used (e.g., Medallia, Microsoft Forms, Qualtrics, SurveyMonkey, Touchpoints, etc.)?

    Qualtrics


  4. Who will you collect the information from?


Please describe exactly how you will select the people who you will invite to take the screener or survey, or to participate in the session. Some examples:


  • If you are requesting approval for a survey that will appear as a “feedback button” on a website, you can write: “100% of people who visit the website will have the opportunity to take the survey, since the invitation appears on each page of the website.”

  • If you are requesting approval for a focus group to conduct discovery phase research, you can write: “Of the people who provided their email address to the call center rep, 15% will be selected at random to be invited to take part in the focus group.”


These are just examples. The key is that you must describe the exact methodology you will use to determine who gets the invitation to take part in the screener/survey/session.


100% of customers who file a CA-1 or CA-2 form will be presented with the opportunity to complete a survey (via pop-up and redirect to Qualtrics) following claim submission.


  5. Please describe the activity or methodology

Describe the information collection activity – e.g., what happens when a person agrees to participate? Will facilitators or interviewers be used? What’s the format of the interview/focus group?

When a customer agrees to participate, they are redirected to the survey on the Qualtrics platform. They are presented with a brief introduction followed by 4 questions.


  • How will you ask a respondent to provide information?

(e.g., after an application is submitted online, the final screen will present the opportunity to provide feedback by presenting a link to a feedback form / an actual feedback form)


Customers will be presented with the opportunity to complete a survey (via pop-up and redirect to Qualtrics) following claim submission.


  • When will the activity happen?

Describe the time frame or number of events that will occur (e.g., “We will conduct focus groups on May 13, 14, 15 of XXXX (year)”; “We plan to conduct customer intercept interviews over the course of the summer of XXXX (year) at the field offices identified in response to #2 based on scheduling logistics concluding by Sept. 10th”, or “This survey will remain on our website in alignment with the timing of the overall clearance.” If you are uncertain as to how long it will take to complete your research, you can write: “Until all participants complete/are interviewed.”)


This survey will remain on our portal in alignment with the timing of the overall clearance.

  6. Is an incentive (e.g., money or reimbursement of expenses, token of appreciation) provided to participants?
    Note that incentives are only justified when it would be otherwise impossible to recruit an adequate sample for your research and can only be used when your agency’s legal office has approved their use for a given engagement.

Yes

No


If yes, please describe the amount and the type of reimbursement per participant below:

     



Personally Identifiable Information: The following three questions are asked to determine whether your instrument triggers the Privacy Act of 1974 and, if so, whether you have an applicable System of Records Notice (SORN) published. If your answers to questions 1 and 2 are both “yes,” that means your instrument triggers the Privacy Act and you must have a SORN published.


1. Do any of the questions invite respondents to provide personally identifiable information (PII) including name, email address, phone number, etc. OR are you able to link survey responses with email addresses/names/other personal identifiers?

Yes

No


2. If yes to the question immediately above, will this PII be included in a system of records?
A system of records is defined by the Privacy Act of 1974 as a group of any records under the control of any agency from which information is retrieved by the name of the individual or by some identifying number, symbol, or other identifying particular assigned to the individual.

Yes

No

Not applicable


3. If you answered yes to both questions immediately above, then your instrument triggers the Privacy Act and you must have a SORN published for this collection. If you are unsure whether you have a SORN published for this collection, use a search engine to search [name of your agency] + “SORN,” and one of the top results should be a webpage that displays all SORNs at your agency. If you are still unsure, please consult your agency’s privacy team.


Do you have a SORN published that covers this collection?

Yes

No

Not applicable


BURDEN HOURS


IF YOU SELECTED “NEW COLLECTION” IN RESPONSE TO “TYPE OF SUBMISSION” ON PAGE 1:


Please fill in each applicable cell in the table below.


IF YOU SELECTED “CHANGE TO AN ALREADY APPROVED COLLECTION” IN RESPONSE TO “TYPE OF SUBMISSION” ON PAGE 1:


First, you must figure out whether your change will affect the number of respondents per year or the participation time for the instrument.


  • If the number of respondents and the participation time are both staying the same: input “1” for “No. of Respondents per year,” input “60” for “Participation Time in minutes,” and input “1” for “Total Burden per year in hrs.” This is because “1 hour” is the minimum that can be entered into ROCIS for total burden hours.

  • If the number of respondents or the participation time is changing: Contact the CX Desk Officer before completing this table.


Type of Instrument | No. of Respondents per year | Participation Time in minutes | Total Burden per year in hrs
Screener (e.g., distributed before or during a usability testing session or other kind of session) | | |
Question script for focus group, interview group, etc. Scripts for usability testing sessions are exempt from PRA review. | | |
Survey to obtain feedback immediately following a transaction (Respondent type: Individuals or Households) | 10,000 | 3 (.05 hours) | 500
Other survey | | |
Totals | 10,000 | | 500
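
The total burden figure follows directly from the respondent count and the per-response time reported above: 10,000 respondents × 3 minutes per response = 30,000 minutes ÷ 60 = 500 burden hours per year (equivalently, 3 minutes = .05 hours, and 10,000 × .05 hours = 500 hours).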


CERTIFICATION: Please read the certification carefully. If you incorrectly certify, the collection will be returned as improperly submitted or it will be disapproved.


I certify the following to be true:

  1. The collections are voluntary;

  2. The collections are low-burden for respondents (based on considerations of total burden hours or burden-hours per respondent) and are low-cost for both the respondents and the Federal Government;

  3. The collections are non-controversial;

  4. Any collection is targeted to the solicitation of opinions from respondents who have experience with the program or may have experience with the program in the near future;

  5. Personally identifiable information (PII) is collected only to the extent necessary and is not retained;

  6. Information gathered is intended to be used for general service improvement and program management purposes;

  7. The agency will follow the procedures specified in any relevant OMB guidance for the required reporting to OMB of data from surveys.

  8. Outside of the reporting mentioned in the bullet immediately above, if the agency intends to release journey maps, user personas, reports, or other data-related summaries stemming from this collection, the agency must include appropriate caveats around those summaries, noting that conclusions should not be generalized beyond the sample, considering the sample size and response rates. The agency must submit the data summary itself (e.g., the report) and the caveat language mentioned above to OMB before it releases them outside the agency. OMB will engage in a passback process with the agency.


Name of Person(s) who developed the screener/question script/survey: Elizabeth Ackerman

Email address of the above person(s): Ackerman.Elizabeth@dol.gov


