OMB Control No.: 2020-0031
Approval expires: 11/30/08

State Review Framework Evaluation Survey
Questions for State and Local Agencies
2/5/2021

Q1
Overarching Evaluation Question: Improvement of state program consistency:

1. Is it your perception that the SRF process has improved consistency in core enforcement activities across programs in your state? Across states in your region? Why?
2. Were your state’s policies, levels of activities and data completeness found to be consistent with national policies?
   - What changes or improvements were recommended? Were they made?
3. What other steps has your state taken to better align with national policy?

Q2
Overarching Evaluation Question: Improvement in consistency of EPA oversight of state programs:

1. Is it your perception that the EPA Region followed the SRF process?
2. Did the state feel that the report accurately reflected state performance?
3. Did the use of the 12 elements of the SRF as a standard for evaluating performance lead to EPA Regional oversight that was:
   - Fair?
   - Consistent across programs? Across states?
   - Does it promote a level playing field across states?
4. Did the state feel that there were subjective factors that interfered with the process?
5. Did the regional reviewers appear knowledgeable about the SRF? About national policies?

Q3
Overarching Evaluation Question: Collaboration between EPA and states in implementing the SRF:

1. Did the EPA Regional review team work collaboratively with your state to implement the review? Did they do the following in a collaborative manner and tone:
   - Conduct a preliminary conference call and other pre-review activities
   - Discuss schedule and process for the review
   - Discuss relevant documents and data to prepare for the on-site review
   - Conduct a preliminary data analysis and review of data metrics
   - Conduct an on-site review that included:
     - Entrance meeting
     - Close-out meeting
   - Share draft report (findings and recommendations)
   - Review and address state comments on draft report
   - Attach state’s final comments to the final report
   - Discuss a plan for implementing the recommendations
   - Negotiate plans for improvement into PPGs or other agreements
2. Did the report identify and recognize the state’s accomplishments and best practices?
3. Did you agree with the recommendations in the SRF report? If not, how were problems resolved? Was the process fair?
4. Do you feel that you have a better understanding of the federal enforcement program?
5. Do you feel that the Region has a better understanding of the state’s program?

Q4
Overarching Evaluation Question: Improvement in efficiency and effectiveness of SRF review process:

1. What components of the review were especially valuable or important? Which were not?
2. What steps or different approaches can be taken to improve the effectiveness and efficiency of the SRF?
3. What steps can be taken to reduce the cost of the reviews?
4. Did your state attend any SRF training? Did it help the state to understand the national policies, targets and goals that were used to gauge performance?
5. Was the SRF training beneficial? How could it be improved?

Q5
Overarching Evaluation Question: Value derived by states and locals from SRF review and approach:

1. What do you see as the value of the SRF reviews to your state’s enforcement program?
2. From your perspective, what were the advantages to the regions of conducting a consistent oversight system nationally?
3. What do you see as the value of the SRF to the national program?
4. What improvements could be made to enhance the value of the SRF reviews?
5. How would you like to see the information that is gathered through the SRF process used?

Q6
Overarching Evaluation Question: Use of differential oversight in future strategies. Differential oversight means applying different levels, types and degrees of oversight based on performance.

1. How can the results of the SRF review be used to effectively implement a differential oversight system?
2. What are the components of a differential oversight system that you would like to see implemented?
3. How would the levels of differential oversight be defined?
   - What should be the results or benefits of good performance? Current menu of benefits:
     - Reduce frequency of reviews up to a 3-year cycle
     - Reduce frequency of other oversight activities
     - Conduct joint inspections to train rather than oversight inspections
     - Ability to do self-evaluation and play a more participatory role
     - Provide flexibility in how a state applies its resources to allow the inclusion of state priorities, while still maintaining a balanced program
     - Get recognition or offset credit for alternative approaches
     - Coordinated approach to national initiatives (negotiate ability to lead cases)
     - EPA provides extra funds for state priorities
     - EPA promotes state as national expert in demonstrated areas
     - State participates in national policy or regulatory efforts
     - Increase availability of EPA specialized training opportunities
     - EPA provides public recognition of state program
   - What should be the results of underperforming?
4. What is the appropriate cycle for conducting SRF reviews for all states? Within that cycle, how much flexibility should regions have to handle well-performing states or programs differently?

Attachment 1
List of review components for Q4, Question 1

Pre-review activities
- RA contacts state commissioner to set tone and context for review.
- Region forms review team and prepares for review.
- Regional team has the expertise required to conduct the review.
- Prior to review, Region provides the state with an introductory letter explaining the review process, schedule, etc.

Offsite review activities
- Region and state identify any relevant reviews (within 2 years) to prevent duplication.
- Query OTIS for data metrics, perform preliminary data analysis and share with state.
- Allow states to provide additional data.
- Regional team assesses all relevant state documents (PPA/PPGs, etc.).
- Optional: negotiate additional program areas (i.e., assistance under Element 13).

Onsite review activities
- Region and state participate in entrance meeting.
- Region and state determine the files to be reviewed.
- Region and state participate in exit meeting. Region presents preliminary significant findings and discusses timing for draft and final report.

Draft report
- Region prepares and distributes draft report.
- State and OECA provide comments on draft report.
- Region and state work to reconcile differences in draft report.
- Region responds to OECA comments and makes changes to the report.

Final report and follow-up
- Final report includes comments from the state.
- Final report includes an executive summary.
- Final report includes sufficient detail on data and file metrics and other state data, and contains findings, conclusions, and recommendations.
- Final report is submitted to OECA and posted on Tracker.
- Region and state negotiate follow-up through PPA, grant agreements, etc.

BURDEN STATEMENT
	The OMB Control Number and expiration date must appear on the front
page of an OMB-approved form or survey, or on the first screen viewed
by the respondent for an on-line application.  The rest of the burden
statement must be included somewhere on the form, questionnaire or
other collection of information, or in the instructions for such
collection. 
OMB Control No.: 2020-0031
Approval expires: 11/30/08
	The public reporting and recordkeeping burden for this collection of
information is estimated to average 1.96 hours per response.  Send comments
on the Agency's need for this information, the accuracy of the
provided burden estimates, and any suggested methods for minimizing
respondent burden, including through the use of automated collection
techniques to the Director, Collection Strategies Division, U.S.
Environmental Protection Agency (2822T), 1200 Pennsylvania Ave., NW,
Washington, D.C. 20460.  Include the OMB control number in any
correspondence.  Do not send the completed survey to this address.