Attachment 1. ERS Crop Management Survey Part A

Crop Management With or Without Cover Crops

OMB: 0536-0080


SUPPORTING STATEMENT A


  1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


Interest and investment in cover crops as a conservation practice have increased dramatically in the last decade. Cover crops provide long-term soil health benefits, including increased water retention and reduced erosion, as well as the potential to increase greenhouse gas sequestration. The number of cover crop practices applied in the two primary Natural Resources Conservation Service (NRCS) conservation programs more than doubled from 2016 to 2020, from 54,000 practices to 121,000 practices1. From 2019 to 2023, cover cropping was the top conservation practice applied nationally, and much of this growth has been in Midwestern states. Using NRCS administrative data on contracts from the ProTracts database, we estimate that the Heartland region (an area encompassing Iowa, Illinois, Indiana, and parts of surrounding states) accounts for 32% of planned acres in cover crops in NRCS’ Environmental Quality Incentives Program (EQIP), and surrounding regions in the upper Midwest and Great Plains also had high planned acreages in cover crops.


In addition, the recent Partnership for Climate-Smart Commodities initiative invested $2.8 billion in projects to implement climate-smart practices in agricultural production, including incentives to expand long-term adoption of cover crops. Private carbon credit programs that support cover crops have also increased over this time period2. However, cover crops are still not commonly applied on agricultural land (approximately 5% of harvested cropland in 20173), and more research is needed to explore the motivations and incentives that farmers face regarding cover crop adoption.


The mission of the USDA Economic Research Service (ERS) is to conduct high-quality, objective economic research to inform and enhance public and private decision making. This request is part of a larger project supported by an ERS Strategic Priority Grant that aims to provide new information about farmers’ cover crop practices and willingness to participate in cover crop contracts such as those administered by the Natural Resources Conservation Service (NRCS) and other federal, state, and local agencies.


Given the recency of the increase in funding and the proliferation of new programs, little is known about the behavioral responses that farmers may have to changes in contracts, such as payment rates, contract length, and management requirements. Through a farmer survey, this information collection will provide foundational information in support of the ERS mission and Strategic Priorities to better understand what influences farmers to adopt cover cropping.

ERS’ authorizing statute is 7 USC 2204: General duties of Secretary; advisory functions; research and development. The Secretary of Agriculture is charged with procuring and preserving information concerning agriculture through the collection of statistics and other appropriate means within his power (7 USC 2204). In addition, the Under Secretary for Research, Education, and Economics is directed to “coordinate the research programs and activities of the Department,” which specifically includes research on “renewable energy, natural resources, and environment” as well as “agricultural economics and rural communities” (7 USC 6971). The USDA Economic Research Service is responsible for collecting data and conducting economic and social science research, including on natural resources (7 CFR 2.21(a)(8)(ii)(D)).



  2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


The purpose of this survey is to provide new and useful information about the characteristics of farms, farmers, and contracts that influence adoption and continued use of cover crops.


This project will survey farmers in Midwestern states who grow corn or soybeans. The overarching goal of the survey is to elicit information on current practices, attitudes, and willingness to adopt cover crops from farmers with varying levels of cover crop experience, in order to address questions related to drivers of adoption. The survey instrument is designed for farmers who grow corn or soybeans, and the states in our sample area represent the vast majority (86%) of corn and soy production in the United States4.


The survey has four specific research objectives:

  1. Elicit information about farmers’ cover crop attitudes and current or historical practices if they have cover crop experience

  2. Use established survey methods and econometric techniques to estimate preferences for cover crop contracts among corn-soy farmers in the Midwest and to control for geographic, operational, and farmer characteristics and attitudes that may influence preferences

  3. Compare preferences between three groups of farmers: general population of farmers, farmers with prior NRCS program experience without cover crops, and farmers with prior NRCS program experience with cover crops

  4. Leverage field-specific responses to cover crop adoption questions to estimate a supply curve of land for cover crops that controls for geographic, operational, and farmer characteristics


The survey will include questions about the farm operation, cover crop experience, experience with conservation programs and practices, and attitudes towards cover cropping that will be used as controls in models. Farm and field characteristics include physical attributes such as soil type and slope, and management practices such as crop rotation, tillage, and livestock, which are expected to influence cover crop adoption. Questions on cover crop experience include a variety of management practices such as cover crop species planted and termination methods. Many of the questions will focus on management in 2024 because the survey aims to understand current practices. The survey will also use a series of cover crop adoption questions to estimate the economic values of the features of cover crop contracts. These questions pose different cover crop contracts to the farmer that vary in their features (e.g., payment levels, contract length) according to an experimental design. Farmers will respond to these adoption questions with a single, specific field in mind. These data will be used to estimate models of cover crop contract adoption and the value of a change in contract features. Because responses are field-specific, the data will also support estimates of the supply of land for cover crops as a function of price and other contract features.
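To make the structure of these adoption questions concrete, the sketch below generates a simple full factorial design over a few contract attributes. The attribute names and levels shown (payment per acre, contract length, and a seeding requirement) are illustrative placeholders, not the levels used in the actual survey instrument, which follows its own experimental design.

```python
# Illustrative sketch of an experimental design for the adoption questions.
# Attribute names and levels are hypothetical placeholders, not the survey's actual design.
from itertools import product
import random

payments_per_acre = [10, 25, 40]      # hypothetical $/acre payment levels
contract_lengths = [1, 3, 5]          # hypothetical contract lengths in years
seeding_rules = ["single species", "multi-species mix"]  # hypothetical requirement

# Full factorial design: every combination of the attribute levels.
full_factorial = [
    {"payment": p, "length_yrs": yrs, "seeding": s}
    for p, yrs, s in product(payments_per_acre, contract_lengths, seeding_rules)
]

# Each respondent would see a small block of contracts drawn from the design
# and be asked whether they would enroll the specific field they have in mind.
random.seed(42)
respondent_block = random.sample(full_factorial, k=4)
for contract in respondent_block:
    print(contract)
```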


The data from this survey will be used by the joint research team at the USDA ERS and Michigan State University (MSU) to develop multiple research products that will be disseminated to internal and external stakeholders, academia, and the public. Results will include summary statistics and estimates of current cover crop practices and perceptions of cover crops among the survey population, and results from regression modeling of contract adoption as a function of price, features of contracts, and characteristics of the farm and farmer. Anticipated research products include academic journal articles, ERS reports, and other publications such as Amber Waves articles and Charts of Note. In addition, results will be shared with federal agencies and other stakeholders including but not limited to NRCS and the Farm Service Agency (FSA) through briefings and presentations. By disseminating results through multiple avenues, this information will be available for use in the design of future programs by both federal agencies and other decision-makers.



  3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.


The survey will be available online, hosted by Qualtrics, a FedRAMP certified and approved platform, as well as by paper. Recruitment will encourage respondents to respond to the survey online by providing an individualized web link. The online survey will reduce time spent on the survey using concise question formatting, piped text, and automatic skip patterns. Respondents will be sent a paper version of the survey only if they do not respond to the online invitation or cannot respond online.

  4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.



There are several surveys that collect information on cover crops; however, no existing information collection allows for the estimation of behavioral responses to changes in cover crop contracts across a large region of the United States. In the academic literature, several small-scale surveys have been conducted on cover crop use. These include a North Carolina survey on perceptions of cover crops5, a survey of New York dairy farmers on cover crop use and barriers6, a survey of organic vegetable farmers in Wisconsin7, and a recent survey comparing barriers to adoption in Maryland and Ohio8, among others9 10 11 12. These studies are limited in scope, and some are focused on specialty production systems. One larger, multi-state survey gathered data on adoption decisions and farmer attitudes and characteristics; the data were used to classify farmers into types or classes of adopters as a function of their experience and attitudes towards cover cropping13. A recent survey modeled cover crop adoption decisions using similar methods; however, its respondents were members of a state organization, and results are not generalizable to a larger population of row-crop farmers14.


The Sustainable Agriculture Research and Education (SARE) program, in conjunction with the Conservation Technology Information Center (CTIC) and the American Seed Trade Association (ASTA), conducts a periodic national survey of farmers on their experiences with cover crops. The SARE cover crop survey collects information on cover crop revenues and costs, as well as perceptions about cover crops. However, it uses a non-random sample, recruiting through producer association email lists and social media channels, likely resulting in selection bias toward producers who have tried or are interested in cover crops.


Finally, existing federal surveys that collect cover crop information are not targeted at existing or likely cover crop adopters but rather collect broader information about cropping practices and farm management, with limited space devoted to cover crop management questions. Since cover crop adoption remains relatively rare, the share of cover crop adopters in most agricultural surveys is small, which limits the ability to analyze drivers of cover crop adoption. More importantly, these surveys collect only observational data, restricting what can be learned to existing contracts, and variation in the drivers of cover crop adoption is not random. The Agricultural Resource Management Survey (ARMS) Phases II and III are annual surveys that collect details on field- and farm-level practices, respectively. However, space devoted to cover crop practices is limited, and only roughly 5-10 percent of total responses report using cover crops. The Conservation Effects Assessment Project (CEAP) Survey also collects field-level information on practices but is not conducted annually and likewise has a small percentage of respondents engaged in cover cropping. The Conservation Practice Adoption Motivations Survey (CPAMS) is a farm-level survey of a broader population that explores motivations for adoption and disadoption of various conservation practices. These information collections cannot be used to model farmer choices over contract features.


This information collection will provide new data not already available via existing studies or academic literature because it combines four key aspects not available in other data collections:

  1. The sample area covers twelve states in the Midwest and will be targeted to corn-soy farmers, therefore representing a large share of U.S. farmers and farmland.

    1. Corn and soybeans are the top two crops by planted area in the United States15, and the states in the sampling area account for the vast majority of corn and soybean production.

  2. Recruitment methods and materials will minimize selection bias to the greatest extent possible.

    1. The sampling frame will be developed using USDA administrative records on all farmers that interact with USDA programs. The vast majority of corn and soybean farmers, for whom the survey is designed, are represented, while small boutique and hobby farms that do not grow corn or soybeans may be under-represented.

    2. A split sample design will allow us to compare responses between three groups of farmers: The general population of farmers, conservation program participants who have not cover cropped, and conservation program participants who have cover cropped.

    3. Existing data collections typically focus either on cover crop adopters, or the general population, and not both.

  3. The survey includes behavioral questions that ask respondents to make a choice over cover crop contracts at a time when such incentive programs and contracts are a major national focus, with increases in funding from federal and private sources in recent years.

    1. Much of the existing literature has found that cost is an important barrier to cover crop adoption, yet no existing studies examine the impact of different payment levels or other features of contract design on adoption.



  5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.


For this information collection we will survey individuals who own and operate farms that qualify as small businesses.


The key method used to minimize burden for small businesses as well as any other respondents was thorough testing of the survey instrument using in-depth cognitive interviews in the collection authorized by OMB Control No. 0536-0073. Feedback from discussions with farmers about how they perceived questions, whether response options available were suitable for their unique circumstances, and whether text helped their understanding was used to ensure that question wording, response options, and design are appropriate for the targeted population.


Recruitment materials will encourage the use of the online survey instrument in Qualtrics, which will utilize streamlined options and question formatting to reduce time spent on the survey. The online survey will automate all skip patterns and include limited dropdown menu options and piped text that reduce burden compared to other survey modes.


The design of the sample will further minimize impacts to small entities. We will use a stratified sampling strategy, where large farms belong to strata with a higher probability of being sampled than small farms. Since small farms are likely to be part-time hobby farms unlikely to adopt cover crops, this strategy helps to ensure that our survey is targeted to the intended population of row-crop farms.
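A minimal sketch of this kind of stratified draw is shown below, assuming two illustrative size strata with different sampling rates; the acreage cutoff, sampling rates, and record fields are placeholders rather than the actual design, which is documented in Supporting Statement B.

```python
# Minimal sketch of stratified sampling with unequal selection probabilities.
# The acreage cutoff, sampling rates, and frame records are illustrative assumptions.
import random

random.seed(1)

# Hypothetical frame: each record is a farm with an acreage used to assign a stratum.
frame = [{"farm_id": i, "acres": random.randint(5, 2000)} for i in range(100_000)]

sampling_rates = {"large": 0.05, "small": 0.01}  # larger farms sampled at a higher rate

def stratum(farm):
    return "large" if farm["acres"] >= 500 else "small"

sample = []
for farm in frame:
    rate = sampling_rates[stratum(farm)]
    if random.random() < rate:
        # The design weight (inverse of the inclusion probability) lets estimates
        # from the sample represent the full frame, including small farms.
        sample.append({**farm, "weight": 1 / rate})

print(f"{len(sample):,} farms sampled from a frame of {len(frame):,}")
```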


Finally, this information collection will benefit small farm businesses by providing information to decision-makers about farmers’ preferences for features of cover crop contracts (such as contract length, requirements, and payment rates) as well as the characteristics associated with differences in those preferences (such as cropping systems, prior experience, and attitudes). This increased understanding may be used by program agencies and other providers of cover crop contracts to inform contract designs that better serve small businesses.




  6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


Cover cropping is the subject of increased interest in the conservation community and is currently the top conservation practice applied under two major NRCS programs, EQIP and the Conservation Stewardship Program (CSP). Increased cover crop adoption is also supported by major federal initiatives such as the Partnership for Climate-Smart Commodities. Different federal, state, regional, and private programs offer financial incentives for planting cover crops, with varying levels of payments and often different features of contract design. Funding for cover crops has increased in the last decade and will continue in the future, yet current adoption rates are still low, so there is significant potential for expansion if contracts are designed effectively.


The findings from this study will be used by the USDA to inform stakeholders and the public about drivers of cover crop adoption as well as how producers may respond to changes in programs that incentivize cover crops. This information collection will contribute new insights into the adoption decision by providing monetary estimates of the tradeoffs that producers face in terms of program flexibility, administration, and level of assistance for a sample of farmers that represents a large share of commodity crop production in the Nation. This data collection may be used to inform the design of cover crop programs, including federal programs, other governmental programs, and newer private programs such as carbon credits.


If this study is not conducted, the USDA as well as other stakeholders would lack information about responses to changes in incentives when considering changes to or expansions of current conservation programs, or the development of new programs. Future programs would not benefit from the information provided by this study, which would assist in identifying how features of contracts affect decisions and the interactions with farm operation and farmer characteristics.



  7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

requiring respondents to report information to the agency more often than quarterly;

requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

requiring respondents to submit more than an original and two copies of any document;

requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


This information collection seeks an exemption from the U.S. Office of Management and Budget's Statistical Policy Directive No. 15: Standards for Maintaining, Collecting, and Presenting Federal Data on Race and Ethnicity, which requires the collection of detailed race and ethnicity subcategory data. ERS seeks to limit the race and ethnicity question to the reporting categories shown below:


What is your race and/or ethnicity? (Select all that apply)

  • American Indian or Alaska Native

  • Asian

  • Black or African American

  • Hispanic or Latino

  • Middle Eastern or North African

  • Native Hawaiian or Pacific Islander

  • White


The 2022 U.S. Census of Agriculture Race, Ethnicity, and Gender Profiles16 show that 96.7% of all United States farms have White producers, and this figure is similar in our sample states. Furthermore, we will draw three separate samples of size Group 1=10,000, Group 2=2,500, and Group 3=2,500. Given the sample sizes and expected response rates for this survey, collecting detailed race and ethnicity subcategories would not provide sufficient statistical power for use in estimation and may pose a disclosure risk. Therefore, we believe that limiting the response categories for the race and ethnicity question will reduce respondent burden and minimize potential disclosure risk, while also providing sufficient information for planned data analysis. The focus of this data collection is to understand the availability of land for cover crops, and the race and ethnicity categories above provide sufficient detail for use as control variables in estimating models.
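As a rough illustration of the statistical power concern, the back-of-the-envelope calculation below combines the group sample sizes above with the overall 17% expected response rate (assumed here to apply to each group) and the Census share of farms with White producers; it is not a formal power analysis.

```python
# Back-of-the-envelope expected counts, using figures cited in this section.
# Applying the overall 17% response rate to each group is a simplifying assumption.
group_sample_sizes = {"Group 1": 10_000, "Group 2": 2_500, "Group 3": 2_500}
expected_response_rate = 0.17            # overall estimate for this collection
share_non_white_producers = 1 - 0.967    # from the 2022 Census of Agriculture profiles

for group, n in group_sample_sizes.items():
    respondents = n * expected_response_rate
    non_white = respondents * share_non_white_producers
    print(f"{group}: ~{respondents:.0f} respondents, ~{non_white:.0f} from non-White producers")
```

Under these assumptions, each of the two smaller groups would be expected to yield only about 14 respondents from non-White producers in total, so detailed subcategories would fall below the minimum cell sizes used for disclosure avoidance.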


This information collection does not involve any other special circumstances. All responses will be one-time responses.



  8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.

    Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.

    Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years - even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


ERS published a notice in the Federal Register on October 27, 2023 (88 FR 73825, Document Number 2023-23755; Attachment 11). The 60-day period for public comments ended December 26, 2023.


ERS received 7 comments on the October 27 Federal Register Notice. Public comments and the ERS responses are summarized below; the full text of each comment is provided in Attachment 12.


The public comments received in response to the 60-day FRN demonstrated a strong interest in and support of USDA efforts to provide new information about cover crop adoption. Many comments emphasized the need for fundamental information about cover crop use: land ownership, crop rotations, use of seed mixes, and participation in current programs. Questions addressing these issues are incorporated into the survey instrument.


Additional comments spoke to an interest in understanding adoption motivations, attitudes, and the role of incentives in the adoption decision. The survey will investigate the role of incentives and motivations in adoption by using a series of enrollment questions in which respondents select to enroll or not enroll a field in a contract. Contract choices will be statistically modeled as a function of the features of the contract, with responses to questions about motivations and attitudes used as controls. In this way, the framework will capture structural and administrative incentives and barriers to cover cropping, as well as farmer preferences and attitudes.


The non-governmental organizations that submitted comments expressed an interest in the role of crop insurance programs such as the Pandemic Cover Crop Program (PCCP), which provided a $5 per acre discount on federal crop insurance for cover cropping. In response, the survey team included additional questions that aim to better understand participation in the PCCP and other similar programs, as well as the influence that crop insurance has on cover cropping decisions.


Finally, the comments stressed the need for including non-program participants in the sample. The sample will be developed from FSA administrative data which includes all farmers engaged with the USDA through any program, including crop insurance. For our target sample of commodity corn and soy growers, the sampling frame will include a large majority of farmers.


This project has consulted with USDA stakeholders as well as experts in academia on the content and design of the survey. USDA stakeholders include the Natural Resources Conservation Service (NRCS) Programs Deputy Area, NRCS Science and Technology Deputy Area, USDA Farm Production and Conservation Business Center (FPAC-BC), FSA Mailings Team, Risk Management Agency (RMA), and the Office of the Chief Economist (OCE). ERS and NRCS signed an interagency agreement that outlines cooperation and input on the design of the survey instrument. NRCS officials consulted include but are not limited to:


  • Amanda Branham, Director, Soil Health Division

  • Betsy Dierberger, USDA NRCS National Agronomist

  • Noller Herbert, Deputy Chief of Science and Technology

  • Julie Suhr-Pierce, National Economist

  • Mark Xu, Director (former), Resource Inventory and Assessment Division


In addition, the research team received input from Robert Myers (Sustainable Agriculture Research and Education; SARE) and responded to public comments from Rural Investment to Protect the Environment, the National Corn Growers Association, Illinois Corn Growers Association, Midwest Cover Crops Council, the AGree Coalition, the Center for Rural Affairs, and combined comments from American Farmland Trust, National Wildlife Federation, and the Natural Resources Defense Council. This study is conducted under a cooperative agreement with researchers at Michigan State University (MSU). Members of the study team at MSU are experts on choice survey design, cognitive interview techniques, and development and implementation of farmer surveys. Members of the team at MSU include:


  • Dr. Frank Lupi, Professor, Department of Agricultural, Food, and Resource Economics and Fisheries and Wildlife Department, Michigan State University

  • Dr. Scott Swinton, University Distinguished Professor, Department of Agricultural, Food, and Resource Economics, Michigan State University

  • Dr. Ying Wang, Postdoctoral Researcher, Department of Agricultural, Food, and Resource Economics, Michigan State University


On November 1, 2024, the NASS Methodology Division provided a review of the ICR package. The review comments and ERS responses are provided below:


This survey by USDA-ERS and Michigan State University is meant to clarify the reasons why farmers use cover crops, and what contracts/incentive structures could lead to further adoption of cover crops on farms that produce corn and/or soy in the Midwest. It is a well-structured survey with no obvious flaws. I was particularly impressed with their disclosure avoidance plans to keep the results (including summary statistics) confidential. I was also impressed with their use of a probe sample to help focus the questions and make sure all questions are properly understood. My comments below are all meant to either alert ERS of a potential – although by no means definitive – mistake in the survey or to suggest an alternative way of asking a question.


Comments:

Letter 2: I would recommend taking out “you are a part of a small number of people”. Your sample size was scientifically determined and I am worried if you mention that it’s a small sample it may make the respondents concerned about data confidentiality. There is no need to tell them it’s a small sample. The small sample is mentioned in the first letter but it is not accentuated as much.


ERS Response: Thank you for this comment. The language in the letter is intended to communicate that each individual’s response matters and has been recommended in the survey literature17. While we understand the comment, this language has also worked well for the research team in past farmer surveys.


Survey Questionnaire B3: Need a place for the respondent to describe “most of the field” if the user chooses both options.


ERS Response: We have clarified the question so that respondents know to select one option that represents the majority of the field.


Survey Questionnaire B6: I recommend putting both of the “variable grade” options at the bottom rather than level, even, variable, even, variable. I think it would be easier to process for the user.


ERS Response: This question was tested during the 16 cognitive interviews conducted under OMB Control No. 0536-0073. All respondents answered this question quickly and easily.


Survey Questionnaire D1 D2: Given that the missing element is financial cost of nutrient losses, which likely applies to the specific farmland in question, I imagine it was intentional to drop only that element between D.1 and D.2, but I am flagging it for you just in case.


ERS Response: The financial cost of nutrient losses would only impact the individual farmer so we removed that element from D.2.


Supporting Statement: Part 8 suggests a sampling frame of at most 1.27M corn-soy Midwest farms. I understand that it says the words "at most", but this estimate almost doubles the TOTAL number of Midwest farms, regardless of which crop or livestock is raised. I bring this up because I am concerned about the source of this high number and whether the sampling frame is reliable.

Based on Table 1 in the Supporting Statement part B, most states have roughly the same farm inflation number of about 2x, but Illinois stands out as a particular offender at a 2.6x inflation rate. Michigan has the least inflation at 1.36x. I see that you expect there to be some duplication, and so when you aggregate by principal operator your farm counts will fall. I recommend comparing your ultimate numbers to Table 1 of the Census by state https://www.nass.usda.gov/Publications/AgCensus/2022/Full_Report/Volume_1,_Chapter_2_US_State_Level/st99_2_001_001.pdf to ensure your farm count estimate is reasonable and not unreliably inflated, particularly by state.


ERS Response: We agree that the counts of administrative Farm IDs are greater than the number of farm operations in the intended sample area. Based on past experience from ERS research teams drawing samples from the same data sources, we expect that aggregating USDA Farm IDs will result in a count of farm operations roughly half as large. We intend to further de-duplicate by listed mailing address and will compare final numbers with Census of Agriculture data.
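A minimal pandas sketch of this de-duplication step is shown below; the column names and example records are hypothetical stand-ins for the administrative fields, and the actual frame construction follows the procedures described in Supporting Statement B.

```python
# Sketch of collapsing administrative Farm IDs to farm operations.
# Column names and example records are hypothetical placeholders.
import pandas as pd

farm_ids = pd.DataFrame(
    {
        "farm_id": [101, 102, 103, 104],
        "principal_operator": ["A. Grower", "A. Grower", "B. Planter", "C. Tiller"],
        "mailing_address": [
            "1 Rural Rd, Ames IA",
            "1 Rural Rd, Ames IA",
            "2 County Hwy, Peoria IL",
            "3 Section Ln, Fargo ND",
        ],
    }
)

# Step 1: aggregate Farm IDs that share a principal operator.
by_operator = farm_ids.drop_duplicates(subset=["principal_operator"])

# Step 2: further de-duplicate by listed mailing address.
frame = by_operator.drop_duplicates(subset=["mailing_address"])

print(f"{len(farm_ids)} Farm IDs -> {len(frame)} sampling-frame records")
```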



  9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


Following standard practices in the survey literature, participants will be provided with an incentive payment of $5 in the invitation to the survey. Respondents will also be offered a $30 incentive upon completion of the survey (close to the hourly wage rate for farmers and ranchers18; we believe this amount is appropriate because we will oversample large farms). Incentive payments are recommended by Dillman et al. (2014)19 to increase response rates and are widely used by survey practitioners, particularly when surveying small businesses, including farmers. A meta-analysis on survey nonresponse suggests that incentives consistently result in higher response rates20 21. In addition, research suggests that incentives either reduce nonresponse bias by reducing variation in the propensity to respond22, or have no effect on nonresponse bias23 24 25. The research team at Michigan State University has had success using completion incentives to increase response rates in past farmer surveys in Midwestern states.



  10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy. If the collection requires a systems of records notice (SORN) or privacy impact assessment (PIA), those should be cited and described here.


Respondent data will be protected by the Confidential Information Protection and Statistical Efficiency Act of 2018 (CIPSEA). Participants will consent to the study at the start of the survey (Attachment 3). They will be allowed to skip questions or terminate the survey at any time.


Respondents will be assigned a unique identifier used to track responses for the purposes of sending follow-up notifications. Personal information that could be used to identify individuals such as names, addresses, and emails will only be used to recruit respondents, and never associated with survey responses. Names, addresses, and email addresses will be destroyed at the end of the study. Contact information for respondents will be treated as PII and email correspondence with potential respondents will be conducted from a federal email address.


Respondents will use the FedRAMP-certified Qualtrics environment configured for the USDA to respond to the survey, and all analyses will take place in limited-access folders on CIPSEA-approved USDA servers. Any paper responses will be entered into Qualtrics manually using double key entry by CIPSEA-pledged enumerators, and paper surveys will be kept in a secure location according to ERS data storage and retention guidelines. Analyses will use anonymized responses, and results will never be reported at a geographic level lower than the county. The following disclosure avoidance tools will be used:


  1. No individual responses will be reported

  2. No minimum values will be reported (except 0) and no maximum values will be reported

  3. A minimum of 5 observations will be required in any cell in a frequency table

  4. A minimum of 20 weighted responses will be required in any cell in a weighted frequency table

  5. A (1,k) rule will be applied to tables of magnitude


These disclosure rules are in line with rules applied to other ERS data collections such as the Agricultural Resource Management Survey (ARMS). Anonymized responses will be removed from the Qualtrics environment at the end of the project and kept according to ERS data storage and retention guidelines.
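As an illustration of how the unweighted and weighted cell-size rules (items 3 and 4 above) could be applied to a frequency table before release, a short sketch follows. The thresholds come from the list above, while the example table and the suppression marker are assumptions; the (1,k) dominance rule for magnitude tables is not shown.

```python
# Illustrative application of the minimum cell-size rules to a frequency table.
# Thresholds (5 unweighted, 20 weighted) come from the disclosure rules above;
# the example categories, counts, and suppression marker are assumptions.

MIN_UNWEIGHTED = 5
MIN_WEIGHTED = 20

# Hypothetical cells: (category, unweighted count, weighted count)
cells = [
    ("Adopted cover crops", 140, 3200.5),
    ("Tried and stopped", 4, 95.0),     # fails the unweighted rule
    ("Never adopted", 300, 6750.0),
    ("Plans to adopt", 6, 12.5),        # fails the weighted rule
]

def publishable(unweighted, weighted):
    return unweighted >= MIN_UNWEIGHTED and weighted >= MIN_WEIGHTED

for category, n, w in cells:
    value = f"{n} ({w:,.1f} weighted)" if publishable(n, w) else "suppressed"
    print(f"{category}: {value}")
```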


The following pledge will be placed on all instruments:


The information you provide will be used for statistical purposes only. Your response will be kept confidential and any person who willfully discloses ANY identifiable information about you or your operation is subject to a jail term, a fine, or both. This survey is conducted in accordance with the Confidential Information Protection and Statistical Efficiency Act of 2018, Title III of Pub. L. No. 115-435, codified in 44 U.S.C. Ch. 35 and other applicable Federal laws. 





  11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


There will be no questions of a sensitive nature included in this information collection.



  12. Provide estimates of the hour burden of the collection of information. The statement should:

Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.

Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included under ‘Annual Cost to Federal Government’.


Burden estimates for survey respondents are drawn from the cognitive interviews conducted under OMB Control No. 0536-0073. Interview respondents completed the full survey, with multiple breaks for discussion, in 45-60 minutes; time stamps indicate that the time spent answering the questions themselves ranged from 25 to 30 minutes, with an average of around 28 minutes.


Past surveys of farmers on conservation practice adoption conducted between 2013 and 2019 have achieved response rates of 25-30%26 27 28 29 30. The overall estimated response rate of 17% for this collection reflects two differences from published studies and was developed through consultation with project leads at Michigan State University, Frank Lupi and Scott Swinton, who have a combined 50 years of experience conducting surveys of farmers and other populations in Michigan and surrounding states. First, recruitment will take place across a broad (12-state) region of the Midwest, and respondents may not have any affiliation with either Michigan State University or the USDA ERS. Second, there is a general declining trend in response rates for all surveys, and anecdotal evidence from survey experts suggests that response rates have declined further in the post-pandemic years.


Estimates of the pattern of response rates are based on past surveys by Frank Lupi and Scott Swinton, which suggest that approximately half of all survey responses come from the initial mailing, with declining response rates for each subsequent mailing.


We use an initial screening question:

  • Are you the main decision maker for annual crop management on your farm?


If the respondent is not the main decision maker, we request that the survey be given to the main decision maker for annual crop management.


The sampling frame will be developed to minimize attrition to the screening question. It will be based on known growers of corn or soybeans using USDA administrative records, and we will use the primary operator listed on farm records to identify the likely decision-maker on each farm.


Estimated response rates and associated burden to each stage of recruitment are shown in Table 1. Response rates follow the pattern of responses for past push to mail surveys of farmers and other populations by the principal investigators at Michigan State University.


Table 1 Estimated Burden Hours for Respondents and Non-Respondents and Cost

| Stage | Sample Size | Respondents: Count | Respondents: Min./Response | Respondents: Burden Hrs. | Non-Respondents: Count | Non-Respondents: Min./Response | Non-Respondents: Burden Hrs. | Total Burden Hrs. | Estimated Cost ($/Hr.) | Total Burden Cost ($) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Mailing #1 | 15,000 | 1,500 | 2 | 50 | 13,500 | 1 | 225 | 275 | 37.18 | 10,224.50 |
| Mailing #2 | 13,500 | 450 | 2 | 15 | 13,050 | 1 | 217.5 | 232.5 | 37.18 | 8,644.35 |
| Mailing #3 | 13,050 | 300 | 2 | 10 | 10,050 | 1 | 167.5 | 177.5 | 37.18 | 6,599.45 |
| Mailing #4 | 10,050 | 150 | 2 | 5 | 9,000 | 1 | 150 | 155 | 37.18 | 5,762.90 |
| Mailing #5 | 9,000 | 150 | 2 | 5 | 8,850 | 1 | 147.5 | 152.5 | 37.18 | 5,669.95 |
| Informed consent | 2,550 | 2,520 | 2 | 84.0 | 30 | 1 | 0.5 | 84.5 | 37.18 | 3,141.71 |
| Screening question | 2,520 | 2,500 | 2 | 83.3 | 20 | 1 | 0.3 | 83.7 | 37.18 | 3,110.73 |
| Survey | 2,500 | 2,500 | 30 | 1,250 | 0 | 0 | 0 | 1,250 | 37.18 | 46,475.00 |
| Total | | | | 1,502.3 | | | 908.3 | 2,410.7 | | 89,628.59 |


Note: Cost per hour for the sample was derived by using U.S. Bureau of Labor Statistics Current Population Survey, 2023, Farmers, Ranchers, and Other Agricultural Managers.

Three samples totaling 15,000 potential respondents will be drawn from a sampling frame consisting of at most 1,269,977 corn-soy farms in Midwestern states. This estimate includes some duplicates because the data record farms as an administrative designation, and multiple farms may share a principal operator. We will aggregate farms by shared principal operator prior to drawing the samples. Principal operators of farms will be sent a series of 5 mailings. Estimated burden is 2 minutes per respondent to each mailing, and 1 minute per non-respondent to each mailing. If a farmer responds to the survey, their name will be removed from the mailing list, and they will not receive subsequent mailings.


We estimate a response rate of 10% to Mailing #1 (1,500 respondents and 13,500 non-respondents). The total estimated burden hours for respondents to Mailing #1 are 50 hours, and 225 hours for non-respondents.


Only the 13,500 non-respondents to Mailing #1 will receive Mailing #2. We estimate an additional 3% of the original 15,000 sample will respond to Mailing #2 (450 respondents and 13,050 non-respondents). The total estimated burden hours for respondents to Mailing #2 are 15 hours, and 217.5 hours for non-respondents.


Non-respondents to Mailing #2 (13,050) will receive Mailing #3. We estimate an additional 2% of the original 15,000 sample will respond to Mailing #3 (300 respondents and 10,050 non-respondents). The total estimated burden hours for respondents to Mailing #3 are 10 hours, and 167.5 for non-respondents.


Non-respondents to Mailing #3 (10,050) will receive Mailing # 4. We estimate an additional 1% of the 15,000 sample will respond to Mailing #4 (150 respondents and 9,000 non-respondents). The total estimated burden hours for respondents to Mailing #4 are 5 hours, and 150 for non-respondents.


Non-respondents to Mailing #4 will receive Mailing #5. We estimate an additional 1% of the 15,000 sample will respond to Mailing #5 (150 respondents and 8,850 non-respondents). The total estimated burden hours for respondents to Mailing #5 are 5 hours, and 147.5 for non-respondents.


Of the estimated 2,550 respondents to the mailings, we estimate that 30 (roughly 1% of respondents) will not proceed past the informed consent page. Estimated burden hours for respondents to informed consent are 84 hours, and 0.5 hours for non-respondents.


An additional estimated 20 (roughly 1% of the 2,550 respondents to the mailings) will be screened out by the screening question. Estimated burden hours for respondents to the screening question are 83.3 hours, and 0.3 hours for non-respondents.


We estimate that 2,500 respondents who proceed past informed consent will complete the survey, resulting in a total estimated response rate of approximately 17% to the survey. We estimate that surveys will take 30 minutes to complete, and estimated burden hours for survey respondents are 1,250.


We do not expect to contact anyone who is not eligible, given the nature of the sampling frame, therefore all burden estimates are either for those who are (a) eligible and completed the survey or (b) eligible and did not complete the survey.
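As an arithmetic check, the short script below reproduces the burden-hour and cost totals in Table 1 from the stage-by-stage counts and per-response minutes described above; all figures are taken from this section.

```python
# Arithmetic check of the Table 1 burden estimates. Counts, minutes per response,
# and the $37.18 hourly rate are taken directly from this section.
stages = [
    # (stage, respondents, resp. minutes, non-respondents, non-resp. minutes)
    ("Mailing #1", 1_500, 2, 13_500, 1),
    ("Mailing #2", 450, 2, 13_050, 1),
    ("Mailing #3", 300, 2, 10_050, 1),
    ("Mailing #4", 150, 2, 9_000, 1),
    ("Mailing #5", 150, 2, 8_850, 1),
    ("Informed consent", 2_520, 2, 30, 1),
    ("Screening question", 2_500, 2, 20, 1),
    ("Survey", 2_500, 30, 0, 0),
]

HOURLY_COST = 37.18

respondent_hours = sum(r * m / 60 for _, r, m, _, _ in stages)
nonrespondent_hours = sum(n * m / 60 for _, _, _, n, m in stages)
total_hours = respondent_hours + nonrespondent_hours

print(f"Respondent burden hours:     {respondent_hours:,.1f}")            # ~1,502.3
print(f"Non-respondent burden hours: {nonrespondent_hours:,.1f}")         # ~908.3
print(f"Total burden hours:          {total_hours:,.1f}")                 # ~2,410.7
print(f"Total burden cost:           ${total_hours * HOURLY_COST:,.2f}")  # ~$89,628.59
```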

  13. Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden already reflected on the burden worksheet).

The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collections services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.


Respondent cost per hour for the farmer population was derived by using U.S. Bureau of Labor Statistics Current Population Survey, 2023, Farmers, Ranchers, and Other Agricultural Managers31.


The median weekly earnings for Farmers, Ranchers, and Other Agricultural Managers, as measured by the Bureau of Labor Statistics Current Population Survey, are $1,171, which is approximately $29.28 per hour. Fringe benefits for all private industry workers are an additional 29.7 percent32, or $7.90, resulting in a total of $37.18 per hour. The estimated respondent cost is $89,628.59 for both responses and non-responses.


See cost estimates in Table 1, Question 12 above.

There are no capital/start-up or ongoing operation/maintenance costs associated with this information collection.



  14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies may also aggregate cost estimates from Items 12, 13, and 14 in a single table.


The total cost to the Federal Government for this study is approximately $699,547. This includes $687,675 of cooperative agreement costs with Michigan State University. The cooperative agreement costs include $150,000 for participant payments (15,000 x $5 + 2,500 x $30), an estimated $427,675 in printing and mailing, and the remaining $110,000 for personnel who are designing and implementing the study, including the support of a postdoctoral researcher. In addition, there is a cost to the USDA of 0.3 FTE for the length of the project.



  15. Explain the reasons for any program changes or adjustments reported on the burden worksheet.


This is a one-time information collection that does not include any program changes or adjustments.



  16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


Survey data collection will begin in January 2025 with an initial mailing. Mailings will continue through March 2025 and the survey will be closed at the end of May 2025. Journal articles and reports will be developed 2025-2026 with anticipated publication dates in 2026 and 2027. An estimated time schedule for the entire project is shown in Table 2.


Table 2 Estimated Time Schedule for the Entire Project

| Activity | 2025 (months 1-12) | 2026 | 2027 |
| --- | --- | --- | --- |
| Data collection | Months 1-5 (January-May) | | |
| Tabulation | Months 4-12 (April-December) | | |
| Develop reports and journal articles for publication | Months 7-12 (July-December) | Q1-Q4 | Q1-Q2 |
| Estimated publication dates | | Q1, Q4 | Q2 |


The survey was designed for the use of specific analytical techniques that will allow us to address Objectives 2-4 of the project: Estimating preferences for contracts; comparing preferences between groups of farmers; and estimating a supply curve of land for cover crops.


Specifically, the cover crop adoption questions vary the features of cover crop contracts according to an experimental design. This variation allows us to estimate logistic regression models where cover crop contract adoption is the dependent variable and is a function of contract features, farm/farmer characteristics, and other control variables. Results of the models will also be used to develop estimates of the supply of land for cover crops.
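As a sketch of this modeling framework, with notation chosen here for exposition rather than taken from the survey documentation, the adoption model and the quantities derived from it can be written as:

```latex
% Illustrative notation: i indexes farmers, f fields, p_{if} is the offered payment,
% x_{if} collects other contract and field attributes, z_i farm/farmer controls,
% a_{if} field acres, and w_i survey weights.
\[
\Pr(\text{adopt}_{if} = 1)
  = \Lambda\!\left(\beta_p\, p_{if} + \mathbf{x}_{if}'\boldsymbol{\beta} + \mathbf{z}_i'\boldsymbol{\gamma}\right),
\qquad
\Lambda(u) = \frac{e^{u}}{1 + e^{u}},
\]
\[
\text{value of contract feature } k \;=\; \frac{\beta_k}{\beta_p},
\qquad
\widehat{S}(p) \;=\; \sum_{i,f} w_i\, a_{if}\, \Pr\!\left(\text{adopt}_{if} = 1 \mid p,\ \mathbf{x}_{if},\ \mathbf{z}_i\right).
\]
```

Under this illustrative specification, the ratio of a feature coefficient to the payment coefficient expresses that feature's value in dollars per acre, and summing weighted field acreages multiplied by predicted adoption probabilities at a given payment level traces out the supply curve referenced in Objective 4.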


In addition, the information collection will support summary statistics on current cover crop practices at the state and regional levels.


Estimates as a result of this information collection will be shared with stakeholders and the public through USDA publications and academic journal articles. The research team will develop 1-2 USDA publications that report descriptive statistics related to cover crop experience and current cover crop practices. In addition, we will estimate preference parameters for features of cover crop contracts using random utility regression models and compare preferences for farmers with no cover crop experience, those with program experience, and those with cover crop experience for publication in academic journals.


  17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


The agency plans to display the expiration date for OMB approval of the information collection on all instruments.



  18. Explain each exception to the topics of the certification statement identified in “Certification for Paperwork Reduction Act Submissions.”


The agency is able to certify compliance with all provisions under Item 19 of OMB Form 83-I.

1 https://publicdashboards.dl.usda.gov/t/FPAC_PUB/views/RCATopPracticesbyLandUseandState/TopPracticesDashboard

2 https://www.reuters.com/markets/commodities/farming-climate-off-season-cover-crops-expand-us-growers-eye-low-carbon-future-2022-01-04/

3 Wallander, S., Smith, D., Bowman, M., & Claassen, R. (2021). Cover crop trends, programs, and practices in the United States. U.S. Department of Agriculture, Economic Research Service.

4 USDA National Agricultural Statistics Service (NASS), 2023 Census of Agriculture. Retrieved from https://quickstats.nass.usda.gov/

5 O’Connell, S., Grossman, J. M., Hoyt, G. D., Shi, W., Bowen, S., Marticorena, D. C., … Creamer, N. G. (2015). A survey of cover crop practices and perceptions of sustainable farmers in North Carolina and the surrounding region. Renewable Agriculture and Food Systems, 30(6), 550–562. doi:10.1017/S1742170514000398

6 Long, E., Ketterings, Q., & Czymmek, K. (2013). Survey of cover crop use on New York dairy farms. Crop Management, 12(1), 1-5.

7 Moore, V. M., Mitchell, P. D., Silva, E. M., & Barham, B. L. (2016). Cover crop adoption and intensity on Wisconsin’s organic vegetable farms. Agroecology and Sustainable Food Systems, 40(7), 693-713.

8 Duke, J. M., Johnston, R. J., Shober, A. L., & Liu, Z. (2022). Barriers to cover crop adoption: Evidence from parallel surveys in Maryland and Ohio. Journal of Soil and Water Conservation, 77(2), 198-211.

9 Arbuckle, J. G., & Roesch-McNally, G. (2015). Cover crop adoption in Iowa: The role of perceived practice characteristics. Journal of Soil and Water Conservation, 70(6), 418-429.

10 Chami, B., Niles, M. T., Parry, S., Mirsky, S. B., Ackroyd, V. J., & Ryan, M. R. (2023). Incentive programs promote cover crop adoption in the northeastern United States. Agricultural & Environmental Letters, 8(2), e20114.

11 Das, S., Berns, K., McDonald, M., Ghimire, D., & Maharjan, B. (2022). Soil health, cover crop, and fertility management: Nebraska producers’ perspectives on challenges and adoption. Journal of Soil and Water Conservation, 77(2), 126-134.

12 Campbell, K. M., Boyer, C. N., Lambert, D. M., Clark, C. D., & Smith, S. A. (2021). Risk, cost-share payments, and adoption of cover crops and no-till. Journal of Soil and Water Conservation, 76(2), 166-174.

13 Han, G., & Niles, M. T. (2023). An adoption spectrum for sustainable agriculture practices: A new framework applied to cover crop adoption. Agricultural Systems, 212, 103771.

14 Canales, E., Bergtold, J. S., & Williams, J. R. (2024). Conservation intensification under risk: An assessment of adoption, additionality, and farmer preferences. American Journal of Agricultural Economics, 106(1), 45-75.

15 https://www.ers.usda.gov/data-products/chart-gallery/gallery/chart-detail/?chartId=109620

16 https://www.nass.usda.gov/Publications/AgCensus/2022/Online_Resources/Race,_Ethnicity_and_Gender_Profiles/cpd99000.pdf

17 Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, phone, mail, and mixed-mode surveys: The tailored design method. John Wiley & Sons.

18 https://www.bls.gov/cps/cpsaat39.htm

19 Dillman, Don A., Jolene D. Smyth, and Leah Melani Christian. Internet, phone, mail, and mixed-mode surveys: the tailored design method. John Wiley & Sons, 2014.

20 Singer, E. (2002). The use of incentives to reduce nonresponse in household surveys. Survey Nonresponse, 51(1), 163-177.

21 Mercer, A., Caporaso, A., Cantor, D., & Townsend, R. (2015). How much gets you how much? Monetary incentives and response rates in household surveys. Public Opinion Quarterly, 79(1), 105-129.

22 Groves, R. M., Singer, E., & Corning, A. (2000). Leverage-saliency theory of survey participation: Description and an illustration. The Public Opinion Quarterly, 64(3), 299-308.

23 Brick, J. M., & Tourangeau, R. (2017). Responsive survey designs for reducing nonresponse bias. Journal of Official Statistics, 33(3), 735-752.

24 Groves, R. M., & Heeringa, S. G. (2006). Responsive design for household surveys: Tools for actively controlling survey errors and costs. Journal of the Royal Statistical Society Series A: Statistics in Society, 169(3), 439-457.

25 Stanley, M., Roycroft, J., Amaya, A., Dever, J. A., & Srivastav, A. (2020). The effectiveness of incentives on completion rates, data quality, and nonresponse bias in a probability-based internet panel survey. Field Methods, 32(2), 159-179.

26 Beetstra, M. A., Wilson, R. S., & Doidge, M. (2022). Conservation behavior over time: Examining a Midwestern farmer sample. Land Use Policy, 115, 106002.

27 Wilson, R. S., Schlea, D. A., Boles, C. M., & Redder, T. M. (2018). Using models of farmer behavior to inform eutrophication policy in the Great Lakes. Water Research, 139, 38-46.

28 Guo, T., Marquart-Pyatt, S. T., & Robertson, G. P. (2023). Using three consecutive years of farmer survey data to identify prevailing conservation practices in four Midwestern US states. Renewable Agriculture and Food Systems, 38, e44.

29 Lang, Z., & Rabotyagov, S. (2022). Socio-psychological factors influencing intent to adopt conservation practices in the Minnesota River Basin. Journal of Environmental Management, 307, 114466.

30 Wang, T., Jin, H., Sieverding, H., Kumar, S., Miao, Y., Rao, X., ... & Cheye, S. (2023). Understanding farmer views of precision agriculture profitability in the US Midwest. Ecological Economics, 213, 107950.

31 https://www.bls.gov/cps/cpsaat39.htm

32 https://www.bls.gov/news.release/pdf/ecec.pdf
