
National Center for Education Statistics

National Assessment of Educational Progress



National Assessment of Educational Progress (NAEP) 2026



Part A

Supporting Statement




OMB# 1850-0928 v.36










May 2025


Important changes in how we collect demographic information will be reflected in the 2026 instruments when they are ready for publication and review. In March 2024, the Office of Management and Budget (OMB) announced revisions to Statistical Policy Directive No. 15: Standards for Maintaining, Collecting, and Presenting Federal Data on Race and Ethnicity (SPD 15) and published the revised SPD 15 standard in the Federal Register (89 FR 22182). See Part A.7 of this package for how NCES plans to incorporate these revisions into NAEP 2026. Further, as we were preparing this package for 30D publication, the White House issued the Executive Orders “Defending Women from Gender Ideology Extremism and Restoring Biological Truth to the Federal Government” (January 20, 2025) and “Ending Radical and Wasteful Government DEI Programs and Preferencing” (January 20, 2025). In compliance with these Executive Orders, the materials in this package have been modified. In addition, the text of NAEP law (20 USC 9622), paragraph (5) Requirement under the section Purpose; State Assessments, was added; see page 5.

Table of Contents

A.1. Circumstances Making the Collection of Information Necessary 3

A.1.a. Purpose of Submission 3

A.1.b. Legislative Authorization 5

A.1.c. Overview of NAEP Assessments 6

A.1.c.1. NAEP Frameworks 6

A.1.c.2. Cognitive Item Development 6

A.1.c.3. Survey Items 7

A.1.c.4. Participation in NAEP 9

A.1.c.5. Digitally Based Assessments (DBAs) 9

A.1.c.6. Assessment Types 12

A.1.d. Overview of 2026 NAEP Assessments 13

A.2. How, by Whom, and for What Purpose the Data Will Be Used 13

A.3. Improved Use of Technology 14

A.4. Efforts to Identify Duplication 15

A.5. Burden on Small Businesses or Other Small Entities 16

A.6. Consequences of Collecting Information Less Frequently 16

A.7. Consistency with 5 CFR 1320.5 16

A.8. Consultations Outside the Agency 17

A.9. Payments or Gifts to Respondents 19

A.10. Assurance of Confidentiality 19

A.11. Sensitive Questions 22

A.12. Estimation of Respondent Reporting Burden (2026) 23

A.13. Cost to Respondents 30

A.14. Estimates of Cost to the Federal Government 30

A.15. Time Schedule for Data Collection and Publications 31

A.16. Approval for Not Displaying OMB Approval Expiration Date 31

A.17. Exceptions to Certification Statement 31





A.1. Circumstances Making the Collection of Information Necessary

A.1.a. Purpose of Submission

The National Assessment of Educational Progress (NAEP) is a federally authorized survey of student achievement at grades 4, 8, and 12 in various subject areas, such as mathematics, reading, writing, science, U.S. history, and civics.


NAEP is conducted by the National Center for Education Statistics (NCES) in the Institute of Education Sciences of the U.S. Department of Education. As such, NCES is responsible for designing and executing the assessment, including designing the assessment procedures and methodology, developing the assessment content, selecting the final assessment content, sampling schools and students, recruiting schools, administering the assessment, scoring student responses, determining the analysis procedures, analyzing the data, and reporting the results.1


The National Assessment Governing Board (henceforth referred to as the Governing Board or NAGB), appointed by the Secretary of Education but independent of the Department, is a bipartisan group whose members include governors, state legislators, local and state school officials, educators, business representatives, and members of the general public. The Governing Board sets policy for NAEP and is responsible for developing the frameworks and test specifications that serve as the blueprint for the assessments.


The NAEP assessments contain two broad types of items: “cognitive” assessment items, which measure what students know and can do in an academic subject, and “survey” or “non-cognitive” items, which gather demographic information as well as construct-related information, such as courses taken. The survey portion includes a collection of data from students, teachers, and school administrators. Since NAEP assessments are administered uniformly using the same sets of test forms across the nation, NAEP results serve as a common metric for all states and select urban districts. The assessment stays essentially the same from year to year, with only carefully documented changes. This permits NAEP to provide a clear picture of student academic progress over time.


NAEP consists of two assessment programs: the NAEP Long-term trend (LTT) assessment and the main NAEP assessment. The LTT assessments are given at the national level only and are administered to students at ages 9, 13, and 17 in a manner that is very different from that used for the main NAEP assessments. LTT reports mathematics and reading results that present trend data since the 1970s. NAEP provides results on subject-matter achievement, instructional experiences, and school environment for populations of students (e.g., all fourth-graders) and groups within those populations (e.g., female students, Hispanic students). NAEP does not provide scores for individual students or schools. The main NAEP assessments report current achievement levels and trends in student achievement at grades 4, 8, and 12 for the nation and, for certain assessments (e.g., reading and mathematics), states and select urban districts. The Trial Urban District Assessment (TUDA) is a special project developed to determine the feasibility of reporting district-level results for large urban districts. Currently, the following 26 districts participate in the TUDA program: Albuquerque, Atlanta, Austin, Baltimore City, Boston, Charlotte, Chicago, Clark County (NV), Cleveland, Dallas, Denver, Detroit, District of Columbia (DCPS), Duval County (FL), Fort Worth, Guilford County (NC), Hillsborough County (FL), Houston, Jefferson County (KY), Orange County (FL), Los Angeles, Miami-Dade, Milwaukee, New York City, Philadelphia, and San Diego.


The possible universe of student respondents for NAEP 2026 is estimated to be 12 million at grades 4, 8, and 12, attending the approximately 154,000 public and private elementary and secondary schools in 50 states and the District of Columbia, including Bureau of Indian Education and Department of Defense Education Activity (DoDEA) Schools, and fourth-grade and eighth-grade public schools in Puerto Rico.


This request is to conduct NAEP in 2026, specifically as follows:


  • Main NAEP operational assessments in reading and mathematics at grades 4 and 8 (first administration of the new frameworks) and in civics and U.S. history at grade 8; in Puerto Rico, grades 4 and 8 mathematics will be the only subject assessed and will include the new framework.

  • Pilot testing in grades 4, 8, and 12 (reading and mathematics); in Puerto Rico, grades 4 and 8 mathematics will be the only subject assessed.

  • Field Trial for grades 4, 8, and 12 in the U.S. mainland and grades 4 and 8 in Puerto Rico.


In 2024, NAEP transitioned to the eNAEP test delivery software, the platform on which the assessment is delivered to students. NAEP is also changing the operational assessment delivery model. While NAEP previously administered assessments on NAEP Surface Pros or Chromebooks utilizing numerous NAEP field staff, the program has transitioned to a model that is ultimately less expensive and more aligned with the administration model used in state assessments. Specifically, NAEP will administer the assessment using school devices and the internet. For schools that cannot meet the minimum specifications for use of school devices, NAEP will provide an alternate delivery model using less expensive, NAEP-provided Chromebooks. Additionally, to evaluate the impact of the transition to school devices, a sample of schools will be assigned to the NAEP Device Model by default, regardless of their ability to meet eligibility requirements for the School Device Model.


To transition successfully to this model, a staged approach is being undertaken so that trends can be measured across time. Namely, NAEP conducted a School-based Equipment study in 2024 (OMB# 1850-0803 v.347) and a Field Test in 2025 (OMB# 1850-0803 v.353) to provide more information about student and school interactions with the eNAEP system on school devices as compared with NAEP Chromebooks, and to prepare for the use of school devices in operational NAEP assessments moving forward.


In preparation for the 2026 NAEP administration, a Field Trial will be conducted with students in a live classroom environment in November 2025 by NAEP field administration staff. The Field Trial will fully replicate the NAEP operational administration testing conditions in a small number of schools. Since 2018, the NAEP program has utilized Field Trials prior to large-scale digitally based assessments to inform the upcoming administration.

Some of the assessment, questionnaire, and recruitment materials are translated into Spanish. Specifically, Spanish versions of the student assessments and questionnaires are used for English learner (EL) students who qualify for a bilingual accommodation. Historically, this is done for all operational grade 4 and 8 assessments as permitted by the framework. In addition, Puerto Rican Spanish versions are offered for all students in Puerto Rico. Accordingly, Spanish versions of communication materials for parents, teachers, and staff, as well as teacher and school questionnaires, are provided.


This is the first package for the 2026 assessment, with both 60-day and 30-day consecutive public comment period notices published in the Federal Register. The 60-day posting was completed in December 2024. Not all final materials for the 2026 assessment are available at this time, although some initial communication materials (Appendix D) and Assessment Management System (AMS) screens (Appendix I) are available in this 30-day public posting. Further, one or two Amendments to this Clearance package are planned to be submitted in the coming months to update materials, with the planned materials for each detailed on the following page.









NAEP 2026 Amendment Schedule Table

Amendment #1

(Summer 2025)

  • Part A & B: Possible revisions to reflect any updates to the administration 

  • Appendix C: 2026 Sampling Memo 

  • Appendix D: 2026 Communication Materials 

  • Appendix E: Feedback forms  

  • Appendix I: Final 2026 AMS System screenshots  

  • Appendices J1-J3: English 2026 SQ versions; Operational Spanish translated 2026 SQs 

Amendment #2

(August/September 2025)

  • Part A & B: Possible revisions to reflect any updates to the administration 

  • Appendix J-S: Spanish translated 2026 Pilot SQs



A.1.b. Legislative Authorization

In the current legislation that reauthorized NAEP, the National Assessment of Educational Progress Authorization Act (20 U.S.C. §9622), Congress mandates the collection of national education survey data through a national assessment program:


  1. ESTABLISHMENT- The Commissioner for Education Statistics shall, with the advice of the Assessment Board established under section 302, carry out, through grants, contracts, or cooperative agreements with one or more qualified organizations, or consortia thereof, a National Assessment of Educational Progress, which collectively refers to a national assessment, state assessments, and a long-term trend assessment in reading and mathematics.


  2. PURPOSE; STATE ASSESSMENTS-

(1) PURPOSE- The purpose of this section is to provide, in a timely manner, a fair and accurate measurement of student academic achievement and reporting of trends in such achievement in reading, mathematics, and other subject matter as specified in this section.


(5) REQUIREMENT - In carrying out any assessment authorized under this section, the Commissioner for Education Statistics, in a manner consistent with subsection (c)(3), shall-

(A) use widely accepted professional testing standards, objectively measure academic achievement, knowledge, and skills, and ensure that any academic assessment authorized under this section be tests that do not evaluate or assess personal or family beliefs and attitudes or publicly disclose personally identifiable information;

(B) only collect information that is directly related to the appraisal of academic achievement, and to the fair and accurate presentation of such information; and

(C) collect information on race, ethnicity, socioeconomic status, disability, limited English proficiency, and gender.


This allows for the fair and accurate presentation of achievement data and permits the collection of background, non-cognitive, or descriptive information that is related to academic achievement and aids in the fair reporting of results. The intent of the law is to provide representative sample data on student achievement for the nation, the states, and a variety of populations of students, and to monitor progress over time.


The statute and regulation mandating or authorizing the collection of this information can be found at https://www.law.cornell.edu/uscode/text/20/9622.


A.1.c. Overview of NAEP Assessments

This section provides a broad overview of main NAEP assessments, including information on the assessment frameworks, the cognitive and survey items, inclusion policies, the transition to digitally based assessments (DBA), and the assessment types.


A.1.c.1. NAEP Frameworks

NAEP assessments follow subject-area frameworks developed by the Governing Board and use the latest advances in assessment methodology. Frameworks capture a range of subject-specific content and thinking skills needed by students in order to deal with the complex issues they encounter inside and outside their classrooms. The NAEP frameworks are determined through a development process that ensures they are appropriate for current educational requirements. Because the assessments must remain flexible to mirror changes in educational objectives and curricula, the frameworks must be forward-looking and responsive, balancing current teaching practices with research findings.


NAEP frameworks can serve as guidelines for planning assessments or revising curricula. They also can provide information on skills appropriate to grades 4, 8, and 12 and can be models for measuring these skills in innovative ways. The subject-area frameworks evolve to match instructional practices.


Developing a framework generally involves the following steps:

  • widespread participation and reviews by educators and state education officials;

  • reviews by steering committees whose members represent policymakers, practitioners, and members of the general public;

  • involvement of subject supervisors from education agencies;

  • public hearings; and

  • reviews by scholars in the field, by NCES staff, and by a policy advisory panel.

The frameworks can be found at https://www.nagb.gov/naep-frameworks/frameworks-overview.html.


A.1.c.2. Cognitive Item Development

As part of the item development process, NCES calls on many constituents to guide the process and review the assessment. Item development follows a multi-year design plan, which is informed by the framework and establishes the design principles, priorities, schedules, and reporting goals for each subject. Based on this plan, the NAEP contractor creates a development plan outlining the item inventory and objectives for new items and then begins the development process by developing more items than are needed. This item pool is then subjected to:

  • internal contractor review with content experts, teachers, and experts on fairness, sensitivity, and bias;

  • playtesting, tryouts, or cognitive interviews with small groups of students for select items (particularly those that have new item types, formats, or challenging content), cleared under the NCES pretesting generic clearance agreement (OMB# 1850-0803); and

  • refinement of items and scoring rubrics under NCES guidance.


Next, a standing committee of content experts, state and local education agency representatives, teachers, and representatives of professional associations reviews the items. The standing committee (see Appendix A for the membership of the committees) considers the following:

  • the appropriateness of the items for the particular grade;

  • the representative nature of the item set;

  • the compatibility of the items with the framework and test specifications; and

  • the quality of items and scoring rubrics.


For state-level assessments, this may be followed by a state item review where further feedback is received. Items are then revised and submitted to NCES and the Governing Board Assessment Development Committee for approval prior to pilot testing.


The pilot test is used to finalize the testing instrument. Items may be dropped from consideration or moved forward to the operational assessment. The item set is once again reviewed by the standing committee, and NCES generally follows the same procedure described above. A final set of test items is then assembled for NCES and the Governing Board’s review and approval. After the operational assessment, items are once again examined. In rare cases where item statistics indicate problems, the item may be dropped from the assessment. The remaining items are secured for reuse in future assessments, with a subset of those items publicly released.


A.1.c.3. Survey Items

In addition to assessing subject-area achievement, NAEP collects information that serves to fulfill the reporting requirements of the federal legislation and to provide context for the reporting of student performance. The legislation requires that, whenever feasible, NAEP includes information on special groups (e.g., information reported by race, ethnicity, socioeconomic status, sex, disability, and limited English proficiency). As part of most NAEP assessments, three types of questionnaires are used to collect information: student, teacher, and school. An overview of the questionnaires is presented below.


Student Questionnaires

Each NAEP student assessment form includes non-cognitive items, also known as the student questionnaire. The questionnaires appear in separately timed blocks of items in the assessment forms. The items collect information on students’ demographic characteristics, classroom experiences, and educational support. Students’ responses provide data that give context to NAEP results and/or allow researchers to track factors associated with academic achievement. Students complete the questionnaires voluntarily (see section A.10 for confidentiality provisions). Student names are never reported with their responses or with the other information collected by NAEP.


Each student questionnaire includes three types of items, as follows:

  • General student information: Student responses to these items are used to collect information about factors such as race or ethnicity and parents’ education level. Answers on the questionnaires also provide information about factors associated with academic performance, including household composition, academic self-discipline, and the number of books in the home.

  • Other contextual/policy information: These items focus on students’ educational settings and experiences and collect information about students’ attendance (i.e., days absent), family discourse (i.e., talking about school at home), reading load (i.e., pages read per day), and technology use at school. There are also items that ask about students’ effort on the assessment and the difficulty of the assessment. Answers on the questionnaires provide information on how aspects of education and educational resources are distributed among different groups.

  • Subject-specific information: In most NAEP administrations, these items cover three categories of information: (1) time spent studying the subject; (2) instructional experiences in the subject; and (3) student factors (e.g., effort, confidence) related to the subject and the assessment.


Teacher Questionnaires

To provide supplemental information about the instructional experiences reported by students, teachers are asked to complete an online questionnaire using NAEPq about their instructional practices, classroom organization, teaching background and training, and the subject in which students are being assessed. NAEPq is an online platform used for the completion of online questionnaires. Teacher responses are then matched to student data. While completion of the questionnaire is voluntary, NAEP encourages teachers’ participation since their responses improve the accuracy and completeness of the NAEP assessment.


Teacher questionnaires are typically only given to teachers at grades 4 and 8; NAEP typically does not collect teacher information for grade 12. By grade 12, there is such variation in student course-taking experiences that students cannot be matched to individual teachers for each tested subject. For example, a student may not be taking a mathematics class in grade 12, so they cannot be matched to a teacher. Conversely, a student could be taking two mathematics classes at grade 12 and have multiple teachers related to mathematics. Only an economics teacher questionnaire has been developed and administered at grade 12. However, these data were not released (with either the 2006 or the 2012 results) due to a student-teacher match rate below statistical standards.2


Teacher questionnaires are organized into different parts. The first part of the teacher questionnaire covers background and general training and includes items concerning years of teaching experience, certifications, degrees, major and minor fields of study, coursework in education, coursework in specific subject areas, the amount of in-service training, the extent of control over instructional issues, and the availability of resources for the classroom. Subsequent parts of the teacher questionnaire tend to cover training in the subject area, classroom instructional information, and teacher exposure to issues related to the subject and the teaching of the subject. They also ask about pre- and in-service training, the ability level of the students in the class, the length of homework assignments, the use of particular resources, and how students are assigned to particular classes.


School Questionnaires

The school questionnaire provides supplemental information about school factors that may influence students’ achievement. It is given to the principal or another official of each school that participates in the NAEP assessment. While schools’ completion of the questionnaire is voluntary, NAEP encourages schools’ participation since it makes the NAEP assessment more accurate and complete. The school questionnaire is organized into different parts. The first part tends to cover characteristics of the school, including the length of the school day and year, school enrollment, absenteeism, dropout rates, and the size and composition of the teaching staff. Subsequent parts of the school questionnaire tend to cover tracking policies, curricula, testing practices, special priorities, and schoolwide programs and problems. The questionnaire also collects information about the availability of resources, policies for parental involvement, special services, and community services.


The school questionnaire is accessed online through NAEPq. A supplemental charter school questionnaire, designed to collect information on charter school policies and characteristics, is provided to administrators of charter schools that are sampled to participate in NAEP. The supplement covers organization and school governance, parental involvement, and curriculum and offerings.


Development of Survey Items

The Background Information Framework and the Governing Board’s Policy on the Collection and Reporting of Background Data (located at https://www.nagb.gov/content/nagb/assets/documents/policies/collection-report-backg-data.pdf) guide the collection and reporting of non-cognitive assessment information. In addition, subject-area frameworks provide guidance on subject-specific, non-cognitive assessment questions to be included in the questionnaires. The development process is very similar to that for the cognitive items, including review of the existing item pool; development of more items than are intended for use; review by experts (including the standing committee); and cognitive interviews with students, teachers, and schools. When developing the questionnaires, NAEP uses a pretesting process so that the final questions are minimally intrusive or sensitive, are grounded in educational research, and yield answers that provide information relevant to the subject being assessed. All non-cognitive items undergo one-on-one cognitive interviews, which are useful for identifying questionnaire and procedural problems before larger-scale pilot testing is undertaken.


In the web-based NAEP Data Explorer3 (located at https://www.nationsreportcard.gov/ndecore/landing), the results of the questionnaires are sorted into eight broad categories: Major Reporting Groups, Student Factors, Factors Beyond School, Instructional Content and Practice, Teacher Factors, School Factors, Community Factors, and Government Factors.


To minimize burden on the respondents and maximize the constructs addressed via the questionnaires, NAEP may spiral items across respondents and/or rotate some non-required items across assessment administrations. The possible “library” of items for the NAEP 2026 questionnaires, for each subject and respondent, is included in Appendix F. Approved versions of the Main NAEP questionnaires will be provided within Amendment #1 in Appendices J1, J2, J3, and J-S (Spanish Operational SQ items). The translated Pilot SQs will be in Amendment #2 in J-S.


A.1.c.4. Participation in NAEP

It is important for NAEP to assess as many students selected to participate as possible. Assessing representative samples of students, including students with disabilities (SDs) and English learners (ELs), helps to ensure that NAEP results accurately reflect the educational performance of all students in the target population and can continue to serve as a meaningful measure of U.S. students’ academic achievement over time.


The Governing Board, which sets policy for NAEP, has been exploring ways to ensure that NAEP continues to appropriately include as many students as possible and to do so in a consistent manner for all jurisdictions assessed and reported on. In March 2010 (updated August 2014), the Governing Board adopted a policy, NAEP Testing and Reporting on Students with Disabilities and English Language Learners (located at https://www.nagb.gov/content/nagb/assets/documents/policies/naep_testandreport_studentswithdisabilities.pdf). This policy was the culmination of work with experts in testing and curriculum and those who work with exceptional children and students learning to speak English. The policy aims to:

  • maximize participation of sampled students in NAEP;

  • reduce variation in exclusion rates for SD and EL students across states and districts;

  • develop uniform national rules for including students in NAEP; and

  • ensure that NAEP is fully representative of SD and EL students.


The policy defines specific inclusion goals for NAEP samples. At the national, state, and district levels, the goal is to include 95 percent of all students selected for the NAEP samples, and 85 percent of those in the NAEP sample who are identified as SD or EL.


Students are selected to participate in NAEP based on a sampling procedure4 designed to yield a sample of students that is representative of students in all schools nationwide and in public schools within each state. First, schools are selected, and then students are sampled from within those schools without regard to disability or English language proficiency. Once students are selected, those previously identified as SD or EL may be offered accommodations or excluded.
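
To make the two-stage structure concrete, the following is a minimal, hypothetical sketch of sampling schools first and then students within sampled schools. It is illustrative only: the actual NAEP design (summarized in Part B and the sampling memo) uses stratification and selects schools with probability proportional to size, whereas this sketch uses simple random selection, and all school and student identifiers are invented.

import random

# Hypothetical sketch of a two-stage sample: schools first, then students
# within each sampled school. The real NAEP design is stratified and selects
# schools with probability proportional to size; simple random selection is
# used here purely to illustrate the two stages.

def two_stage_sample(school_rosters, n_schools, n_students_per_school, seed=2026):
    """school_rosters: dict mapping school ID -> list of eligible student IDs."""
    rng = random.Random(seed)

    # Stage 1: sample schools.
    sampled_schools = rng.sample(sorted(school_rosters), k=n_schools)

    # Stage 2: sample students within each sampled school, without regard
    # to disability or English language proficiency status.
    sample = {}
    for school in sampled_schools:
        roster = school_rosters[school]
        k = min(n_students_per_school, len(roster))
        sample[school] = rng.sample(roster, k=k)
    return sample

# Example with made-up rosters.
rosters = {f"school_{i:03d}": [f"s{i:03d}_{j:02d}" for j in range(60)] for i in range(50)}
print(two_stage_sample(rosters, n_schools=5, n_students_per_school=25))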


Accommodations in the testing environment or administration procedures are provided for SD and EL students. Some examples of accommodations permitted by NAEP are extended time, magnification, a hearing-impaired version of the test, and high contrast for visually impaired students. Some examples of testing accommodations not allowed are administering the reading assessment in a language other than English and reading the passages in the reading assessment aloud to the student.


States and jurisdictions vary in their proportions of students with disabilities and in their use of accommodations. Despite the increasing identification of SD and EL students in some states, in particular EL students at grade 4, NAEP inclusion rates have generally remained steady or increased since 2003. This reflects efforts on the part of states and jurisdictions to include all students who can meaningfully participate in the NAEP assessments. The NAEP inclusion policy is an effort to ensure that this trend continues.


A.1.c.5. Digitally Based Assessments (DBAs)

Our nation’s schools continue to make digital tools an integral component of the learning environment, reflecting the knowledge and skills needed for future post-secondary success. NAEP reinforces this by continuing to evolve with the changing educational landscape through the use of DBAs.


In 2026, the NAEP assessment will be administered on school devices using the NAEP Assessment Application. During the preassessment phase, the application will be installed and confirmed on school devices. Sampled schools that are not eligible and qualified for school devices will be provided NAEP Chromebooks, which will utilize the NAEP Provided Network with the NAEP Assessment Application installed, for students to complete the assessment. Additionally, in order to evaluate the impact of the transition to school devices, a sample of schools will be assigned to the NAEP Device Model by default, regardless of their ability to meet eligibility requirements for the School Device Model.


Leveraging Technologies

NAEP DBAs use testing methods and item types that reflect the use of technology in education. Examples of such item types include the following:


  • Multimedia elements, such as video and audio clips, are used in NAEP assessments. For example, the following elements are included:

    • Immersive reading experiences that mimic complex websites students would experience in school/general research. 

    • Imagery that builds context, such as diagrams that track and convey progress visually.

    • Audio - All scenario-based tasks (SBTs) use real voice actors to make the characters more authentic and to bring in multiple modalities for increased engagement.

  • Interactive items and tools: Some questions may allow the use of embedded technological features to form a response. For example, students may use “drag and drop” functionality to place labels on a graphic or may tap an area or zone on the screen to make a selection. Other questions may involve the use of digital tools. In the mathematics DBA, an online calculator is available for students to use when responding to some items. NAEP interactive item components, such as the ruler, number line, bar graph, and various coordinate-grid-based line and point tools, expand measurement capabilities with tools comparable to those students use when learning fundamental math concepts.

  • Immersive SBTs: SBTs use multimedia features and tools to engage students in rich, authentic problem-solving contexts. NAEP’s first scenario-based tasks were administered in 2009, when students at grades 4, 8, and 12 were assessed with interactive computer tasks in science. The science tasks asked students to solve scientific problems and perform experiments, often by simulation. They provide students more opportunities than a paper-based assessment (PBA) to demonstrate skills involved in doing science without many of the logistical constraints associated with a natural or laboratory setting. The science tasks administered in 2019 can be explored at https://www.nationsreportcard.gov/science/sample-questions/. NAEP also administered scenario-based tasks in the 2018 technology and engineering literacy (TEL) assessment, where students were challenged to work through computer simulations of real-world situations they might encounter in their everyday lives. A sample TEL task can be viewed at https://www.nationsreportcard.gov/tel/tasks/. NAEP is continuing to expand the use of scenario-based tasks to measure knowledge and skills in other subject areas such as mathematics and reading. SBTs have been part of the operational reading assessment since 2019, and mathematics SBTs will be administered for the first time in NAEP in 2026.


In addition to new item types, the transition to DBA makes it possible for NAEP to employ an adaptive testing design, in which assessment content is targeted to students’ ability based on performance during the test administration. Thus, students see items that are tailored to their ability levels, and they may be more likely to be able to engage in the assessment and demonstrate what they know and can do. The goal of implementing adaptive testing is to achieve better measurement of student knowledge and skills across the wide range of student performance levels on which NAEP reports.


The type of adaptive testing being considered for NAEP is a multi-stage test (MST) design that uses two stages. Students take sections of cognitive items, just as in past NAEP administrations. Based on their performance on the first section of items, students receive one or more subsequent sections of items that are targeted to their ability level. For example, students who do not perform well on the first section of items receive a second section composed of somewhat easier items. The implementation of this two-stage MST design for NAEP has been informed by previous research on the benefits, applicability, and feasibility of adaptive testing for NAEP.
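
As a minimal illustration of this two-stage routing logic, the sketch below assigns a hypothetical second-stage section based on a student’s proportion correct on the first-stage section. The threshold, section labels, and scoring rule are illustrative assumptions, not the operational routing rules, which are set through psychometric analysis.

from dataclasses import dataclass

# Hypothetical sketch of two-stage multi-stage test (MST) routing.
# The routing threshold and section labels are illustrative only.

@dataclass
class Section:
    name: str
    difficulty: str  # e.g., "easier" or "harder"

ROUTING_THRESHOLD = 0.5  # hypothetical proportion-correct cutoff

def route_second_stage(first_stage_scores):
    """Assign the second-stage section based on first-stage performance.

    first_stage_scores: list of 0/1 item scores from the routing section.
    Students who do not perform well on the first section receive a second
    section composed of somewhat easier items.
    """
    proportion_correct = sum(first_stage_scores) / len(first_stage_scores)
    if proportion_correct < ROUTING_THRESHOLD:
        return Section("Stage 2 - Form A", "easier")
    return Section("Stage 2 - Form B", "harder")

# Example: a student answering 4 of 10 routing items correctly is routed
# to the easier second-stage form.
print(route_second_stage([1, 0, 1, 0, 0, 1, 0, 0, 1, 0]))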


DBA technology also allows NAEP to capture information about what students do while attempting to answer questions. While PBA only yields the final responses in the test form, DBA captures actions students perform while interacting with the assessment tasks, as well as the time at which students take these actions. These student interactions with the assessment interface are not used to assess students’ knowledge and skills, but they provide valuable context on item performance, time on task, and tool usage. For example, more proficient students may make greater use of digital tools, such as the calculator in mathematics or the spell checker in writing assessments, than less proficient students. As such, NAEP will potentially uncover more information about which actions students use when they successfully (or unsuccessfully) answer specific questions on the assessment.


NAEP will capture the following actions in the DBA, although not all actions will be captured for all assessments: 

  • Student navigation (e.g., clicking back/next; clicking on the progress navigator; clicking to leave a section); 

  • Student use of tools (e.g., zooming; using text-to-speech; opening and interactions with the scratchwork tool; opening and interactions with the calculator; using the equation editor; clicking the change language button; selecting the theme; opening the Help tool); 

  • Student responses (e.g., clicking a choice; eliminating a choice; clearing an answer; keystroke log of student typed text); 

  • Timing data (e.g., the time between events, which can be used to deconstruct student time spent on certain tasks);

  • Other student events (e.g., vertical and horizontal scrolling; media interaction such as playing an audio stimulus); and 

  • Tutorial events (e.g., student interactions with the tutorial practice item, or not interacting with the tutorial when prompted). 
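
For illustration, the sketch below shows one simple way such interaction events might be recorded as timestamped records; the field names and values are hypothetical and do not represent the actual eNAEP event schema.

import json
import time

# Hypothetical sketch of a process-data event log. Field names and values
# are illustrative only; they are not the actual eNAEP event format.

def log_event(event_log, student_id, item_id, event_type, detail=None):
    """Append one timestamped interaction event (navigation, tool use,
    response action, etc.) to the in-memory event log."""
    event_log.append({
        "student": student_id,
        "item": item_id,
        "type": event_type,        # e.g., "navigation", "tool", "response"
        "detail": detail,          # e.g., "next", "open_calculator", "select_choice_B"
        "timestamp": time.time(),  # used later to derive time between events
    })

events = []
log_event(events, "s001", "item_07", "tool", "open_calculator")
log_event(events, "s001", "item_07", "response", "select_choice_B")
log_event(events, "s001", "item_07", "navigation", "next")

# Time spent on the item can be reconstructed from the gaps between timestamps.
print(json.dumps(events, indent=2))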


Development of Digitally Based Assessments (DBAs)

NAEP’s item and system development processes include several types of activities that help to ensure that our DBAs measure the subject-area knowledge and skills outlined in the NAEP frameworks and not students’ ability to use the device or the particular interface elements and digital tools included in the DBA.


During item development, new digitally based item types and tasks are studied and pretested with diverse groups of students. The purpose of these pretesting activities is to determine whether construct-irrelevant features, such as confusing wording, unfamiliar interactivity or contexts, or other factors, prevent students from demonstrating the targeted knowledge, skills, and abilities. Such activities help identify usability, design, and validity issues so that items and tasks may be further revised and refined prior to administration.


Development of the assessment delivery system, including the interface that students interact with when taking NAEP DBA, is informed by best practices in accessibility and user experience design. Decisions about the availability, appearance, and functionality of system features and tools are also made based on the results of usability testing with students.


To help ensure that students know how to use the assessment system and tools, each administration of a NAEP DBA begins with a brief interactive tutorial that teaches students how to use the system features to take the assessment. Students actively engage with the tutorial, as they are asked to use specific tools and features. Help screens are also built into the system, and students can access them at any time while taking the assessment. The 2024 tutorials are available at https://enaep.cotw-ng.naep.ed.gov/totw/2024/English.html.


Accommodations and Universal Design Elements with DBA

Technologies are improving NAEP’s ability to provide appropriate accommodations that allow greater participation and provide universal access for all students, including those with disabilities and English learners. Universal Design Elements allow for zooming and read-aloud/text-to-speech for test items in English; these features are available for all assessment content except the reading cognitive content. In addition, students taking the assessment can choose high-contrast color theming, utilize a scratchwork/highlighter tool, or eliminate answer choices.


In addition to these Universal Design Elements, and as described in section A.1.c.4, NAEP also continues to provide accommodations to students with Individualized Education Programs (IEPs), Section 504 plans, and English learning plans. Some accommodations are available in the testing system (such as additional time, a magnification tool, or a Spanish/English version of the test), while others are provided by the test administrator or the school (such as breaks during testing, sign language interpretation of the test, or a bilingual dictionary). Section B.2.b. provides more information on the classification of students and the assignment of accommodations.


A.1.c.6. Assessment Types

NAEP uses three types of assessment activities, which may simultaneously be in the field during any given data collection effort. Each is described in more detail below.


Operational Assessments

Operational NAEP administrations, unlike pilot administrations, collect data to publicly report on the educational achievement of students as required by federal law. The NAEP results are reported in The Nation’s Report Card (http://nationsreportcard.gov/), which is used by policymakers, state and local educators, principals, teachers, and parents to inform educational policy decisions.


Pilot Assessments

Pilot testing of cognitive and non-cognitive items is carried out in all subject areas. The purpose of pilot testing is to obtain information regarding clarity, difficulty levels, timing, and feasibility of items and conditions. In addition to ensuring that items measure what is intended, the data collected from pilot tests serve as the basis for selecting the most effective items and data collection procedures for the subsequent operational assessments. Pilot testing is a cost-effective means for revising and selecting items prior to an operational data collection because the items are administered to a small, nationally representative sample of students and data are gathered about performance that crosses the spectrum of student achievement. Items that do not work well can be dropped or modified before the operational administration.


Prior to pilot testing, many new items are pre-tested with small groups of sample participants (cleared under the NCES pretesting generic clearance agreement; OMB# 1850-0803). All non-cognitive items undergo one-on-one cognitive interviews, which are useful for identifying questionnaire and procedural problems before larger-scale pilot testing is undertaken. Select cognitive items also undergo pre-pilot testing, such as item tryouts or cognitive interviews, in order to test out new item types or formats, or challenging content. In addition, usability testing is conducted on new technologies and technology-based platforms and instruments.


Special Studies

Special studies are an opportunity for NAEP to investigate specific aspects of the assessment without impacting the reporting of NAEP results. Previous special studies have focused on linking NAEP to other assessments or linking across NAEP same-subject frameworks, investigating the expansion of the item pool, evaluating specific accommodations, investigating administration modes, and providing targeted data on specific student populations.


In addition to the overarching goal of NAEP providing data about student achievement at the national, state, and district levels, NAEP provides specially targeted data on an as-needed basis. At times, this may only mean that a special analysis of the existing data is necessary. At other times, this may include the addition of a brief, add-on questionnaire targeted at specified groups. For example, in the past, additional student, teacher, and school questionnaires were developed and administered as part of the National Indian Education Study (NIES) that NCES conducted on behalf of the Office of Indian Education. Through such targeted questionnaires, important information about the achievement of a specific group is gathered at minimal additional burden. These types of special studies are intentionally kept to a minimum and are designed to avoid jeopardizing the main purpose of the program.


Field Trial

The purpose of a field trial is to perform a dress rehearsal prior to an operational administration. The field trial is conducted with students in a live classroom environment at a small number of schools, allowing the system to be tested in the way it will be used in the national study to help identify platform system or operational issues prior to an administration.


A.1.d. Overview of 2026 NAEP Assessments

The Governing Board determines NAEP policy and the assessment schedule,5 and future Governing Board decisions may result in changes to the plans represented here. Any changes will be presented in subsequent clearance packages or revisions to the current package.


The 2026 data collection will consist of the following:

  • Main NAEP operational assessments in reading and mathematics at grades 4 and 8 (new frameworks) and in civics and U.S. history at grade 8; in Puerto Rico, grades 4 and 8 mathematics will be the only subject assessed and will include the new framework.

  • Pilot testing in grades 4, 8, and 12 (reading and mathematics); in Puerto Rico, grades 4 and 8 mathematics will be the only subject assessed.

  • Field trial for grades 4, 8, and 12 in the U.S. mainland and grades 4 and 8 in Puerto Rico.


The 2026 operational assessment will include a bridge study comparing NAEP-provided devices and school-provided devices. The NAEP program will transition operationally to assessments administered on school-provided devices (e.g., desktops, laptops, tablets with keyboards). Schools that do not meet NAEP’s minimum specifications will be assessed on NAEP-provided devices. To evaluate whether scores from the two different device types are comparable, a bridge study will be conducted in 2026.


However, the 2026 bridge study is not as simple as comparing the two different types of schools (i.e., schools that qualify to assess on school devices and schools that do not qualify and therefore assess on NAEP-provided devices), given that they might have different characteristics. Therefore, some schools that qualify to be assessed on school devices will instead be assessed on NAEP-provided devices. This will establish a common population for linking.


A.2. How, by Whom, and for What Purpose the Data Will Be Used

Results will be reported on the 2026 operational assessments in mathematics, reading, U.S. history, and civics. NAEP will use the results from the 2026 pilot testing to inform future assessments and procedures. Results from any potential special studies may be published as research reports.


The NAEP operational results are reported in The Nation’s Report Card, which is used by policymakers, state and local educators, principals, teachers, and parents to help inform educational policy decisions. The main NAEP report cards provide national results, trends for different student groups, results on scale scores and achievement levels, and sample items. In reports with state or urban district results, there are sections that provide overview information on the performance of these jurisdictions. If NCES elects to release sample items, percentage correct statistics on those items will be provided in the report. NAEP does not provide scores for individual students or schools.


Results from each NAEP assessment are provided online in an interactive website (http://nationsreportcard.gov/) and in one-page summary reports for each participating state or urban district. Additional data tools are available online.


In addition to contributing to the reporting tools mentioned above, data from the questionnaires are used as part of the marginal estimation procedures that produce the student achievement results. Questionnaire data are also used to perform quality control checks on school-reported data and in special reports, such as the Black–White Achievement Gap report (http://nces.ed.gov/nationsreportcard/studies/gaps/) and the Classroom Instruction Report in reading, mathematics, and science based on the 2015 Student Questionnaire Data (https://www.nationsreportcard.gov/sq_classroom/#mathematics).


Lastly, there are numerous opportunities for secondary data analysis because of NAEP’s large scale, the regularity of its administrations, and its stringent quality control processes for data collection and analysis. NAEP data are used by researchers and educators who have diverse interests and varying levels of analytical experience.


A.3. Improved Use of Technology

NAEP has continually moved to administration methods that utilize technology, as described below.


Online Teacher and School Questionnaires

The NAEP program provides the teacher and school questionnaires primarily online through a tool known as NAEPq.


Preassessment Activities

Participating NAEP schools have designated staff members who serve as support for the NAEP assessment. Preassessment and assessment activities include functions such as finalizing student samples, verifying student demographics, reviewing accommodations, installing the NAEP application on student devices, and planning logistics for the assessment. NAEP uses an electronic Assessment Management System (AMS) through which school staff can easily provide necessary administration information, including logistical information, updates of student and teacher information, and the completion of inclusion and accommodation information.6


Digitally Based Assessments (DBA)

As described in section A.1.c.5, NAEP has transitioned to DBA. DBA allows NAEP to provide assessments consistent with other large-scale assessments. In addition, DBA allows NAEP to mirror today’s classrooms, improve measurement of knowledge and skills, and collect new types of data that enhance our understanding of what students can do and know.


NAEP Assessment Application

The NAEP Assessment Application was first used in 2024 for the School-based Equipment Proof of Concept to allow students to access the NAEP assessment using school devices. The application is installed by school staff following the installation and validation instructions found on the eNAEP Download Center, updated with each NAEP administration cycle (see Appendix D). The application is available as a desktop shortcut on Windows devices or as a Kiosk app on the Chromebook login screen, which allows the student to launch the NAEP assessment. It conducts various checks on the device to ensure that it meets the necessary specifications so that the user can consistently interact with the NAEP assessment. The NAEP Assessment Application enables real-time assessment data transfer to NAEP Cloud servers.


Automated Scoring

NAEP administers a combination of selected-response items and open-ended or constructed-response items. In recent years, NAEP has introduced algorithmic scoring for selected-response items that have a definable and finite number of responses, each of which can be unambiguously coded to map to a level of the scoring rubric. Algorithmic scoring is fully automated, yielding process and cost efficiencies. The volume of algorithmically scored items continues to increase as more items are developed for DBA. NAEP currently uses human scorers to score the constructed-response items, using detailed scoring rubrics and proven scoring methodologies. With the increased use of technologies, the methodology and reliability of automated scoring (i.e., the scoring of constructed-response items using computer software) have advanced. While NAEP has not employed automated scoring methodologies operationally to date, these are being investigated for possible use in 2026. In particular, NCES recently held a competition to examine a variety of automated scoring engines and methods for consideration in NAEP (see: https://nces.ed.gov/whatsnew/press_releases/1_21_2022.asp). In 2024, an Automated Scoring dress rehearsal study was conducted. The purpose of the study was to do the following: 

  • Determine if the selected 2024 NAEP grade 4 and grade 8 open-ended reading items can be scored using automated scoring at scale within the scoring timelines for reporting results. 

  • Evaluate the differences between the two modes of scoring: human and predictive. 

  • Identify process and procedural changes needed to inputs and outputs throughout the assessment administration to operationalize automated scoring.

A final report will be completed during the summer of 2025. The results will be used to determine the use of automated scoring options for future NAEP administrations.
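
As a point of clarification, the rule-based “algorithmic” scoring described at the start of this subsection amounts to a lookup from a student’s final response pattern to a rubric level. The sketch below illustrates that idea with an invented item and rubric; it is not the actual NAEP scoring system, nor does it represent the machine-learning-based automated scoring of constructed responses under investigation.

# Hypothetical sketch of rule-based ("algorithmic") scoring for a
# selected-response item with a finite set of possible responses, each
# mapped unambiguously to a scoring-rubric level. The item, response
# patterns, and rubric levels below are invented for illustration.

SCORING_RULES = {
    "item_42": {
        ("B",): "Correct",        # single correct choice
        ("A", "C"): "Partial",    # a specific multi-select pattern
    }
}

def score_selected_response(item_id, selected_choices):
    """Map the student's final selection(s) to a rubric level.

    Any response pattern not listed in the rules is scored 'Incorrect';
    a missing response is scored 'Omitted'.
    """
    if not selected_choices:
        return "Omitted"
    key = tuple(sorted(selected_choices))
    return SCORING_RULES.get(item_id, {}).get(key, "Incorrect")

print(score_selected_response("item_42", ["B"]))       # Correct
print(score_selected_response("item_42", ["C", "A"]))  # Partial
print(score_selected_response("item_42", []))          # Omitted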


A.4. Efforts to Identify Duplication

The proposed assessments, including the questionnaires, do not exist in the same format or combination in the U.S. Department of Education or elsewhere. The non-cognitive data gathered by NAEP comprise the only comprehensive cross-sectional survey performed regularly on a large-scale basis that can be related to extensive achievement data in the United States. No other federally funded studies have been designed to collect data for the purpose of regularly assessing trends in educational progress and comparing these trends across states. None of the major non-federal studies of educational achievement were designed to measure changes in national achievement. In short, no existing data source in the public or private sector duplicates NAEP.


While the survey items in NAEP are unique, the items are not developed in a vacuum. Their development is informed by similar items in other assessments and survey programs. In addition, in future rounds of development, NCES will continue to align the NAEP survey questions with other surveys (particularly, but not limited to, those from other NCES and federal survey programs).


Historically, NAEP has served a critical national “audit” function, offering an extremely helpful reference point in the interpretation of score trends on “high-stakes” tests used for school accountability. The main NAEP scales have served this function well even though high-stakes state assessments were not always closely aligned with the corresponding NAEP assessments.


NAEP has provided the best available information about the academic achievement of the nation’s students in relation to consensus assessment frameworks, maintaining long-term trend lines for decades. In addition to reporting at the national level, NAEP has offered achievement comparisons among participating states for more than three decades, and, since 2003, all states have participated in the NAEP mathematics and reading assessments at the fourth- and eighth-grades. More recently, NAEP has also reported achievement for selected large urban school districts. Also, characterizing the achievement of fourth-, eighth-, and twelfth-grade students in a variety of subject areas, NAEP has served to document the often-substantial disparities in achievement across demographic groups, tracking both achievement and achievement gaps over time. NAEP has furthered deliberation as to the scope and meaning of achievement in mathematics, reading, and other subject areas. NAEP assessments are aligned to ambitious assessment frameworks developed by a thoughtful process to reflect the best thinking of educators and content specialists. These frameworks have served as models for the states and other organizations. Finally, NAEP has served as a laboratory for innovation, developing and demonstrating new item formats, as well as statistical methods and models now emulated by large-scale assessments worldwide.


NAEP has functioned well as a suite of complex survey modules conducted as assessments of student achievement in fixed testing windows. The complexity of NAEP evolved by necessity to address its legal and policy reporting requirements and the complex sampling of items and students needed to make reliable and valid inferences at the subgroup, district, state, and national level for stakeholders, ranging from policymakers to secondary analysts, and to do so without creating an undue burden on students and schools.


A.5. Burden on Small Businesses or Other Small Entities

The school samples for NAEP contain small-, medium-, and large-size schools, including private schools. Schools are included in the sample proportional to their representation in the population, or as necessary to meet reporting goals. It is necessary to include all types of schools including small and private schools so that students attending all schools are represented in the data collection and in the reports. The trained field staff work closely with all sampled schools to ensure that the preassessment and administration activities can be completed with minimal disruption.


A.6. Consequences of Collecting Information Less Frequently

Under the National Assessment of Educational Progress Authorization Act, Congress has mandated the ongoing collection of NAEP data. Failure to collect the 2026 assessment data on the current schedule would affect the quality and schedule of the NAEP assessments and would result in assessments that would not fulfill the mandate of the legislation.


A.7. Consistency with 5 CFR 1320.5

No special circumstances are involved. This data collection observes all requirements of 5 CFR 1320.5.


In March 2024, the Office of Management and Budget (OMB) announced revisions to Statistical Policy Directive No. 15: Standards for Maintaining, Collecting, and Presenting Federal Data on Race and Ethnicity (SPD 15) and published the revised SPD15 standard in the Federal Register (89 FR 22182). The 2026 NAEP data collection described in this package continues to use race and ethnicity categories as described in the 1997 SPD 15 standards in some contexts but has moved toward compliance with the 2024 SPD 15 standards in some parts of the data collection. The plans for NAEP’s implementation of the revised SPD 15 standards for the 2026 main NAEP assessment are described below. Amendments #1 and #2 will include draft and final data collection instruments to reflect these changes.


NAEP collects race and ethnicity data in two ways, as part of individual questionnaires and as part of the roster uploads from schools (as seen in Appendix I, “Provide Student Information”). As discussed in A.1.a, not all final materials for the 2026 assessments are available at this time, and all instruments presented in this first package will be revised in future 30-day amendments. Although we are unable to present the individual instruments at this time, below is a description of the plan for race and ethnicity items to be used in NAEP 2026.


Individual questionnaires in main NAEP include the student, teacher, and school administrator questionnaires.


For 2026 Grade 4, 8, and 12 students, NAEP will administer the revised SPD 15 Figure 3 simplified version (see updated SPD 15 guidelines), to comply with the updated OMB SPD 15 standards. Figure 3 allows for more streamlined response options for students taking the assessment in comparison to Figures 1 and 2 and may reduce burden and increase response rates, particularly for younger students. Future NAEP administrations may consider the use of Figure 2 for student questionnaires as more information is learned from potential special studies with student populations.


For 2026 Grade 4 and 8 teacher respondents, NAEP will administer the revised 2024 SPD 15 Figure 1 version of the race and ethnicity item. Like many other data collections carried out online (and with the possibility that respondents will use mobile devices for response in the future), this item is structured as a rollout: respondents first see an item identical to Figure 3 as shown in the revised standards, followed by breakout items based on the selected minimal categories. This approach may be modified based on the data from the study collected as part of the 2025 NAEP Field Test (OMB# 1850-0803 v.361). If any changes are to be made, they will be reflected in Amendment #1.
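To make the rollout structure concrete, the sketch below shows how breakout (detailed) items could be selected based on the minimal categories a respondent marks on the first screen. The category lists and function names are illustrative assumptions drawn from the revised SPD 15 figures; this is not the eNAEP delivery system’s actual logic.

```python
# Illustrative sketch of the two-stage "rollout" race/ethnicity item:
# a minimal-category screen (SPD 15 Figure 3 style) followed by breakout
# items only for the categories selected. Category lists are abbreviated,
# and the code does not represent the actual eNAEP implementation.

MINIMAL_CATEGORIES = [
    "American Indian or Alaska Native",
    "Asian",
    "Black or African American",
    "Hispanic or Latino",
    "Middle Eastern or North African",
    "Native Hawaiian or Pacific Islander",
    "White",
]

# Hypothetical detailed (breakout) options keyed by minimal category;
# only two categories are spelled out here for brevity.
DETAILED_OPTIONS = {
    "Asian": ["Chinese", "Asian Indian", "Filipino", "Vietnamese",
              "Korean", "Japanese", "Another group"],
    "Hispanic or Latino": ["Mexican", "Puerto Rican", "Cuban",
                           "Dominican", "Salvadoran", "Another group"],
}

def breakout_items(selected_minimal):
    """Return the follow-up items to administer for the selected categories."""
    return [
        {"category": category, "options": DETAILED_OPTIONS[category]}
        for category in selected_minimal
        if category in DETAILED_OPTIONS
    ]

# A respondent who marks "Asian" and "White" would see one breakout item.
print(breakout_items(["Asian", "White"]))
```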


Of note, school administrators participating in NAEP are no longer asked to respond to any items about race and ethnicity.


The roster data that NAEP receives from schools for the purpose of student sampling is proxy data, reported by institutions. Because of this, NAEP and NCES are reliant on the ability of those third-party recordkeepers to report their data in compliance with SPD 15. NCES and the Department of Education are currently working with the National Assessment Governing Board (NAGB) and the National Forum of Education Statistics (the Forum), as well as other stakeholders, to establish timelines for compliance with the revised standard for all school systems across the country. The details of these timelines will be included in the ED Action Plan on Race and Ethnicity when it is submitted to OMB on or by September 29, 2025, but ED does not anticipate that school systems will be ready to report data to NAEP that is consistent with 2024 SPD 15 by the time of data collection for NAEP 2026.


A.8. Consultations Outside the Agency

The NAEP assessments are conducted by a Coalition of organizations, as well as organizations that support the Coalition, under contract with the U.S. Department of Education.


The current Coalition, and the organizations that support it, includes the following:


  • Management Strategies is responsible for managing the integration of multiple NAEP project schedules and providing data on timeliness, deliverables, and cost performance.

  • Educational Testing Service (ETS) is responsible for coordinating Coalition contractor activities, developing the assessment instruments, analyzing the data, preparing the reports, and developing the assessment platform.

  • Sanametrix is responsible for NAEP web technology, development, operations, and maintenance including the Integrated Management System (IMS).

  • Pearson is responsible for scanning and scoring students’ responses.

  • Westat is responsible for printing and distributing the assessment materials, managing field operations and data collection, and coordinating with states and districts. Westat’s responsibilities include selecting the school and student samples and weighting the samples. Westat also provides ongoing support and training for full-time NAEP State and TUDA Coordinators in states across the nation through its NAEP Support and Service Center.


In addition to the NAEP Coalition, other organizations support the NAEP program, all of which are under contract with the U.S. Department of Education. The current list of organizations includes the following:7

  • CRP, Incorporated is responsible for providing logistical and programmatic support.

  • Manhattan Strategies Group is responsible for supporting the planning, development, and dissemination of NAEP publications and outreach activities, and for providing technical support.

  • State Education Agencies (SEAs) establish a liaison between the state education agency and NAEP, serve as the state’s representative to review NAEP assessment items and processes, coordinate the NAEP administration in the state, analyze and report NAEP data, and coordinate the use of NAEP results for policy and program planning.

  • Tribal Tech is responsible for providing support for the National Indian Education Study.


In addition to the contractors responsible for the development and administration of the NAEP assessments, the program involves many consultants and is also reviewed by specialists serving on various technical review panels. These consultants and special reviewers bring expertise concerning students of different ages, ethnic backgrounds, geographic regions, learning abilities, and socioeconomic levels; the specific subject areas being assessed; the analysis methodologies employed; and large-scale assessment design and practices. Contractor staff and consultants have reviewed all items for bias and sensitivity issues, grade appropriateness, and appropriateness of content across states.


In particular, subject-area standing committees play a central role in the development of NAEP assessment instruments and have been essential in creating assessment content that is appropriate for the targeted populations, and that meets the expectations outlined in the Governing Board frameworks. One of the most important functions of the committees is to contribute to the validation of the assessments. Through detailed reviews of items, scoring guides, tasks, constructed-response item training sets for scorers, and other materials, the committees help establish that the assessments are accurate, accessible, fair, relevant, and grade-level appropriate, and that each item measures the knowledge and skills it was designed to measure. When appropriate, members of subject-area standing committees will also review the questionnaires with regards to appropriateness with existing curricular and instructional practices.


Appendix A lists the current members of the following NAEP advisory committees:


  • NAEP Design and Analysis Committee

  • NAEP Validity Studies Panel

  • NAEP National Indian Education Study Technical Review Panel

  • NAEP Mathematics Standing Committee

  • NAEP Reading Standing Committee

  • NAEP Survey Questionnaires Standing Committee

  • NAEP Mathematics Translation Review Committee

  • NAEP Grade 4 and 8 Survey Questionnaire and eNAEP DBA System Translation Review Committee

  • NAEP Principals’ Panel Standing Committee


It is the practice for OMB representatives to be invited to attend the technical review panel meetings that are most informative for OMB purposes.


In addition to the contractors and the external committees, NCES works with the NAEP State Coordinators, who serve as the liaisons between each state education agency and NAEP, coordinating NAEP activities in their state. NAEP State Coordinators work directly with the NAEP-sampled schools.


A.9. Payments or Gifts to Respondents

In general, there will be no gifts or payments to respondents, although students do get to keep the NAEP‑provided earbuds used for the DBA. On occasion, NAEP will leave educational materials at schools for their use (e.g., science kits from the science hands-on assessments). Some schools also offer recognition parties with pizza or other perks for students who participate; however, these are not reimbursed by NCES or the NAEP contractors. If any incentives are proposed as part of a future special study, they would be justified as part of that future clearance package. As appropriate, the amounts would be consistent with amounts approved in other studies with similar conditions.


A.10. Assurance of Confidentiality

Data security and confidentiality protection procedures have been put in place for NAEP to ensure that all NAEP contractors and agents (see section A.8 in this document) comply with all privacy requirements, including:


  1. The Statements of Work of NAEP contracts;

  2. National Assessment of Educational Progress Authorization Act (20 U.S.C. §9622);

  3. Family Educational Rights and Privacy Act (FERPA) of 1974 (20 U.S.C. §1232(g));

  4. Privacy Act of 1974 (5 U.S.C. §552a);

  5. Privacy Act Regulations (34 CFR Part 5b);

  6. Computer Security Act of 1987;

  7. U.S.A. Patriot Act of 2001 (P.L. 107-56);

  8. Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9573);

  9. Cybersecurity Enhancement Act of 2015 (6 U.S.C. §151);

  10. The U.S. Department of Education General Handbook for Information Technology Security General Support Systems and Major Applications Inventory Procedures (March 2005);

  11. The U.S. Department of Education Incident Handling Procedures (February 2009);

  12. The U.S. Department of Education, ACS Directive OM: 5-101, Contractor Employee Personnel Security Screenings;

  13. NCES Statistical Standards;

  14. The Children’s Online Privacy Protection Act (COPPA; 15 U.S.C. §§ 6501–6506); and

  15. All new legislation that impacts the data collected through the contract for this study.


As of May 2025, NCES’s assurance of confidentiality protections for NAEP have changed due to recent staffing changes at the Department of Education. NCES has removed the Foundations of Evidence-Based Policymaking Act of 2018, Title III, Part B, Confidential Information Protection (“CIPSEA”) as a confidentiality assurance. However, confidentiality assurances under the Education Sciences Reform Act of 2002 (ESRA) remain in effect.


All NAEP contractors and agents will comply with the Department’s IT security policy requirements as set forth in the Handbook for Information Assurance Security Policy and related procedures and guidance, as well as IT security requirements in the Federal Information Security Management Act (FISMA), Federal Information Processing Standards (FIPS) publications, Office of Management and Budget (OMB) Circulars, and the National Institute of Standards and Technology (NIST) standards and guidance. All data products and publications will also adhere to the revised NCES Statistical Standards, as described at the website: http://nces.ed.gov/statprog/2012/. Security controls include secure data processing centers and sites; properly vetted and cleared staff; and data sharing agreements.


An important privacy and confidentiality issue is the protection of the identity of assessed students, their teachers, and their schools. To assure this protection, NAEP has established security procedures, described below, that closely control access to potentially identifying information.


All assessment and questionnaire data are protected. This means that NAEP applications that handle assessment and questionnaire data

  • enforce effective authentication password management policies;

  • limit authorization to individuals who need access to the data, granting only the minimum necessary access (i.e., least privilege user access);

  • keep data encrypted, both in storage and in transport, utilizing volume encryption and transport layer security protocols;

  • utilize SSL certificates and HTTPS protocols for web-based applications;

  • limit access to data via software and firewall configurations, as well as by not using well-known ports for data connections; and

  • restrict access to the portable networks utilized to administer an assessment to only assessment devices.
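As a generic illustration of the in-transit protections listed above (TLS with certificate verification), the following sketch shows an HTTPS upload in Python. The URL, payload, and function name are placeholders; this is not the NAEP contractors’ actual transfer code.

```python
# Generic illustration of encrypted data in transit: an HTTPS POST with
# default TLS certificate and hostname verification. Placeholder URL and
# payload only; not the NAEP contractors' actual implementation.
import json
import ssl
import urllib.request

def upload_over_tls(payload: dict, url: str = "https://example.invalid/upload") -> int:
    """POST a JSON payload over TLS and return the HTTP status code."""
    context = ssl.create_default_context()  # enforces certificate/hostname checks
    request = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request, context=context) as response:
        return response.status

# Example (placeholder endpoint, so the call is left commented out):
# upload_over_tls({"record": "no PII in this example"})
```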


The data collection process described below is based on the handoff procedures used by the current contractors. This process may be updated with the 2024-2029 contracts. Any such changes will be reflected in Amendments #1 and #2.


Students’ names are submitted to the Sampling and Weighting (SW) contractor for selecting the student sample. This list also includes the month/year of birth, race/ethnicity, sex, and status codes for students with disabilities, English learners, and economic disadvantage. This data request for NAEP fully conforms to the requirements of the Family Educational Rights and Privacy Act of 1974 (FERPA) [20 U.S.C. 1232g; 34 CFR Part 99]. FERPA is designed to protect the privacy rights of students and their families, by providing consistent standards for the release of personally identifiable student and family information. NCES and its agents are explicitly authorized under an exception to FERPA’s general consent rule to obtain student level data from institutions. For the purposes of this collection of data, FERPA permits educational agencies and institutions to disclose personally identifiable information (PII) from students’ education records, without consent, to authorized representatives of the Secretary of Education in connection with an evaluation of federally supported education programs (34 CFR §§ 99.31(a)(3)(iii) and 99.35).


After the student sample is selected, the data for selected students are submitted to the Data Collection (DC) contractor, who includes the data in the packaging and distribution system for the production of student-specific materials (such as labels to attach to the student forms or log-in ID cards), which are then forwarded to field staff and used to manage and facilitate the assessment. These data are also uploaded to the AMS online system for review by schools and used by field staff to print materials used by the schools. Student information is deleted from the packaging and distribution system before the assessment begins. Student information is securely deleted from the AMS typically two weeks after all quality control activities for the assessment are complete.


All paper-based student-specific materials linking PII to assessment materials are destroyed at the schools upon completion of the assessment. Field staff remove names from forms and place them in the school’s NAEP-provided storage bag, which holds all of the forms and materials with student names; the bag is kept at the school until the end of the school year and then destroyed by school personnel.8


In addition to student information, teacher and principal names are collected and recorded in the AMS online system, which is used to keep track of the distribution and collection of NAEP teacher and school questionnaires. A paper copy of the questionnaire report is printed for use during the assessment, and this paper copy is left in the school’s NAEP-provided storage bag, which is destroyed at the end of the school year. The teacher and principal names are deleted from the AMS at the same time the student information is deleted.


For DBA, NAEP data are stored on systems in a locked-down environment at a secure hosting facility with strict measures in place to prevent unauthorized online access. The student names are not included on the assessment devices or stored by the same contractor or on the same database as the student responses. Shortly before, during, and after assessments, assessment data are transmitted through secure, encrypted channels (SSL, SSH) between NAEP systems, the NAEP assessment servers, and the assessment administration devices. Data on those devices are also encrypted—these data can be read only by the assessment software—and the devices are secured against unauthorized use.


Furthermore, to protect collected data, NAEP staff will use the following precautions:

  • Assessment and questionnaire data files will not identify individual respondents.

  • No PII, whether about schools or respondents, will be gathered or released by third parties. No permanent files of names or other direct identifiers of respondents will be maintained.

  • Student participation is voluntary.

  • NAEP data are perturbed. Data perturbation is a statistical data editing technique implemented to ensure privacy for student and school respondents to NAEP’s assessment questionnaires for assessments in which data are reported or attainable via restricted-use licensing arrangements with NCES. The process is coordinated in strict confidence with the IES Disclosure Review Board (DRB), with details of the process shared only with the DRB and a minimal number of contractor staff.
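Because the actual perturbation procedure is confidential to the DRB, the sketch below illustrates only the general idea of perturbation, shown here as random swapping of a small fraction of responses between records. The swap rate and function names are arbitrary assumptions for illustration.

```python
# Generic illustration of data perturbation: randomly swap a small share of
# questionnaire responses between records before release. This is NOT the
# confidential procedure NAEP coordinates with the IES Disclosure Review Board.
import random

def perturb(responses, swap_rate=0.02, seed=12345):
    """Return a copy of `responses` with roughly swap_rate of values swapped
    between randomly chosen records (marginal distributions are preserved)."""
    rng = random.Random(seed)
    perturbed = list(responses)
    n_swaps = max(1, int(len(perturbed) * swap_rate / 2))
    for _ in range(n_swaps):
        i = rng.randrange(len(perturbed))
        j = rng.randrange(len(perturbed))
        perturbed[i], perturbed[j] = perturbed[j], perturbed[i]
    return perturbed

# Example: perturbing answers to a single survey item across ten students.
print(perturb(["A", "B", "B", "C", "A", "D", "B", "A", "C", "C"]))
```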


The following text appears on all student assessments, the AMS, and teacher and school questionnaires:


Paperwork Reduction Act (PRA) Statement

The National Center for Education Statistics (NCES) conducts the National Assessment of Educational Progress to evaluate federally supported education programs. All of the information you provide may only be used for the purposes of research, statistics, and evaluation under the Education Sciences Reform Act of 2002 (ESRA; 20 U.S.C. § 9543) and may not be disclosed, or used, in identifiable form for any other purpose except as required by law. Every NCES employee as well as every NCES agent, such as contractors and NAEP coordinators, has taken an oath and is subject to a jail term of up to 5 years, a fine of $250,000, or both if he or she willfully discloses ANY identifiable information about you. Electronic submission of your information will be monitored for viruses, malware, and other threats by Federal employees and contractors in accordance with the Cybersecurity Enhancement Act of 2015.


NCES estimates the time required to complete this information collection to average [XX] minutes, including the time to review instructions and complete and review the information collection. This voluntary information collection was reviewed and approved by OMB (Control No. 1850-0928). If you have any comments concerning the accuracy of the time estimate, suggestions for improving this collection, or any comments or concerns regarding the status of your individual submission, please write to: National Assessment of Educational Progress (NAEP), National Center for Education Statistics (NCES), Potomac Center Plaza, 550 12th St., SW, 4th floor, Washington, DC 20202, or send an email to: nces.information.collections@ed.gov.

OMB No. 1850-0928 APPROVAL EXPIRES 02/28/2027

In addition, the following text appears on the log-in screen for the AMS system and NAEPq, the online system used for teacher and school administrator questionnaires.

AMS

When you have finished your work or need to stop and return later to finish, please LOG OUT of the system to preserve the security of the information contained within the Assessment Management System.

NAEPq

When you have finished or if you need to stop before finishing, please LOG OUT of the survey system by clicking “Exit.”



More specific information about how NAEP handles PII is provided in the table below:

Table: NAEP PII Process

PII is created in the following ways:

  1. Public and non-public school samples are released to NAEP State Coordinators (public schools only), NAEP TUDA Coordinators (public schools only), and DC Gaining Cooperation Field Staff (non-public schools only) using the secure Assessment Management System (AMS).
  2. State and TUDA coordinators recruit public schools, and Gaining Cooperation Field Staff recruit private schools.
  3. Participating schools submit a current roster of students for the sampled grade for student sampling.
  4. Rosters of students can be created by NAEP State Coordinators, NAEP TUDA Coordinators, or NAEP School Coordinators.
  5. Rosters are submitted through the secure AMS website.
  6. Rosters must be in Excel.
  7. PII is contained in the roster files: state unique identifiers (optional), student names, month/year of birth, race/ethnicity, sex, and status codes for students with disabilities, English learners, and economic disadvantage.
  8. PII is stored in the contractor’s secure data environments.

PII is moved in the following ways:

  1. Student names (PII) are moved to the DC contractor via a secure FTP site. These names are used to print Student Login Cards.
  2. Student Login Cards are only created for students taking DBA.
  3. Student PII data are available to the NAEP School Coordinators and to the NSSC, DC, and Sampling contractors’ Field Staff through the secure NAEP Platform Development (NPD) contractor’s AMS.
  4. NAEP School Coordinators can view and update PII for their own schools.
  5. NAEP School Coordinators can print materials containing PII for their own schools.
  6. NAEP School Coordinators are instructed to destroy all materials containing PII at the end of the assessment cycle.
  7. DC contractor Field Staff can update PII for schools within their assignment.
  8. DC contractor Field Staff can print materials containing PII for schools within their assignment.
  9. DC contractor Field Staff store materials containing PII for schools within their assignment in their NAEP-provided storage bag.
  10. At no point in time does any individual system have access to both the student’s name and the student’s assessment and questionnaire responses.

PII is destroyed in the following ways:

  1. Contractors destroy PII after the data collection has concluded, effectively 10 months after the weights are delivered in June; therefore, PII is destroyed by April of the calendar year following the assessment.
  2. School Coordinators destroy the materials containing PII on or before the end of the school year.
  3. DC contractor Field Staff leave materials containing PII at the school after the assessment has been completed. For paper-pencil assessments, DC contractor Field Staff return their NAEP School Folders (without PII) to the Westat Home Office for secure storage and eventual secure destruction, as applicable.
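Since rosters must be submitted in Excel with the fields listed above, a simple pre-submission check such as the sketch below could flag missing columns before upload. The column names and the use of pandas are assumptions for illustration; the AMS applies its own validation rules.

```python
# Illustrative pre-submission check for a student roster workbook.
# Column names are assumptions based on the fields listed in the PII table;
# the AMS performs its own validation and may use different labels.
import pandas as pd

REQUIRED_COLUMNS = [
    "Student Name", "Month/Year of Birth", "Race/Ethnicity", "Sex",
    "SD Status", "EL Status", "Economically Disadvantaged",
]
OPTIONAL_COLUMNS = ["State Unique Identifier"]

def missing_roster_columns(path):
    """Return the required columns that are absent from the roster workbook."""
    roster = pd.read_excel(path)  # rosters must be submitted in Excel
    return [column for column in REQUIRED_COLUMNS if column not in roster.columns]

# Example usage (hypothetical file name):
# missing = missing_roster_columns("grade4_roster.xlsx")
# if missing:
#     print("Add these columns before submitting:", missing)
```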


In addition, parents are notified of the assessment. See appendices D-5, D-6, D-38, D-47, D-52, D-61 through D-65, and D-112, which provide samples of the parental notification letters used in 2024 and 2025. The letters are adapted for each grade or age/subject combination, and the school principal or school coordinator can download them; however, the information regarding confidentiality and the appropriate law reference will remain unchanged. Please note that parents/guardians are required to receive notification of student participation, but NAEP does not require explicit parental consent (by law, parents/guardians of students selected to participate in NAEP must be notified in writing of their child’s selection prior to the administration of the assessment).

A.11. Sensitive Questions

NAEP emphasizes voluntary respondent participation. Insensitive or offensive items are prohibited by the National Assessment of Educational Progress Authorization Act, and the Governing Board reviews all items for bias and sensitivity. The nature of the questions is guided by the reporting requirements in the legislation, the Governing Board’s Policy on the Collection and Reporting of Background Data, and the expertise and guidance of the NAEP Survey Questionnaire Standing Committee (see appendix A-6). Throughout the item development process, NCES staff works with consultants, contractors, and internal reviewers to identify and eliminate potential bias in the items.

The NAEP student questionnaires include items that require students to provide responses on factual questions about their family’s socioeconomic background, self-reported behaviors, and learning contexts, both in the school setting as well as more generally. In compliance with legislation, student questionnaires do not include items about family or personal beliefs (e.g., religious or political beliefs). The student questionnaires focus only on contextual factors that clearly relate to academic achievement.

Educators, psychologists, economists, and others have called for the collection of non-cognitive student information that can explain why some students do better in school than others. Similar questions have been included in other NCES-administered assessments such as the Trends in International Mathematics and Science Study (TIMSS), the Program for International Student Assessment (PISA), the National School Climate Survey, and other federal questionnaires, including the U.S. Census. The insights achieved by the use of these well-established survey questions will help educators, policymakers, and other stakeholders make better informed decisions about how best to help students develop the knowledge and skills they need to succeed.

All questions proposed for piloting have gone through multiple rounds of reviews, including but not limited to reviews by NAEP subject-matter expert groups, organizations’ Institutional Review Boards (IRBs), and the Governing Board, and have successfully passed extensive pre-testing via cognitive interviews with all respondent groups. Furthermore, NAEP does not report student responses at the individual or school level, but strictly in aggregate forms. To reduce the impact of any individual question on NAEP reporting, the program has shifted to a balanced reporting approach that includes multi-item indices, where possible, to maximize robustness and validity. In compliance with legislation and established practices through previous NAEP administrations, students may skip any question.

A.12. Estimation of Respondent Reporting Burden (2026)

The burden numbers for NAEP data collections fluctuate considerably, with the number of students sampled every other year being much larger than in the years in between.


Exhibit 1 provides the burden information per respondent group, by grade, for the 2026 data collections. At the time of this clearance package, it is assumed that 60 percent of the sample will be in the School Device Model, and 40 percent of the sample will be in the NAEP Device Model. Exhibit 2 summarizes the burden by respondent group.


A description of the respondents or study is provided below, as supporting information for Exhibit 1:

All districts will ensure that the School Technology Survey (STS) is completed on behalf of their schools. To support the bridge study, the sample will include two random groups: one group assigned a priori to the NAEP Device Model, and one group placed in either the School Device Model or the NAEP Device Model depending on whether, based on their STS responses, the schools meet the minimum specifications to participate in NAEP on school devices. Additionally, in order to evaluate the impact of the transition to school devices, a sample of schools will be assigned to the NAEP Device Model by default, regardless of their ability to meet the eligibility requirements for the School Device Model.

School Technology Survey Respondent—District Superintendents, District Assessment Coordinators, and School Administrators receive an initial communication informing them that their schools have been sampled for NAEP; they must then ensure that the School Technology Survey is completed, that registration for the Assessment Management System is completed, and that staff roles for the assessment are assigned. For 2026, the School Technology Survey will determine which administration model schools will be assigned for the assessment. Estimated burden for reading the initial communication and completing the School Technology Survey is 60 minutes (Appendix I).



School Device Model

  • Students—Students in fourth, eighth, and twelfth grades will be assessed using 60 minutes of cognitive blocks in one subject followed by a non-cognitive block which requires up to a total of 15 minutes to complete. The core non-cognitive items are answered by students across subject areas and are related to demographic information. In addition, students answer subject-specific non-cognitive items. Based on timing data collected from cognitive interviews and previous assessments, fourth-grade students can respond to approximately four non-cognitive items per minute, while eighth- and twelfth-grade students can respond to approximately six non-cognitive items per minute. Using this information, the non-cognitive blocks are assembled so that most students can complete all items in the allocated amount of time. Each cognitive and non-cognitive block is timed so that the burden listed in exhibit 1 is the maximum burden time for each student. The administrators and/or test delivery system will move students to the next section once the maximum amount of time is reached. Additional student burden accounts for time to read directions, log on to the digital device, and view a tutorial. This additional burden is estimated at 15 minutes. The cognitive or assessment items are not included in the burden estimate because they are not subject to the Paperwork Reduction Act. Therefore, the total burden for students is 30 minutes. The assessments given in Puerto Rico are translated into Spanish. To account for the language complexities, additional time is provided for the cognitive blocks (for a total of 80 minutes). The burden for students in Puerto Rico is up to 15 minutes for the non-cognitive block, and an additional 15 minutes for directions, logging into the digital device, and the tutorial, for a total of 30 minutes.

  • Teachers—The teachers of fourth- and eighth-grade students participating in main NAEP are asked to complete questionnaires about their teaching background, education, training, and classroom organization. Average fourth-grade teacher burden is estimated to be 30 minutes because fourth-grade teachers often have multiple subject-specific sections to complete. Average eighth-grade teacher burden is 20 minutes if only one subject is taught and an additional 10 minutes for each additional subject taught. Based on timing data collected from cognitive interviews, adults can respond to approximately six non-cognitive items per minute. Using this information, the teacher questionnaires are assembled so that most teachers can complete the questionnaire in the estimated amount of time.

  • Principals/Administrators—The school administrators in the sampled schools are asked to complete a questionnaire. The core items are designed to measure school characteristics and policies that research has shown are highly correlated with student achievement. Subject-specific items concentrate on curriculum and instructional services issues. The burden for school administrators is determined in the same manner as burden for teachers (see above) and is estimated to average 30 minutes per principal/administrator, although burden may vary depending on the number of subject-specific sections included. The 30-minute burden estimate includes a supplemental charter school questionnaire designed to collect information on charter school policies and characteristics and is provided to administrators of charter schools who are sampled to participate in NAEP. The supplement covers organization and school governance, parental involvement, and curriculum and offerings.

  • School Staff Preassessment Activities—Each school participating in main NAEP has designated staff members to support the NAEP assessment: a School Coordinator and a Technology Coordinator. Preassessment and assessment activities include functions such as finalizing student samples, verifying student demographics, reviewing accommodations, and planning logistics for the assessment. The AMS system is used for school coordinators to provide requested administration information online, including logistical information, updates of student and teacher information, and school logistics. Additionally, these individuals must find and prepare devices for the assessment, install the NAEP Application onto select devices, and attend an Assessment Planning Meeting (APM). More information about the Technology Coordinators’ and School Coordinators’ responsibilities is included in section B.2. Based on information collected from previous years’ use of the preassessment activities, it is estimated that it will take a total of 6 hours, on average, for school personnel to complete these activities. The AMS system data will be used to inform response patterns to make further refinements to the system and to minimize burden.

  • Submission of Samples—Survey sample information is collected from schools in the form of lists of potential students who may participate in NAEP. This sample information can be gathered manually or electronically at the school, district, or state level. If done at the state level, some states require a data security agreement, which is customized based on the specific requests of the state and provides verbatim security and confidentiality information from section A.10. If done at the school or district level, some burden will be incurred by school personnel. It is estimated that it will take 2 hours, on average, for school personnel to complete the submission process. Based on data from the 2022 administration, it is estimated that approximately 26 percent of the schools will complete the submission process.

  • School Staff Day of Assessment Activities—The Technology Coordinator will provide support on the day of the assessment to troubleshoot any issues with devices, provide a swap out device if necessary, and work through any network issues. The School Coordinator will review any updates that have occurred since the Assessment Planning Meeting. These could be updates to student information, such as accommodations or new refusals, or dismissal protocols. Both the Technology and School Coordinators will assist with classroom management. Estimated burden for the assumed two individuals is up to 120 minutes per individual, up to 240 minutes total. 

  • Additional School Staff—Schools that choose to assess all 50 sampled students at once in one or two locations are asked to provide an additional staff member to ensure that the assessment runs smoothly. This individual may provide additional technology or classroom management support. Based on the 2024 School Based Equipment Proof of Concept Study, it is estimated that 45 percent of the schools will choose to assess all 50 students at one time. The burden for this individual would be 120 minutes, and this is captured in Exhibit 1.

  • Post-Assessment Activities—After the administration, the Technology Coordinator will ensure the removal of the eNAEP Application from student devices and provide feedback about the assessment process, for a total estimated burden of 40 minutes. After the administration, the School Coordinator will provide feedback about the assessment process, and destroy any documents with student identifying information, for a total estimated burden of 20 minutes.

  • SD and EL—SD and EL information is provided by school personnel concerning students identified as SD or EL. This information will be used by those personnel to determine the appropriate accommodations for students. The burden for school administrators is estimated at 15 minutes, on average, for each student identified as SD and/or EL. The estimated percentages of SD/EL students are 28 percent, 23 percent, and 18 percent at grades 4, 8, and 12, respectively (based on the NAEP 2024 sample).


NAEP Device Model

  • Students—Students in fourth, eighth, and twelfth grades will be assessed using 60 minutes of cognitive blocks in one subject followed by a non-cognitive block which requires up to a total of 15 minutes to complete. The core non-cognitive items are answered by students across subject areas and are related to demographic information. In addition, students answer subject-specific non-cognitive items. Based on timing data collected from cognitive interviews and previous assessments, fourth-grade students can respond to approximately four non-cognitive items per minute, while eighth- and twelfth-grade students can respond to approximately six non-cognitive items per minute. Using this information, the non-cognitive blocks are assembled so that most students can complete all items in the allocated amount of time. Each cognitive and non-cognitive block is timed so that the burden listed above is the maximum burden time for each student. The administrators and/or test delivery system will move students to the next section once the maximum amount of time is reached. Additional student burden accounts for time to read directions, log on to the digital device, and view a tutorial. This additional burden is estimated at 15 minutes. The cognitive or assessment items are not included in the burden estimate because they are not subject to the Paperwork Reduction Act. Therefore, the total burden for students is 30 minutes. The assessments given in Puerto Rico are translated into Spanish. To account for the language complexities, additional time is provided for the cognitive blocks (for a total of 80 minutes). The burden for students in Puerto Rico is up to 15 minutes for the non-cognitive block, and an additional 15 minutes for directions, logging into the digital device, and the tutorial, for a total of 30 minutes.

  • Teachers—The teachers of fourth- and eighth-grade students participating in main NAEP are asked to complete questionnaires about their teaching background, education, training, and classroom organization. Average fourth-grade teacher burden is estimated to be 30 minutes because fourth-grade teachers often have multiple subject-specific sections to complete. Average eighth-grade teacher burden is 20 minutes if only one subject is taught and an additional 10 minutes for each additional subject taught. Based on timing data collected from cognitive interviews, adults can respond to approximately six non-cognitive items per minute. Using this information, the teacher questionnaires are assembled so that most teachers can complete the questionnaire in the estimated amount of time.

  • Principals/Administrators—The school administrators in the sampled schools are asked to complete a questionnaire. The core items are designed to measure school characteristics and policies that research has shown are highly correlated with student achievement. Subject-specific items concentrate on curriculum and instructional services issues. The burden for school administrators is determined in the same manner as burden for teachers (see above) and is estimated to average 30 minutes per principal/administrator, although burden may vary depending on the number of subject-specific sections included. The 30-minute burden estimate includes a supplemental charter school questionnaire designed to collect information on charter school policies and characteristics and is provided to administrators of charter schools who are sampled to participate in NAEP. The supplement covers organization and school governance, parental involvement, and curriculum and offerings.

  • School Staff Preassessment Activities—Each school participating in main NAEP has a designated staff member to serve as its NAEP school coordinator. Preassessment and assessment activities include functions such as finalizing student samples, verifying student demographics, reviewing accommodations, and planning logistics for the assessment. The AMS system was developed so that school coordinators would provide requested administration information online, including logistical information, updates of student and teacher information, and school logistics. This individual will also attend an Assessment Planning Meeting (APM). More information about the school coordinator’s responsibilities is included in section B.2. Based on information collected from previous years’ use of the preassessment system, it is estimated that it will take 2 hours, on average, for school personnel to complete these activities. The AMS system data will be used to inform response patterns to make further refinements to the system and to minimize burden.

  • Submission of Samples—Survey sample information is collected from schools in the form of lists of potential students who may participate in NAEP. This sample information can be gathered manually or electronically at the school, district, or state level. If done at the state level, some states require a data security agreement, which is customized based on the specific requests of the state and provides verbatim security and confidentiality information from section A.10. If done at the school or district level, some burden will be incurred by school personnel. It is estimated that it will take 2 hours, on average, for school personnel to complete the submission process. Based on data from the 2022 administration, it is estimated that approximately 26 percent of the schools will complete the submission process.

  • School Staff Day of Assessment Activities—The School Coordinator will review any updates that have occurred since the Assessment Planning Meeting. These could be updates to student information, such as accommodations or new refusals, or dismissal protocols. The School Coordinator will also assist in classroom management. Estimated burden is 120 minutes total. 

  • Post-Assessment Activities— After the administration, the School Coordinator will provide feedback about the assessment process, and destroy any documents with student identifying information, for a total estimated burden of 20 minutes.

  • SD and EL—SD and EL information is provided by school personnel concerning students identified as SD or EL. This information will be used by those personnel to determine the appropriate accommodations for students. The burden for school administrators is estimated at 15 minutes, on average, for each student identified as SD and/or EL. The estimated percentages of SD/EL students are 28 percent, 23 percent, and 18 percent at grades 4, 8, and 12, respectively (based on the NAEP 2024 sample).


EXHIBIT 1

Estimated Burden for NAEP 2026 Assessments (School Device Model)

(Note: all explanatory notes and footnotes are displayed following the table)







| Respondent group and measure | 4th Grade: OP (R/M) and Pilot | 4th Grade: Puerto Rico OP and Pilot (M) | 4th Grade: Field Trial | 8th Grade: OP (M/R/US History & Civics) and Pilot | 8th Grade: Puerto Rico OP and Pilot (M) | 8th Grade: Field Trial | 12th Grade: Pilot | 12th Grade: Field Trial | Total |
|---|---|---|---|---|---|---|---|---|---|
| School Technology Survey: # of Staff | 3,728 | 125 | 5 | 3,592 | 117 | 5 | 262 | 2 | 7,837 |
| School Technology Survey: Avg. minutes per response | 60 | 60 | 60 | 60 | 60 | 60 | 60 | 60 | N/A |
| School Technology Survey: Burden (in hours) | 3,728 | 125 | 5 | 3,592 | 117 | 5 | 262 | 2 | 7,837 |
| Students: # of Students | 141,060 | 2,700 | 250 | 137,997 | 2,565 | 300 | 8,400 | 100 | 293,372 |
| Students: Avg. minutes per response | 30 | 30 | 30 | 30 | 30 | 30 | 30 | 30 | N/A |
| Students: Burden (in hours) | 70,530 | 1,350 | 125 | 68,999 | 1,283 | 150 | 4,200 | 50 | 146,687 |
| Teachers: # of Teachers | 11,185 | 376 | 15 | 14,366 | 467 | 20 | 0 | 0 | 26,429 |
| Teachers: Avg. minutes per response (note 1) | 30 | 30 | 30 | 20, plus 10 per additional subject | 20 | 20, plus 10 per additional subject | 0 | 0 | N/A |
| Teachers: Burden (in hours) | 5,593 | 188 | 8 | 5,842 | 156 | 8 | 0 | 0 | 11,795 |
| School Questionnaire (school principal): # of Schools | 3,728 | 125 | 5 | 3,592 | 117 | 5 | 262 | 2 | 7,837 |
| School Questionnaire: Avg. minutes per response | 30 | 30 | 30 | 30 | 30 | 30 | 30 | 30 | N/A |
| School Questionnaire: Burden (in hours) | 1,864 | 63 | 3 | 1,796 | 58 | 3 | 131 | 1 | 3,919 |
| Preassessment Activities: # of Staff | 7,457 | 251 | 10 | 7,183 | 234 | 10 | 525 | 4 | 15,673 |
| Preassessment Activities: Avg. minutes per response | 180 | 180 | 180 | 180 | 180 | 180 | 180 | 180 | N/A |
| Preassessment Activities: Burden (in hours) | 22,370 | 752 | 30 | 21,549 | 701 | 30 | 1,574 | 12 | 47,020 |
| Sample submission (school coordinator): # of School Coord. | 3,728 | 125 | 5 | 3,592 | 117 | 5 | 262 | 2 | 7,837 |
| Sample submission: Burden (in hours) | 1,939 | 65 | 3 | 1,868 | 61 | 3 | 136 | 1 | 4,076 |
| Day of assessment activities: # of Staff | 9,135 | 307 | 12 | 8,799 | 286 | 12 | 643 | 5 | 19,200 |
| Day of assessment activities: Avg. minutes per response | 120 | 120 | 120 | 120 | 120 | 120 | 120 | 120 | N/A |
| Day of assessment activities: Burden (in hours) | 18,269 | 614 | 25 | 17,599 | 573 | 25 | 1,286 | 10 | 38,399 |
| Post-assessment activities: # of Staff | 7,457 | 251 | 10 | 7,183 | 234 | 10 | 525 | 4 | 15,673 |
| Post-assessment activities: Avg. minutes per response (Tech Coord. / School Coord.) | 40 / 20 | 40 / 20 | 40 / 20 | 40 / 20 | 40 / 20 | 40 / 20 | 40 / 20 | 40 / 20 | N/A |
| Post-assessment activities: Burden (in hours) | 3,728 | 125 | 5 | 3,592 | 117 | 5 | 262 | 2 | 7,837 |
| SD/EL (school personnel): # of Schools | 3,728 | 125 | N/A | 3,592 | 117 | N/A | 262 | 2 | 7,827 |
| SD/EL: # of SD/EL Students | 39,497 | 756 | 70 | 31,739 | 590 | 69 | 1,512 | 18 | 74,251 |
| SD/EL: Avg. minutes per response | 15 | 15 | 15 | 15 | 15 | 15 | 15 | 15 | N/A |
| SD/EL: Burden (in hours) | 9,874 | 189 | 18 | 7,935 | 148 | 17 | 378 | 5 | 18,564 |
| Total Burden (in hours) | 137,896 | 3,473 | 222 | 132,771 | 3,213 | 246 | 8,230 | 83 | 286,133 |

Total number of respondents: 362,501


Total number of responses: 468,108







EXHIBIT 1

Estimated Burden for NAEP 2026 Assessments (NAEP Device Model)

(Note: all explanatory notes and footnotes are displayed following the table)









| Respondent group and measure | 4th Grade: OP (R/M) and Pilot | 4th Grade: Puerto Rico OP and Pilot (M) | 4th Grade: Field Trial | 8th Grade: OP (M/R/US History & Civics) and Pilot | 8th Grade: Puerto Rico OP and Pilot (M) | 8th Grade: Field Trial | 12th Grade: Pilot | 12th Grade: Field Trial | Total |
|---|---|---|---|---|---|---|---|---|---|
| School Technology Survey: # of Staff | 2,486 | 84 | 2 | 2,899 | 88 | 5 | 66 | 1 | 5,630 |
| School Technology Survey: Avg. minutes per response | 60 | 60 | 60 | 60 | 60 | 60 | 60 | 60 | N/A |
| School Technology Survey: Burden (in hours) | 2,486 | 84 | 2 | 2,899 | 88 | 5 | 66 | 1 | 5,630 |
| Students: # of Students | 94,040 | 1,800 | 100 | 110,983 | 1,935 | 250 | 2,100 | 50 | 211,258 |
| Students: Avg. minutes per response | 30 | 30 | 30 | 30 | 30 | 30 | 30 | 30 | N/A |
| Students: Burden (in hours) | 47,020 | 900 | 50 | 55,492 | 968 | 125 | 1,050 | 25 | 105,630 |
| Teachers: # of Teachers | 7,457 | 251 | 6 | 11,598 | 353 | 20 | 0 | 0 | 19,685 |
| Teachers: Avg. minutes per response (note 1) | 30 | 30 | 30 | 20, plus 10 per additional subject | 20 | 20, plus 10 per additional subject | 0 | 0 | N/A |
| Teachers: Burden (in hours) | 3,729 | 126 | 3 | 4,717 | 118 | 8 | 0 | 0 | 8,701 |
| School Questionnaire (school principal): # of Schools | 2,486 | 84 | 2 | 2,899 | 88 | 5 | 66 | 1 | 5,630 |
| School Questionnaire: Avg. minutes per response | 30 | 30 | 30 | 30 | 30 | 30 | 30 | 30 | N/A |
| School Questionnaire: Burden (in hours) | 1,243 | 42 | 1 | 1,450 | 44 | 3 | 33 | 1 | 2,817 |
| Preassessment Activities: # of Staff | 2,486 | 84 | 2 | 2,899 | 88 | 5 | 66 | 1 | 5,630 |
| Preassessment Activities: Avg. minutes per response | 120 | 120 | 120 | 120 | 120 | 120 | 120 | 120 | N/A |
| Preassessment Activities: Burden (in hours) | 4,971 | 167 | 4 | 5,799 | 176 | 10 | 131 | 2 | 11,261 |
| Sample submission (school coordinator): # of School Coord. | 2,486 | 84 | 2 | 2,899 | 88 | 5 | 66 | 1 | 5,630 |
| Sample submission: Burden (in hours) | 1,293 | 43 | 1 | 1,508 | 46 | 3 | 34 | 1 | 2,929 |
| Day of assessment activities: # of Staff | 2,486 | 84 | 2 | 2,899 | 88 | 5 | 66 | 1 | 5,630 |
| Day of assessment activities: Avg. minutes per response | 120 | 120 | 120 | 120 | 120 | 120 | 120 | 120 | N/A |
| Day of assessment activities: Burden (in hours) | 4,971 | 167 | 4 | 5,799 | 176 | 10 | 131 | 2 | 11,261 |
| Post-assessment activities: # of Staff | 2,486 | 84 | 2 | 2,899 | 88 | 5 | 66 | 1 | 5,630 |
| Post-assessment activities: Avg. minutes per response | 20 | 20 | 20 | 20 | 20 | 20 | 20 | 20 | N/A |
| Post-assessment activities: Burden (in hours) | 829 | 28 | 1 | 966 | 29 | 2 | 22 | 0 | 1,877 |
| SD/EL (school personnel): # of Schools | 2,486 | 84 | N/A | 2,899 | 88 | N/A | 66 | 1 | 5,623 |
| SD/EL: # of SD/EL Students | 26,331 | 504 | 28 | 25,526 | 445 | 58 | 378 | 9 | 53,279 |
| SD/EL: Avg. minutes per response | 15 | 15 | 15 | 15 | 15 | 15 | 15 | 15 | N/A |
| SD/EL: Burden (in hours) | 6,583 | 126 | 7 | 6,382 | 111 | 15 | 95 | 2 | 13,321 |
| Total Burden (in hours) | 73,125 | 1,683 | 73 | 85,013 | 1,757 | 181 | 1,562 | 34 | 163,427 |

Total number of respondents: 253,458


Total number of responses: 318,005






Notes for 2026 table in Exhibit 1

  1. Grade 8 teachers who teach one subject have an estimated burden of 20 minutes, with an additional 10 minutes for each additional subject. It is estimated that 50 percent of teachers teach one subject and 50 percent teach two subjects. There is only one teacher questionnaire for U.S. history and civics, which is assessed in a separate sample of schools from the reading and mathematics assessments.
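The burden-hour cells in Exhibit 1 follow directly from the respondent counts and the average minutes per response (hours = respondents × minutes ÷ 60). The short check below reproduces a few cells from the School Device Model panel; it is a verification aid only, not part of the burden methodology.

```python
# Reproduce a few Exhibit 1 burden-hour cells: hours = respondents * minutes / 60.

def burden_hours(respondents, minutes_per_response):
    return respondents * minutes_per_response / 60

# School Device Model, grade 4 operational (R/M) and pilot column:
print(burden_hours(141_060, 30))  # students      -> 70530.0 hours
print(burden_hours(3_728, 60))    # STS staff     ->  3728.0 hours
print(burden_hours(39_497, 15))   # SD/EL reviews ->  9874.25 (reported as 9,874)
```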



EXHIBIT 2

Total Annual Estimated Burden Time Cost for NAEP 2026 Assessments


| Data Collection Year | Number of Respondents | Number of Responses | Total Burden (in hours) |
|---|---|---|---|
| 2026 | 615,958 | 786,113 | 449,560 |


The estimated respondent burden across all these activities translates into an estimated total burden time cost of 449,560 hours,9 broken out by respondent group in the table below.


| Data Collection Year | Respondent group | Hours | Cost |
|---|---|---|---|
| 2026 | Students | 252,317 | $1,829,298 |
| 2026 | Teachers and School Staff | 190,507 | $6,566,790 |
| 2026 | Principals | 6,736 | $357,547 |
| 2026 | Total | 449,560 | $8,753,636 |
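The dollar figures above are the burden hours multiplied by the hourly rates given in footnote 9 ($7.25 for students, $34.47 for teachers and school staff, $53.08 for principals). The sketch below reproduces that calculation; small differences from the reported totals reflect rounding of the underlying hours.

```python
# Burden time cost = burden hours * hourly rate (rates from footnote 9).
HOURLY_RATE = {"Students": 7.25, "Teachers and School Staff": 34.47, "Principals": 53.08}
BURDEN_HOURS = {"Students": 252_317, "Teachers and School Staff": 190_507, "Principals": 6_736}

costs = {group: hours * HOURLY_RATE[group] for group, hours in BURDEN_HOURS.items()}
for group, cost in costs.items():
    print(f"{group}: ${cost:,.0f}")
print(f"Total: ${sum(costs.values()):,.0f}")
# Students: $1,829,298; Teachers and School Staff: $6,566,776 (reported $6,566,790);
# Principals: $357,547; the total is close to the reported $8,753,636.
```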


This burden estimate is lower than previously published estimates. The reduction reflects the removal of planned Special Studies and the correction of an error in the original calculation, which did not reflect the correct split between School Device Model and NAEP Device Model schools and therefore substantially inflated the estimated burden.


A.13. Cost to Respondents

There are no direct costs to respondents.

A.14. Estimates of Cost to the Federal Government

The total cost to the federal government for the administrations of the 2026 NAEP data collections (contract costs and NCES salaries and expenses) is estimated to be $129,534,907. The table below represents the 2026 assessment cost estimates as of February 2025; if the scope changes, any resulting changes in the costs will be reflected in a future Amendment.

| Cost category | Estimated cost |
|---|---|
| NCES salaries and expenses | $175,038 |
| Contract costs (total) | $129,359,869 |
| Contract costs: Scoring | $7,572,747 |
| Contract costs: Item Development | $21,080,351 |
| Contract costs: Sampling and Weighting | $5,156,781 |
| Contract costs: Data Collection (including materials distribution) | $63,198,559 |
| Contract costs: Recruitment and State Support | $1,100,351 |
| Contract costs: Design, Analysis, and Reporting | $12,698,094 |
| Contract costs: Securing and transferring DBA assessment data | $950,351 |
| Contract costs: NAEP system development | $17,602,635 |
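As an arithmetic check on the table above, the contract line items sum to the contract-cost subtotal, and adding NCES salaries and expenses yields the overall estimate; the short sketch below verifies this roll-up.

```python
# Verify the A.14 cost roll-up: line items -> contract subtotal -> total estimate.
line_items = {
    "Scoring": 7_572_747,
    "Item Development": 21_080_351,
    "Sampling and Weighting": 5_156_781,
    "Data Collection (including materials distribution)": 63_198_559,
    "Recruitment and State Support": 1_100_351,
    "Design, Analysis, and Reporting": 12_698_094,
    "Securing and transferring DBA assessment data": 950_351,
    "NAEP system development": 17_602_635,
}
contract_costs = sum(line_items.values())
total = contract_costs + 175_038  # NCES salaries and expenses
print(f"Contract costs: ${contract_costs:,}")  # $129,359,869
print(f"Total:          ${total:,}")           # $129,534,907
```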



A.15. Time Schedule for Data Collection and Publications

The time schedule for the data collection for the 2026 assessments is shown below.

| Activity | Schedule |
|---|---|
| NAEP 2026 Field Trial | Fall 2025 |
| NAEP 2026 Administration | January–March 2026 |



The grades 4, 8, and 12 reading and mathematics national and state results are typically released to the public around October of the same year (i.e., about 6-7 months after the end of data collection). All other operational assessments are typically released 12-15 months after the end of data collection. However, given the comparability study that compares the administration using school devices and NAEP devices, the analysis may require additional time and the results may be released later.

The operational schedule for the NAEP assessments generally follows the same schedule for each assessment cycle. The dates below show the likely timeframe for the 2026 state-level assessments. Any changes to this timeline will be provided in future Amendments.

  • Spring–Summer 2025: Select the school sample and notify schools; schools and districts complete the School Technology Survey. Schools that are eligible and qualified for school devices can begin deploying the NAEP Assessment Application.

  • October–November 2025: States, districts, or schools submit the list of students.

  • November 2025: Administer Field Trial.

  • December 2025: Select the student sample. Schools in School Device Model will complete installation of the NAEP Assessment Application.

  • December 2025–January 2026: Schools prepare for the assessments with support from the AMS system.

  • January–March 2026: Administer the assessments.

  • March–May 2026: Process the data, score constructed-response items, and calculate sampling weights.

  • March–September 2026: Analyze the data.

  • September–December 2026: Prepare the reports, obtaining feedback from reviewers.

  • January or February 2027 (Grades 4/8, Reading and Mathematics): Release the results.

  • June or July 2027 (Grade 8 U.S. history and civics, Grade 12 reading and mathematics): Release the results.


A.16. Approval for Not Displaying OMB Approval Expiration Date

No exception is requested.


A.17. Exceptions to Certification Statement

No exception is requested.

1 The role of NCES, led by the Commissioner for Education Statistics, is defined in 20 U.S.C. §9622 (https://www.law.cornell.edu/uscode/text/20/9622) and OMB Statistical Policy Directives No. 1 and 4 (https://obamawhitehouse.archives.gov/omb/inforeg_statpolicy).

2 The grade 12 economics teacher match rate was 56 percent in 2012. For comparison, the 2015 teacher match rates for grades 4 and 8 were approximately 94 percent and 86 percent, respectively.

3 See Section A.2 for more information about how NAEP results are reported.

4 See Section B.1.a for more information on the NAEP sampling procedures.

5 The Governing Board assessment schedule can be found at https://www.nagb.gov/about-naep/assessment-schedule.html.

6 Additional information on the AMS site is included in section B.2.

7 The current contracts expire at varying times. As such, the specific contracting organizations may change during the course of the time period covered under this submittal.

8 In early May, schools receive an email from the AMS reminding them to securely destroy the contents of the NAEP storage envelope and confirm that they have done so. The confirmation is recorded in the system and tracked.

9 The average hourly earnings of teachers and principals derived from May 2023 Bureau of Labor Statistics (BLS) Occupation Employment Statistics is $34.47 for teachers and school staff and $53.08 for principals. If mean hourly wage was not provided, it was computed assuming 2,080 hours per year. The exception is the student wage, which is based on the federal minimum wage of $7.25 an hour. Source: BLS Occupation Employment Statistics, http://data.bls.gov/oes/ datatype: Occupation codes: Elementary school teachers (25-2000); Middle school teachers (25-2000); High school teachers (25-2000); Principals (11-9030); last modified date May 2023.  
