CDAR2_IG_NHCS_R1_DSTU1.2_2016AUG_Vol1
HL7 CDA® R2 Implementation Guide:
National Health Care Surveys Release 1,
DSTU Release 1.2 –
US Realm
HL7 Draft Standard for Trial Use (DSTU)
August 2016
Volume 1 — Introductory Material
Sponsored by:
Public Health and Emergency Response Work Group
Structured Documents Work Group
Publication of this draft standard for trial use and comment has been approved by Health Level Seven
International (HL7). This draft standard is not an accredited American National Standard. The comment
period for use of this draft standard shall end 12 months from the date of publication. Suggestions for
revision should be submitted at http://www.hl7.org/dstucomments/index.cfm.
Following this 12 month evaluation period, this draft standard, revised as necessary, will be submitted to
a normative ballot in preparation for approval by ANSI as an American National Standard.
Implementations of this draft standard shall be viable throughout the normative ballot process and for up
to six months after publication of the relevant normative standard.
Copyright © 2016 Health Level Seven International ® ALL RIGHTS RESERVED. The reproduction of this
material in any form is strictly forbidden without the written permission of the publisher. HL7 and Health
Level Seven are registered trademarks of Health Level Seven International. Reg. U.S. Pat & TM Off.
Use of this material is governed by HL7's IP Compliance Policy
IMPORTANT NOTES:
HL7 licenses its standards and select IP free of charge. If you did not acquire a free license from HL7 for this
document, you are not authorized to access or make any use of it. To obtain a free license, please visit
http://www.HL7.org/implement/standards/index.cfm.
If you are the individual that obtained the license for this HL7 Standard, specification or other freely licensed
work (in each and every instance "Specified Material"), the following describes the permitted uses of the Material.
A. HL7 INDIVIDUAL, STUDENT AND HEALTH PROFESSIONAL MEMBERS, who register and agree to the terms
of HL7’s license, are authorized, without additional charge, to read, and to use Specified Material to develop and sell
products and services that implement, but do not directly incorporate, the Specified Material in whole or in part
without paying license fees to HL7.
INDIVIDUAL, STUDENT AND HEALTH PROFESSIONAL MEMBERS wishing to incorporate additional items of
Specified Material in whole or part, into products and services, or to enjoy additional authorizations granted to HL7
ORGANIZATIONAL MEMBERS as noted below, must become ORGANIZATIONAL MEMBERS of HL7.
B. HL7 ORGANIZATION MEMBERS, who register and agree to the terms of HL7's License, are authorized, without
additional charge, on a perpetual (except as provided for in the full license terms governing the Material), nonexclusive and worldwide basis, the right to (a) download, copy (for internal purposes only) and share this Material
with your employees and consultants for study purposes, and (b) utilize the Material for the purpose of developing,
making, having made, using, marketing, importing, offering to sell or license, and selling or licensing, and to otherwise
distribute, Compliant Products, in all cases subject to the conditions set forth in this Agreement and any relevant
patent and other intellectual property rights of third parties (which may include members of HL7). No other license,
sublicense, or other rights of any kind are granted under this Agreement.
C. NON-MEMBERS, who register and agree to the terms of HL7’s IP policy for Specified Material, are authorized,
without additional charge, to read and use the Specified Material for evaluating whether to implement, or in
implementing, the Specified Material, and to use Specified Material to develop and sell products and services that
implement, but do not directly incorporate, the Specified Material in whole or in part.
NON-MEMBERS wishing to incorporate additional items of Specified Material in whole or part, into products and
services, or to enjoy the additional authorizations granted to HL7 ORGANIZATIONAL MEMBERS, as noted above,
must become ORGANIZATIONAL MEMBERS of HL7.
Please see http://www.HL7.org/legal/ippolicy.cfm for the full license terms governing the Material.
Ownership. Licensee agrees and acknowledges that HL7 owns all right, title, and interest, in and to the
Trademark. Licensee shall take no action contrary to, or inconsistent with, the foregoing.
Licensee agrees and acknowledges that HL7 may not own all right, title, and interest, in and to the
Materials and that the Materials may contain and/or reference intellectual property owned by third parties
(“Third Party IP”). Acceptance of these License Terms does not grant Licensee any rights with respect to
Third Party IP. Licensee alone is responsible for identifying and obtaining any necessary licenses or
authorizations to utilize Third Party IP in connection with the Materials or otherwise. Any actions, claims or
suits brought by a third party resulting from a breach of any Third Party IP right by the Licensee remains the
Licensee’s liability.
Following is a non-exhaustive list of third-party terminologies that may require a separate license:
| Terminology | Owner/Contact |
| Current Procedural Terminology (CPT) code set | American Medical Association. http://www.ama-assn.org/ama/pub/physician-resources/solutions-managing-your-practice/coding-billing-insurance/cpt/cpt-products-services/licensing.page? |
| SNOMED CT | International Health Terminology Standards Development Organisation (IHTSDO). http://www.ihtsdo.org/snomed-ct/get-snomed-ct or info@ihtsdo.org |
| Logical Observation Identifiers Names & Codes (LOINC) | Regenstrief Institute |
| International Classification of Diseases (ICD) codes | World Health Organization (WHO) |
| NUCC Health Care Provider Taxonomy code set | American Medical Association. Please see www.nucc.org. AMA licensing contact: 312-464-5022 (AMA IP services) |
Structure of This Guide
Two volumes comprise this HL7 CDA® R2 Implementation Guide: National Health Care
Surveys Release 1, DSTU Release 1.2 - US Realm. Volume 1 provides narrative
introductory and background material pertinent to this implementation guide, including
information on how to understand and use the templates in Volume 2. Volume 2
contains the Clinical Document Architecture (CDA) templates for this guide along with
lists of templates, code systems, and value sets used.
Primary Editor: Sarah Gaunt, Lantana Consulting Group, sarah.gaunt@lantanagroup.com
Co-Editor: Michelle Williamson, MS, RN, CDC/NCHS, zup9@cdc.gov
PHER WG Co-Chair: Joginder Madra, Gordon Point Informatics Ltd., joginder.madra@gpinformatics.com
Co-Editor: Brian Gugerty, DNS, RN, CDC/NCHS, vaz6@cdc.gov
PHER WG Co-Chair: Erin Holt Coyne, MPH, Tennessee Department of Health, kpool@oz-systems.com
Co-Editor: Kristi Eckerson, MSPH, Emory University IPA to CDC/NCHS, kee8@cdc.gov
PHER WG Co-Chair: John Roberts, Tennessee Department of Health, john.a.roberts@tn.gov
Co-Editor: Hetty Khan, MGA, MS, RN, CDC/NCHS, hdk1@cdc.gov
PHER WG Co-Chair: Rob Savage, MS, Rob Savage Consulting, robsavage@att.net
Co-Editor: Ryan Murphy, Lantana Consulting Group, ryan.murphy@lantanagroup.com
SDWG Co-Chair: Calvin Beebe, Mayo Clinic, cbeebe@mayo.edu
Co-Editor: Tammara Jean Paul, PhD, CDC/NCHS, wro9@cdc.gov
SDWG Co-Chair: Rick Geimer, Lantana Consulting Group, rick.geimer@lantanagroup.com
Co-Editor: Sabrina Ridley, Global Evaluation & Applied Research Solutions, sridley@getingears.com
SDWG Co-Chair: Austin Kreisler, SAIC Consultant to CDC/NHSN, duz1@cdc.gov
Co-Editor: Clarice Brown, CDC/NCHS, crb6@cdc.gov
Co-Editor/SDWG Co-Chair: Gaye Dolin, MSN, RN, Intelligent Medical Objects, Inc., gdolin@imo-online.com
Co-Editor: Arsed Joseph, Global Evaluation & Applied Research Solutions, ajoseph@getingears.com
SDWG Co-Chair: Brett Marquard, River Rock Associates, brett@riverrockassociates.com
Technical Editor: Diana Wright, Lantana Consulting Group, diana.wright@lantanagroup.com
Co-Editor: Ryan Murphy, Lantana Consulting Group, ryan.murphy@lantanagroup.com
Acknowledgments
This guide was developed and produced under the guidance of the Centers for Disease
Control and Prevention/National Center for Health Statistics (CDC/NCHS) through the
collaboration of the Division of Health Care Statistics (DHCS) and the Classifications
and Public Health Data Standards Staff (CPHDSS).
The editors appreciate the support and sponsorship of the Health Level Seven (HL7)
Structured Documents Working Group (SDWG) and the HL7 Public Health and
Emergency Response Work Group (PHER WG).
This material contains content from SNOMED CT® (http://www.ihtsdo.org/snomed-ct/).
SNOMED CT is a registered trademark of the International Health Terminology
Standards Development Organisation (IHTSDO).
This material contains content from LOINC® (http://loinc.org). The LOINC table, LOINC
codes, and LOINC panels and forms file are copyright © 1995-2016, Regenstrief
Institute, Inc. and the Logical Observation Identifiers Names and Codes (LOINC)
Committee and available at no cost under the license at http://loinc.org/terms-of-use.
Contents
1 INTRODUCTION
  1.1 Note to Update Readers—Items for Comment
      Clinical Note and External Document Reference
      Chief Complaint and Reason for Visit Section
  1.2 Purpose
  1.3 Background
  1.4 Audience
  1.5 Organization of the Guide
      1.5.1 Volume 1 Introductory Material
      1.5.2 Volume 2 CDA Templates and Supporting Material
  1.6 Contents of the Package
2 CDA R2 BACKGROUND
3 DESIGN CONSIDERATIONS
  3.1 CDA Participations
  3.2 Rendering Header Information for Human Presentation
  3.3 Unknown and No Known Information
  3.4 Use of Qualifiers
4 USING THIS IMPLEMENTATION GUIDE
  4.1 Levels of Constraint
  4.2 Conformance Conventions Used in This Guide
      4.2.1 Errata or Enhancements
      4.2.2 Templates and Conformance Statements
      4.2.3 Template Versioning
      4.2.4 Open and Closed Templates
      4.2.5 Conformance Verbs (Keywords)
      4.2.6 Cardinality
      4.2.7 Optional and Required with Cardinality
      4.2.8 Vocabulary Conformance
      4.2.9 Containment Relationships
      4.2.10 Data Types
      4.2.11 Document-Level Templates "Properties" Heading
  4.3 XML Conventions Used in This Guide
      4.3.1 XPath Notation
      4.3.2 XML Examples and Sample Documents
5 REFERENCES
APPENDIX A — ACRONYMS AND ABBREVIATIONS
APPENDIX B — HIGH-LEVEL CHANGES FROM PREVIOUS RELEASES
APPENDIX C — EXTENSIONS TO CDA R2
APPENDIX D — MIME MULTIPART/RELATED MESSAGES
  RFC-2557 MIME Encapsulation of Aggregate Documents, Such as HTML (MHTML)
  Referencing Supporting Files in Multipart/Related Messages
  Referencing Documents from Other Multiparts within the Same X12 Transactions

Figures
Figure 1: Templated CDA
Figure 2: nullFlavor Example
Figure 3: Attribute Required (nullFlavor not allowed)
Figure 4: Allowed nullFlavors When Element is Required (with xml examples)
Figure 5: Unknown Medication Example
Figure 6: Unknown Medication Use of Anticoagulant Drug Example
Figure 7: No Known Medications Example
Figure 8: Value Known, Code for Value Not Known
Figure 9: Value Completely Unknown
Figure 10: Value Known, Code in Required Code System Not Known but Code from Another Code System is Known
Figure 11: Qualifier Example
Figure 12: Context Table Example: Asthma Diagnosis Observation
Figure 13: Constraints Overview Example: Asthma Diagnosis Observation
Figure 14: Constraints Format Example
Figure 15: Constraints Format – only one allowed
Figure 16: Constraints Format – only one like this allowed
Figure 17: Binding to a Single Code
Figure 18: XML Expression of a Single-code Binding
Figure 19: Translation Code Example
Figure 20: Example Value Set Table (Language)
Figure 21: XML Document Example
Figure 22: XPath Expression Example
Figure 23: ClinicalDocument Example

Tables
Table 1: Contents of the Package
1 INTRODUCTION
1.1 Note to Update Readers—Items for Comment
This update contains two volumes. Below are descriptions of items that may be
commented on in each volume.
Volume 1:
1. The body of the document up until the appendices MAY be commented on.
Volume 2: Templates that are new or changed MAY be commented on; templates that
are unchanged from the previous release MAY NOT be commented on.
2. Templates that are new or substantially revised are signified by “Draft as part of
National Health Care Surveys Release 1, DSTU Release 1.2 – US Realm” under
the template name. These MAY be commented on.
EXAMPLE:
Clinical Note and External Document Reference
[externalDocument: identifier
urn:hl7ii:2.16.840.1.113883.10.20.34.3.44:2016-07-01 (open)]
Draft as part of National Health Care Surveys Release 1, DSTU Release 1.2 – US Realm
3. Templates that have been brought in unchanged from the previous release have
“Published as part of [name of previous publication]” under the template name. These MAY NOT
be commented on.
EXAMPLE:
Chief Complaint and Reason for Visit Section
[section: identifier urn:oid:2.16.840.1.113883.10.20.22.2.13
(open)]
Published as part of Consolidated CDA Templates for Clinical Notes (US
Realm) DSTU R1.1
Changes made in this release are summarized in the Appendix in High-Level Changes
from Previous Releases. Volume 2 of this guide contains a detailed section on “Changes
from Previous Version”.
1.2 Purpose
This two-volume implementation guide contains an overview of Clinical Document
Architecture (CDA) markup standards, design, and use (Volume 1) and a collection of
CDA templates for the Centers for Disease Control and Prevention (CDC), National
Center for Health Statistics (NCHS), National Health Care Surveys applicable to the US
Realm (Volume 2). These two volumes constitute a Draft Standard for Trial Use (DSTU).
CDA templates included in Volume 2 represent healthcare data collected by the CDC
NCHS within the Division of Health Care Statistics (DHCS). The data are collected
through three surveys of ambulatory, inpatient, and outpatient care services in the
United States: the National Ambulatory Medical Care Survey (NAMCS), the National
Hospital Care Survey (NHCS) and the National Hospital Ambulatory Medical Care
Survey (NHAMCS).1 These surveys produce nationally representative data to answer key
questions about health care usage, quality, and disparities that are of interest to public
health professionals, researchers, and health care policy makers.
This implementation guide specifies National Health Care Surveys with three document
types:
Emergency Department Encounter, for data collected by NHCS and NHAMCS
Inpatient Encounter, for data collected by NHCS
Outpatient Encounter, for data collected by NHCS, NAMCS, and NHAMCS
1.3 Background
The NAMCS collects objective, reliable information about the provision and use of
ambulatory medical care services in the United States. Findings are based on a sample
of visits to non-federally employed, office-based physicians, as well as visits to health
care providers at community health centers.
The NHCS provides accurate and reliable health care statistics on the latest use of
hospitals and hospital-based care organizations in the United States. Findings are
based on all inpatient discharges and all emergency department and outpatient
department visits at a sample of non-federal, non-institutional hospitals with six or
more staffed inpatient beds.
The NHAMCS collects data on the use and provision of ambulatory care services in
hospital emergency departments, outpatient departments, and ambulatory surgery centers. Findings
are based on a national sample of visits to the emergency departments, outpatient
departments of general and short-stay hospitals, and ambulatory surgery centers.
While there are some differences (detailed in the guide), all three surveys capture
information about the patient, the visit, signs and symptoms, diagnoses, procedures,
medications, and discharge disposition.
Traditionally, human abstractors have collected NAMCS and NHAMCS data, while
NHCS data have been obtained by the electronic submission of administrative claims
(X12N Health Care Claim: Institutional Implementation Guide (837I)). This
implementation guide builds on the standard CDA visit report to allow:
Data from a greater number of visits to be collected
More complete data, especially clinical data, to be obtained by electronic means
than can be obtained by human abstractors or administrative claims
Enhancement of the surveys by incorporating readily available data such as the
patient problem list, and vital statistics measures including height and weight
Significantly more standardized data to be collected than previously

1 CDC, National Health Care Surveys. http://www.cdc.gov/nchs/dhcs.htm
The NAMCS and NHAMCS have traditionally required manual data abstraction. In the
NAMCS data collection in physician offices, U.S. Census field representatives (Field
Reps) visit physician practice locations to obtain the data. The Field Reps ask
physicians practice context and practice management questions. These "Physician
Induction" questions are at the practice level and are outside of the scope of this
implementation guide. Physicians are assigned a randomly-selected, one-week reporting
period, during which data for a random sample of patient visits are recorded by the
visiting Field Reps on an encounter form. Data captured include information on patient
symptoms, diagnoses, and medications. The form also includes information on
diagnostic procedures, patient management, and planned future treatment. Data are
entered into a computer-assisted tool and are later aggregated and sent back to NCHS
for data processing. The NAMCS data collection in community health centers is
conducted in a similar manner (e.g., induction questions, and visit data abstraction and
transmission) except that Field Reps collect information on visits seen by three
randomly-selected health care providers (including physicians, nurse practitioners,
physician assistants, and certified nurse midwives) practicing at the sampled
community health center. In NHAMCS, Field Reps conduct induction interviews with
the sampled hospitals and collect sampled visit data over a four-week reporting period
from randomly selected emergency services areas, outpatient department clinics, and
affiliated ambulatory surgery centers. Visit data captured on the patient and services
used are largely similar between NHAMCS and NAMCS. This manual data abstraction
process is cumbersome, resource intensive, costly, and effectively limits the data pool.
Automating the survey process using CDA streamlines data collection and facilitates
survey participation by providing all physicians and hospitals with a familiar and
standard process. Templates included in this guide align with the CDA R2 (Release 2)
implementation guide, which is the standard indicated by Meaningful Use
requirements. The templates in this guide expand on the scope of the original survey
data elements in that they do not constrain the data collected to the narrow lists on the
survey instruments, allowing data collection of any service, procedure, or diagnosis
recorded.
Implementers use this guide to submit data to fulfill requirements of the National
Health Care Surveys covered under this guide by automatic extraction of the data from
a practice's electronic health record (EHR) system or clinical data repository. In cases
where there is only partial fulfillment of the requirements of the National Health Care
Surveys covered under this guide by a practice’s use of this guide, Field Reps may be
sent into the practice to complete the requirements. In these cases, Field Rep data
collection forms will be pre-populated with the data enabled by this guide, thus
significantly reducing the data collection burden.
Although EHR extraction offers new potential for automating the survey process or
parts thereof, the challenges of automating data extraction are acknowledged in
literature. For example, according to Garrido T, et al. (2013),2 "Even with improved
standardization of terminologies and codes, EHR content, structure, and data format
vary, as do local data capture and extraction procedures." NCHS is and has been
dealing with EHR content, structure, and data format challenges already, even with
manual abstraction. We believe that this implementation guide will promote movement
towards standardization of EHR content, structure, data format, and data capture and
extraction procedures for data elements of interest to the surveys—such as diagnoses,
medications, and procedures. Such data are also of interest to a wide variety of other
stakeholders.

2 Garrido T, et al. "e-Measures: insight into the challenges and opportunities of automating publicly
reported quality measures." J Am Med Inform Assoc. Jan 2014;21(1):181-184. doi: 10.1136/amiajnl-2013-001789.
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3912717/
We agree with Garrido that "Within a single institution, significant differences in
denominators, numerators, and rates arise from different electronic data sources, and
documentation habits of providers vary. Data entered into the EHR may not be
interpreted or recognized, resulting in substantial numerator loss and underestimates
of the delivery of clinical preventive services." It is important, however, to note that the
National Health Care Surveys are not used to evaluate quality of care within single
institutions or via clinical quality measures within single or multiple institutions. These
concerns are, therefore, not relevant to the National Health Care Surveys. NCHS already
deals with varying provider documentation habits in the current process via paper and
EHR manual abstraction and will closely monitor the effects of varying provider
documentation habits during EHR extraction. This implementation guide is published
as a DSTU, allowing users to comment during the trial period (see Errata or
Enhancements). Instructions for submitting comments are available on the Health Level
Seven (HL7) STU Comments site at: http://www.hl7.org/dstucomments/. The data
collection process will be reviewed for accuracy of automated reporting and to ensure
that new extraction procedures do not excessively burden clinicians or their supporters.
NCHS will do this through planned implementation and collection trials. NCHS plans to
submit the results of this evaluation for publication.
The intent of this implementation guide is to obtain as much survey information as
possible from data currently available in EHRs. It is understood that not all of the data
items indicated on the surveys may be captured by EHR systems at this time.
Submission of survey data from EHRs that do not contain all of the desired data
elements specified in this implementation guide can be accepted, but each survey
submission must include all of the required data elements specified in the
implementation guide. Some of the survey data elements that are not common in EHRs
at present have been included in a Health Statistics profile of the HL7 EHR-S Public
Health Functional profile. Future EHR functionality will address this gap. If participants
in these surveys wish to document additional details to meet the survey requirements
now by configuring encounter forms or other templates in the EHR, they may do so;
however, this is not required for submission and this implementation guide does not
give a site guidance on how to do so.
1.4 Audience
The audience for this implementation guide includes the architects and developers of
healthcare information technology (HIT) systems in the US Realm that exchange patient
clinical data in ambulatory care settings.
1.5 Organization of the Guide
This implementation guide is organized into two volumes. Volume 1 contains primarily
narrative text describing this guide to the three National Health Care Surveys, whereas
Volume 2 contains normative CDA template definitions.
1.5.1 Volume 1 Introductory Material
This document, Volume 1, provides an overview of CDA and information on how to
understand and use the CDA templates provided in Volume 2.
Chapter 1—Introduction
Chapter 2—CDA R2 Background. This chapter contains selected background
material on the CDA Release 2 (CDA R2) base standard, to aid the reader in
conceptualizing the "templated CDA" approach to implementation guide
development.
Chapter 3—Design Considerations. This chapter includes design considerations
that describe overarching principles applied across the CDA templates in this
guide. Material in this chapter can be thought of as "heuristics", as opposed to
the formal and testable constraints found in Volume 2 of this guide.
Chapter 4—Using This Implementation Guide. This chapter describes the rules
and formalisms used to constrain the CDA R2 base standard. It describes the
formal representation of CDA templates, the mechanism by which templates are
bound to vocabulary, and additional information necessary to understand and
correctly implement the normative content found in Volume 2 of this guide.
Appendices. The Appendices include an overview of changes from the previous
release, a summary of extensions to CDA R2, and an excerpt of the Health Level
Seven (HL7) Additional Information Specification Implementation Guide covering
MIME Multipart/Related Messages.
1.5.2 Volume 2 CDA Templates and Supporting Material
Volume 2 includes CDA templates and prescribes their use for a set of specific
document types representing the National Health Care Surveys. The main chapters are:
Chapter 1—Document-Level Templates. This chapter defines the US Realm
Header template that applies across three document types representing the
Emergency Department Encounter (NHCS-ED, NHAMCS-ED), Inpatient
Encounter (NHCS-IP), and Outpatient Encounter (NHCS-OPD, NAMCS,
NHAMCS-OPD). It defines each of the document types and header constraints
specific to each, as well as the section-level templates (required and optional) for
each.
Chapter 2—Section-Level Templates. This chapter defines the section templates
referenced within the document types. Sections are atomic units, and can be
reused by future specifications.
Chapter 3—Entry-Level Templates. This chapter defines entry-level templates,
called clinical statements. Machine-processable (coded) data are sent in the
entry templates. The entry templates are referenced by one or more section
templates. Entry-level templates are always contained in section-level templates,
and section-level templates are always contained in a document.
Chapter 4—Participation and Other Templates. This chapter defines templates
for CDA participants (e.g., author, performer) and other fielded items (e.g.,
address, name) that cannot stand on their own without being nested in another
template.
Chapters 5-7 include template IDs, value sets, and code systems used in this
guide.
Chapter 8—Changes from Previous Version. This chapter provides detailed
change logs.

1.6 Contents of the Package
The following files comprise the implementation guide package:
Table 1: Contents of the Package

| Filename | Description | Standards Applicability |
| CDAR2_IG_NHCS_R1_DSTU1.2_2016JUL_V1_Introductory_Material.docx | Implementation Guide Introductory Material | Normative |
| CDAR2_IG_NHCS_R1_DSTU1.2_2016JUL_V2_Templates_and_Supporting.docx | Implementation Guide Template Library and Supporting Material | Normative |
| CDAR2_IG_NHCS_R1_DSTU1.2_2016JUL_IPE.xml | Inpatient Encounter Sample | Informative |
| CDAR2_IG_NHCS_R1_DSTU1.2_2016JUL_OPE.xml | Outpatient Encounter Sample | Informative |
| CDAR2_IG_NHCS_R1_DSTU1.2_2016JUL_EDE.xml | Emergency Department Sample | Informative |
| CDA.xsl | Stylesheet for rendering | Informative |
| _readme.txt | Text file describing contents of the package | Informative |
2 CDA R2 BACKGROUND
CDA is "… a document markup standard that specifies the structure and semantics of
‘clinical documents’ for the purpose of exchange" [CDA R2, Section 1.1].3 Clinical
documents, according to CDA, have the following characteristics:
Persistence
Stewardship
Potential for authentication
Context
Wholeness
Human readability
CDA defines a header for classification and management and a document body that
carries the clinical record. While the header metadata are prescriptive and designed for
consistency across all instances, the body is highly generic, leaving the designation of
semantic requirements to implementation.
CDA R2 can be constrained by mechanisms defined in the "Refinement and
Localization"4 section of the HL7 Version 3 Interoperability Standards. The mechanism
most commonly used to constrain CDA is referred to as "templated CDA". In this
approach, a library is created containing modular CDA templates such that the
templates can be reused across any number of CDA document types, as shown in the
following figure.
Figure 1: Templated CDA

3 HL7 CDA Release 2. http://www.hl7.org/implement/standards/product_brief.cfm?product_id=7
4 HL7 Version 3 Standard.
http://www.hl7.org/v3ballot/html/infrastructure/conformance/conformance.htm (Login required.)

There are many different kinds of templates that might be created. Among them, the
most common are:
Document-level templates: These templates constrain fields in the CDA
header, and define containment relationships to CDA sections. For example, an
NAMCS document-level template might require that the provider’s ID be present,
and that the document contain a Services section.
Section-level templates: These templates constrain fields in the CDA section,
and define containment relationships to CDA entries. For example, a Services
section-level template might require that the section/code be fixed to a
particular LOINC code, and that the section contain a Provided Service
Observation.
Entry-level templates: These templates constrain the CDA clinical statement
model in accordance with real world observations and acts. For example, a
Provided Service Observation entry-level template defines how the CDA
Observation class is constrained (how to populate observation/code, how to
populate observation/value, etc.) to represent the notion of a particular
observation.
A CDA implementation guide (such as this one) includes reference to those templates
that are applicable. On the implementation side, a CDA instance populates the template
identifier (templateId) field where it wants to assert conformance to a given template.
On the receiving side, the recipient can both test the instance for conformance against
the CDA XML (Extensible Markup Language) schema and test the instance for
conformance against asserted templates.
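To make the templateId mechanism concrete, the following is a minimal sketch of a
document instance asserting conformance to a document-level template. The OID and
extension shown are placeholders chosen for illustration, not identifiers defined by
this guide.

  <ClinicalDocument xmlns="urn:hl7-org:v3">
    <!-- placeholder identifier; a real instance would cite the OID (and, where
         applicable, the version extension) of the document-level template it
         claims to conform to -->
    <templateId root="2.16.840.1.113883.3.999.1" extension="2016-07-01"/>
    <!-- remainder of the header and body omitted -->
  </ClinicalDocument>

A recipient would first validate such an instance against the CDA R2 schema and then
apply the conformance rules of each template whose templateId is asserted.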
3 DESIGN CONSIDERATIONS
Design considerations describe overarching principles that have been developed and
applied across the CDA templates in this guide. Material in this chapter can be thought
of as "heuristics", as opposed to the formal and testable constraints found in Volume 2
of this guide.
3.1 CDA Participations
A CDA participant (e.g., Author, Informant), per the Reference Information Model (RIM),
is "an association between an Act and a Role with an Entity playing that Role. Each
Entity (in a Role) involved in an Act in a certain way is linked to the act by one
Participation-instance. The kind of involvement in the Act is specified by the
Participation.typeCode".
CDA principles when asserting participations include:
Participation persistence: An object's participations (and participation time
stamps) don't change just because that object is reused. For instance,
authorship of an object doesn't change just because that object is now included
in a summary document.
Participation evolution: Additional participations (and participation time
stamps) can be ascribed to an object over its lifetime.
Device participation: Devices do not participate as legally responsible entities,
but can participate as authors in some scenarios.
Meaningful Use Stage 2 criterion §170.314(b)(4) Clinical Information Reconciliation
requires a system to "simultaneously display (i.e., in a single view) the data from at least
two list sources in a manner that allows a user to view the data and their attributes,
which must include, at a minimum, the source and last modification date".5
CDA requires that Author and Author time stamp be asserted in the document header.
From there, authorship propagates to contained sections and contained entries, unless
explicitly overridden. Thus, all entries in CDA implicitly include Author and Author time
stamp.
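As an illustration of an Author participation and its time stamp in the document
header, the sketch below uses invented identifier and name values; it is not a
constraint of this guide.

  <author>
    <!-- author participation time stamp -->
    <time value="20160801103000-0500"/>
    <assignedAuthor>
      <!-- invented identifier for illustration (root shown is the NPI OID) -->
      <id root="2.16.840.1.113883.4.6" extension="9999999999"/>
      <assignedPerson>
        <name>
          <given>Jane</given>
          <family>Smith</family>
        </name>
      </assignedPerson>
    </assignedAuthor>
  </author>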
3.2 Rendering Header Information for Human Presentation
Good practice recommends that the following be present whenever the document is
viewed:
Document title and document dates
Service and encounter types, and date ranges as appropriate
Names of all persons along with their roles, participations, participation date
ranges, identifiers, address, and telecommunications information
Names of selected organizations along with their roles, participations,
participation date ranges, identifiers, address, and telecommunications
information
Date of birth for recordTarget(s)

5 HHS, Standards, Implementation Specifications, and Certification Criteria for EHR Technology (Final
Rule). http://www.gpo.gov/fdsys/pkg/FR-2012-09-04/pdf/2012-20982.pdf
3.3 Unknown and No Known Information
Information technology solutions store and manage data, but sometimes data are not
available. An item may be unknown, not relevant, or not computable or measurable,
such as where a patient arrives at an emergency department unconscious and with no
identification.
In many cases, the implementation guide will stipulate that a piece of information is
required (e.g., via a SHALL conformance verb). However, in most of these cases, the
standard provides an "out", allowing the sender to indicate that the information isn’t
known.
Here, we provide guidance on representing unknown information. Further details can
be found in the HL7 V3 Data Types, Release One specification that accompanies the
CDA R2 base standard. However, it should be noted that the focus is on the
unambiguous representation of known data, and that in general, the often subtle
nuances of unknown information representation are less relevant to the recipient.
Many fields contain an "@nullFlavor" attribute, used to indicate an exceptional value.
Some flavors of Null are used to indicate that the known information falls outside of
value set binding constraints. Not all uses of the @nullFlavor attribute are associated
with a case in which information is unknown. Allowable values for populating the
attribute give more details about the reason the information is unknown, as shown in
the following example.
Figure 2: nullFlavor Example
 
Use null flavors for unknown, required, or optional attributes:

NI: No information. This is the most general and default null flavor.
NA: Not applicable. Known to have no proper value (e.g., last menstrual period for a male).
UNK: Unknown. A proper value is applicable, but is not known.
ASKU: Asked, but not known. Information was sought, but not found (e.g., the patient was asked but did not know).
NAV: Temporarily unavailable. The information is not available, but is expected to be available later.
NASK: Not asked. The patient was not asked.
MSK: There is information on this item available but it has not been provided by the sender due to security, privacy, or other reasons. There may be an alternate mechanism for gaining access to this information.
OTH: The actual value is not an element in the value domain of a variable (e.g., concept not provided by required code system).
The list above contains those null flavors that are commonly used in clinical
documents. For the full list and descriptions, see the nullFlavor vocabulary domain
in the CDA R2 normative edition.
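For illustration, a sketch of how @nullFlavor can be populated on individual
elements; the elements chosen here are arbitrary examples, not requirements of this
guide.

  <!-- a proper value is applicable but not known -->
  <birthTime nullFlavor="UNK"/>
  <!-- information was sought (the patient was asked) but not obtained -->
  <telecom nullFlavor="ASKU"/>
  <!-- known to have no proper value -->
  <effectiveTime nullFlavor="NA"/>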
Any SHALL, SHOULD and MAY conformance statement may use nullFlavor, unless the
nullFlavor is explicitly disallowed (e.g., through another conformance statement
which includes a SHALL conformance for a vocabulary binding to the @code attribute, or
through an explicit SHALL NOT allow use of nullFlavor conformance).
Figure 3: Attribute Required (nullFlavor not allowed)
1. SHALL contain exactly one [1..1] code (CONF:15407).
   a. This code SHALL contain exactly one [1..1] @code="11450-4" Problem List
      (CodeSystem: LOINC 2.16.840.1.113883.6.1) (CONF:15408).
or
2. SHALL contain exactly one [1..1] effectiveTime/@value (CONF:5256).
Figure 4: Allowed nullFlavors When Element is Required (with xml examples)
1. SHALL contain at least one [1..*] id
2. SHALL contain exactly one [1..1] code
3. SHALL contain exactly one [1..1] effectiveTime
New Grading system
Spiculated mass grade 5
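A sketch of how an entry could satisfy the three SHALL constraints above while still
conveying unknown information is shown below. The observation wrapper and attribute
choices are assumptions made for illustration; the free-text fragments above are
carried as originalText.

  <observation classCode="OBS" moodCode="EVN">
    <!-- id is required (constraint 1); no information is available -->
    <id nullFlavor="NI"/>
    <!-- code is required (constraint 2); the concept comes from outside the
         required code system, so OTH is used with the original text -->
    <code nullFlavor="OTH">
      <originalText>New Grading system</originalText>
    </code>
    <!-- effectiveTime is required (constraint 3); the time is simply unknown -->
    <effectiveTime nullFlavor="UNK"/>
    <value xsi:type="CD" nullFlavor="OTH">
      <originalText>Spiculated mass grade 5</originalText>
    </value>
  </observation>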
If a sender wants to state that a piece of information is unknown, the following
principles apply:
1. If the sender doesn’t know an attribute of an act, that attribute can be null.
Figure 5: Unknown Medication Example
patient was given a medication but I do not know what it was
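A sketch of this principle for the medication case named in the figure; the
substanceAdministration/consumable structure is an assumption chosen for
illustration, with the free text carried as originalText.

  <substanceAdministration classCode="SBADM" moodCode="EVN">
    <consumable>
      <manufacturedProduct>
        <manufacturedMaterial>
          <!-- the medication itself is not known -->
          <code nullFlavor="NI">
            <originalText>patient was given a medication but I do not know what it was</originalText>
          </code>
        </manufacturedMaterial>
      </manufacturedProduct>
    </consumable>
  </substanceAdministration>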
Value set tables are present below a template, or are referenced if they occur elsewhere
in the specification, when there are value set bindings in the template. The value set
table provides the value set identifier, a description, and a link to the source of the
value set when possible. Ellipses in the last row indicate the value set members shown
are examples and the true source must be accessed to see all members.
If a value set binding has DYNAMIC stability, implementers creating a CDA document
must check the location identified by the Uniform Resource Locator (URL) for the most
current version of the value set expansion.
Figure 20: Example Value Set Table (Language)
Value Set: Language 2.16.840.1.113883.1.11.11526
A value set of codes defined by Internet RFC 4646 (replacing RFC 3066). Please see ISO 639
language code set maintained by Library of Congress for enumeration of language codes.
Value Set Source: http://www.ietf.org/rfc/rfc4646.txt
| Code | Code System | Code System OID | Print Name |
| aa | Language | 2.16.840.1.113883.6.121 | Afar |
| ab | Language | 2.16.840.1.113883.6.121 | Abkhazian |
| ace | Language | 2.16.840.1.113883.6.121 | Achinese |
| ach | Language | 2.16.840.1.113883.6.121 | Acoli |
| ada | Language | 2.16.840.1.113883.6.121 | Adangme |
| ady | Language | 2.16.840.1.113883.6.121 | Adyghe; Adygei |
| ae | Language | 2.16.840.1.113883.6.121 | Avestan |
| af | Language | 2.16.840.1.113883.6.121 | Afrikaans |
| ... | | | |
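For illustration, one place a code from this value set commonly appears is the
patient's languageCommunication in the CDA header; the element choice below is
illustrative only, not a constraint of this guide.

  <recordTarget>
    <patientRole>
      <patient>
        <languageCommunication>
          <!-- "af" (Afrikaans) drawn from the Language value set above -->
          <languageCode code="af"/>
        </languageCommunication>
      </patient>
    </patientRole>
  </recordTarget>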
4.2.9 Containment Relationships
Containment constraints between a section and its entry are indirect in this guide,
meaning that where a section asserts containment of an entry, that entry can either be
a direct child or a further descendent of that section.
For example, in the following constraint:
1. SHALL contain at least one [1..*] entry (CONF:8647) such that it
a. SHALL contain exactly one [1..1] Advance Directive Observation
(templateId:2.16.840.1.113883.10.20.22.4.48) (CONF:8801).
the Advance Directive Observation can be a direct child of the section (i.e.,
section/entry/AdvanceDirectiveObservation) or a further descendent of that
section (i.e., section/entry/…/AdvanceDirectiveObservation). Either of these is
conformant.
All other containment relationships are direct, for example:
1. SHALL contain exactly one [1..1]
templateId/@root="2.16.840.1.113883.10.20.22.2.21" (CONF:7928).
The templateId must be a direct child of the section (i.e., section/templateId).
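A sketch of both situations in one section is shown below; attributes and required
children are omitted, and the use of an organizer for the nested case is an
assumption chosen only to show indirect containment.

  <section>
    <!-- direct containment: templateId is an immediate child of section -->
    <templateId root="2.16.840.1.113883.10.20.22.2.21"/>
    <entry>
      <!-- Advance Directive Observation as a direct child of entry -->
      <observation classCode="OBS" moodCode="EVN">
        <templateId root="2.16.840.1.113883.10.20.22.4.48"/>
      </observation>
    </entry>
    <entry>
      <!-- the same observation nested one level deeper is also conformant -->
      <organizer classCode="CLUSTER" moodCode="EVN">
        <component>
          <observation classCode="OBS" moodCode="EVN">
            <templateId root="2.16.840.1.113883.10.20.22.4.48"/>
          </observation>
        </component>
      </organizer>
    </entry>
  </section>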
4.2.10 Data Types
All data types used in a CDA document are described in the CDA R2 standard. All
attributes of a data type are allowed unless explicitly prohibited by this specification.
4.2.11 Document-Level Templates "Properties" Heading
In Volume 2 of this implementation guide, each document-level template has a
"Properties" heading for ease of navigation. The Properties heading is an organizational
construct, underneath which relevant CDA act-relationships and roles are called out as
headings in the document.
4.3 XML Conventions Used in This Guide
4.3.1 XPath Notation
Instead of the traditional dotted notation used by HL7 to represent RIM classes, this
document uses XML Path Language (XPath) notation10 in conformance statements and
elsewhere to identify the Extensible Markup Language (XML) elements and attributes
within the CDA document instance to which various constraints are applied. The
implicit context of these expressions is the root of the document. This notation provides
a mechanism that will be familiar to developers for identifying parts of an XML
document.
XPath statements appear in this document in a monospace font.
XPath syntax selects nodes from an XML document using a path containing the context
of the node(s). The path is constructed from node names and attribute names (prefixed
by a ‘@’) and concatenated with a ‘/’ symbol.
10 XML Path Language. http://www.w3.org/TR/xpath/
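For illustration, given a fragment such as the one sketched below (the code value is
arbitrary), the XPath expression ClinicalDocument/code/@code selects the value of the
@code attribute on the code element that is a direct child of ClinicalDocument.

  <ClinicalDocument xmlns="urn:hl7-org:v3">
    <!-- arbitrary document type code for illustration -->
    <code code="34133-9" codeSystem="2.16.840.1.113883.6.1"/>
  </ClinicalDocument>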
Figure 21: XML Document Example
...