State Public Health Data Reporting Policies and Practices Vary Widely

Nationwide analysis outlines opportunities to improve data for disease detection and prevention

Overview

When public health agencies lack access to clinical data, illnesses spread undetected, the health system becomes overburdened, and health care costs, illnesses, and deaths rise. The water crisis in Flint, Michigan, and the COVID-19 pandemic demonstrate shortcomings in the collection of public health data and their ramifications.1

By collecting and analyzing better data, public health agencies can more effectively identify and prevent the spread of emerging threats; anticipate surges of illnesses and deploy resources where they are needed most; identify communities that are disproportionately threatened by disease and lack access to care; and design more effective and equitable interventions. Agencies can also share more information—with health care providers to improve clinical practice, with scientists to aid research, with elected officials to inform policy, and with members of the general public to guide their daily behavior.

Most health care providers already use electronic health records (EHRs) to manage their patients’ medical information. By employing those products to automatically report data about infectious diseases and other health threats to public health agencies, providers can significantly reduce or eliminate the burden of reporting while improving the timeliness, accuracy, and completeness of the data they share. Unfortunately, a sizable proportion of case data is still reported via manual processes such as fax and phone, which creates administrative work, introduces human error, and slows the analysis and use of information. And when diseases spread hour by hour and day by day, speed matters.

Policies and practices designed to drive automated electronic reporting vary among providers and states. To modernize public health data, federal and state policymakers, public health officials, health care providers, and health IT developers need a baseline understanding of how jurisdictions are regulating and promoting automated electronic reporting.

This report from The Pew Charitable Trusts provides a first-of-its-kind analysis of the policies and practices across all 50 states and the District of Columbia (referred to as jurisdictions throughout the report) that govern how clinical data flows to state public health agencies via four methods: case reports, lab reports, syndromic surveillance, and immunization information systems.

  • Case and lab reports provide state and local health departments with information about individual patients with communicable diseases, environmental illnesses, or other conditions of public health importance (for example, some types or instances of cancer or lead poisoning).2 Public health agencies use the data to monitor and detect threats and design strategies to reduce disease spread and impact.
  • Syndromic surveillance is an early-warning system that uses de-identified reports (with identities redacted) of symptoms and syndromes captured primarily from emergency departments. The data enables agencies to detect and identify emerging threats quickly and perform real-time monitoring of known health risks.
  • Immunization information systems collect data on individual vaccinations—especially for childhood vaccines—to track immunization rates, identify communities that may lack access to vaccines, and investigate cases and outbreaks of vaccine-preventable diseases. Patients and providers also use the systems to keep track of their immunization records.

The report is informed by the direct examination of each jurisdiction’s public health statutes and regulations (conducted from May to August 2021) and by interviews with 266 public health officials in the District of Columbia and every state except Kansas, Maine, and Wyoming (conducted from October 2022 to April 2023). Researchers designed the interviews to supplement the policy analysis with more on-the-ground feedback about how these policies are implemented. Together, the findings about policy and practice provide a comprehensive view of what is driving data modernization. Although these interviews covered topics related to jurisdictions’ exchange of data with health care providers and other government institutions, researchers interviewed only state officials; thus, this report does not reflect the perspectives of health care providers or federal, local, tribal, or territorial institutions, except for the District of Columbia.

Among the key findings:

  • Each state has unique needs and capacities. There is no single approach to modernizing public health data, but jurisdictions can still learn from each other and adopt common approaches and solutions.
  • Federal and state agencies that govern public health data can do more to improve reporting by enabling and encouraging health care providers to meet requirements, rather than through enforcement. For example, although only 13 jurisdictions require syndromic surveillance reporting, participation is widespread nationally, in part because of Medicare incentives.
  • State public health officials commonly cited staffing shortages, outdated IT infrastructure, and inadequate funding as the main reasons they have not invested in the modern IT systems and expert staff needed to collect, analyze, and use electronic data.
  • State public health officials said some providers—including smaller and rural providers—do not have the resources to invest in automated electronic reporting systems, and officials are concerned that requiring them to do so may discourage reporting altogether. For smaller providers, some jurisdictions have created alternative methods to enable them to report more data electronically without having to adopt expensive technologies.
  • Positive signs demonstrate that more case reports can be automated and electronic, as opposed to manually transmitted via fax and phone. This will help improve the timeliness and accuracy of the data that public health agencies need to detect and prevent diseases more effectively.

Policymakers and public health practitioners can take several steps to modernize public health data reporting:

  • States should measure their baseline performance on data reporting—such as automated electronic reporting as a proportion of all reports, data timeliness, and completeness—to prioritize areas that need the most improvement, inform policies and strategies, and track progress. Officials could also benefit from engaging their counterparts in peer states to better understand how policy, staffing, funding, and other practices affect data quality. States can also share their experience with federal agencies to inform national data-modernization efforts.
  • Federal and state policymakers should explore ways to improve data from under-resourced health care providers that cannot afford automated electronic systems, such as incentivizing the use of web portals or batch uploads for providers that report at low volumes. These systems are not fully automated, but they can still reduce the burden on providers and public health agencies and improve data quality.
  • As the federal government works to standardize public health data exchange, it should engage health care providers and public health agencies to ensure that the incentives, standards, and processes they design are practical and achievable for a broad array of data reporters and users. In addition, state and local governments should provide their public health agencies with the resources needed to participate in federal data-modernization efforts.

Glossary

These terms are defined in the context of public health data reporting. Some may have different connotations in other fields of health and technology.

Automated electronic reporting. The use of health information technology (e.g., electronic health record systems) by health care providers (such as doctors, pharmacists, vaccine administrators) to send clinical data (e.g., symptoms, diagnoses, treatments, outcomes, demographic details) to state, tribal, local, territorial, and/or federal public health agencies.

Batch upload. Transmitting data files that contain large sets of clinical data, typically for more than one patient and optimally in standardized formats that allow for rapid processing and analysis.

BioSense Platform. Defined by the Centers for Disease Control and Prevention (CDC) as “a cloud-based early detection and monitoring system that helps the public health community protect Americans from injuries, emerging diseases, and environmental disasters.” See also National Syndromic Surveillance Program (NSSP).3

Case report. A set of individualized patient data generated by a clinical care organization and submitted to a public health agency as required by law or regulation. It often includes symptoms, lab results, diagnoses, treatments, outcomes, demographic data, and other details extracted from an electronic health record.

Data Modernization Initiative. The CDC’s “multi-year, multi-billion-dollar effort to modernize data across the federal and state public health landscape.”4

Electronic case reporting (eCR). Per the CDC, “the automated, real-time exchange of case report information between electronic health records and public health agencies.”5 The general, unabbreviated term may refer to any electronic technology used to transmit clinical case data but not necessarily via automated systems. All eCR technologies fall into the category of electronic case reporting, but not all electronic case reporting qualifies as eCR.

Electronic health record (EHR). Per the Office of the National Coordinator for Health IT (ONC), “a digital (computerized) version of patients’ paper charts. … EHRs are real-time, patient-centered records. They make information available instantly, ‘whenever and wherever it is needed.’ And they bring together in one place everything about a patient’s health.”6

Electronic initial case report (eICR). A standards-based electronic file that is generated by an EHR and transmitted to a public health agency to support eCR.

Electronic lab reporting (ELR). According to the CDC, ELR is “the transmission of digital laboratory reports from laboratories to health care and public health partners.”7

Electronic reporting. The use of any electronic tool to report data to public health agencies. All eCR and ELR technologies fall into the broader category of electronic reporting, but states may define electronic reporting in policy in broad ways that include manual technologies such as fax or email.

Flat file. A single table of data in a simple, text-based format.

Health care provider. A trained and licensed clinical care professional (for example, primary care physician or nurse practitioner, specialist, school nurse, pharmacist, vaccine administrator) and/or facility offering health care (such as a hospital, emergency department, pharmacy, school). Throughout the report, context indicates when the term refers to individuals, hospitals, and/or health systems.

Health equity. The guiding principle that health is a fundamental human right and that all people—regardless of their identity, income, or location—should have access to the resources they need to be as healthy as possible, such as effective medical care, quality education, safe and affordable housing, nutritious food, and clean air and water.

Health information exchange (HIE). An organization or technological platform that facilitates the electronic sharing of health data, in compliance with all applicable patient privacy laws. It may also refer to the act of exchanging data via those organizations or platforms.

Health Insurance Portability and Accountability Act (HIPAA). The 1996 federal law that establishes the rights of patients to protect the privacy and confidentiality of personally identifiable information in their medical and health insurance records.8 It allows patients to access their own records and health care providers to share certain information with authorized recipients, including public health agencies when states designate the information as reportable.

Health Level Seven International (HL7). A not-for-profit “standards developing organization dedicated to providing a comprehensive framework and related standards for the exchange, integration, sharing and retrieval of electronic health information that supports clinical practice and the management, delivery and evaluation of health services.”9

Immunization Information Systems (IIS). Per the American Immunization Registry Association, “previously known as immunization registries, [IIS] are confidential, population-based, computerized databases that record all immunization doses administered by participating providers to persons residing within a given geopolitical area.”10

Interoperability. The ability of two or more electronic systems to exchange health information and use the information once it is received.

Lab report. A set of data that a lab generates and submits to a public health agency that may include lab test orders from health care providers, results, and other patient information.

Manual reporting. The use of fax, phone, mail, and other technologies to report case, lab, or immunization data to public health agencies. As opposed to automated electronic reporting, these processes require health care providers and public health officials to enter, transmit, receive, and process the data, which causes delays and may introduce errors.

National Syndromic Surveillance Program (NSSP). The CDC’s effort to collect electronic data from emergency departments across the United States. It collects and manages the data via the BioSense Platform.11

Notifiable diseases. Diseases and other health conditions that state and local health agencies report to the CDC.

Policy. Unless otherwise noted in this report, it refers to the CDC’s definition: “a law, regulation, procedure, administrative action, incentive, or voluntary practice of governments and other institutions.”12

Promoting Interoperability Program. A Medicare program that uses reimbursement incentives to promote public health data reporting via electronic health records that are certified by the ONC.

Reportable diseases. Diseases and other health conditions that health care providers must report to state, tribal, local, or territorial public health agencies, usually communicable diseases and environmental hazards that can pose a significant health threat. State, tribal, local, or territorial policies designate which diseases must be reported.

Reporters/data reporters. Health care providers who submit clinical data to public health agencies.

Surveillance/disease surveillance. The public health practice of using data to detect and monitor emerging and existing threats, especially communicable diseases.

Syndromic surveillance. An early-warning system that uses de-identified reports of symptoms and syndromes captured primarily from emergency departments. The data enables agencies to detect and identify emerging threats quickly and perform real-time monitoring of known health threats.

Key findings and themes

The following table summarizes data-reporting policies of the 50 states and District of Columbia.

Table 1

Where is reporting required in statute or regulation?
  • Case reports: 51 jurisdictions
  • Lab reports: 51 jurisdictions
  • Syndromic surveillance: 13 jurisdictions*
  • Immunization: 34 jurisdictions for all ages; 11 jurisdictions for only childhood immunizations†

Where is automated electronic reporting required in statute or regulation?
  • Case reports: 0 jurisdictions
  • Lab reports: 3 jurisdictions
  • Syndromic surveillance: 4 jurisdictions
  • Immunization: 0 jurisdictions

Where is electronic reporting required in statute or regulation?
  • Case reports: 3 jurisdictions
  • Lab reports: 7 jurisdictions
  • Syndromic surveillance: 4 jurisdictions
  • Immunization: 5 jurisdictions

What methods are specified as optional in statute or regulation?
  • Case reports: phone, 46; electronic, 34; fax, 25; mail, 19; email, 3; web portals, 3; automated electronic, 2
  • Lab reports: phone, 38; electronic, 27; fax, 20; mail, 16; automated electronic, 7; web portals, 6; email, 2
  • Syndromic surveillance: automated electronic, 2; electronic, 1; file transfer protocol, 1; batch messages, 1
  • Immunization: electronic, 14; automated electronic, 7; web portals, 7; fax, 1; mail, 1; phone, 1

Who is required to report data?
  • Case reports: health care providers, 51; hospitals, 45; schools and school officials, 37; child or day care centers, 30; labs, 21; correctional facilities, 14; pharmacists, 13; veterinarians, 10
  • Lab reports: labs (including in hospitals), 51; health care providers, 2
  • Syndromic surveillance: emergency departments, 10; urgent care centers, 3; health care providers, 2; inpatient facilities, 2
  • Immunization: pharmacists, 37; health care providers, 32; hospitals, 15; schools and school officials, 11; child or day care centers, 8; insurers or health plans, 7

* This excludes states that collect syndromic surveillance data under authorities established through broader statutes or regulation.

† States that require reporting for children’s immunizations alone vary in how they define ages of children.

In addition to analyzing state policy on paper, Pew interviewed state public health officials to understand how those policies were implemented in practice. The following themes emerged from those interviews.

State and federal public health agencies can improve public health data reporting by learning from each other. Each jurisdiction has unique populations and diverse health priorities. Resources vary, too, which affects the capacity of health departments and health care providers to buy and implement modern data-management systems. Reflecting this diversity, states take a wide variety of approaches to enable, encourage, and require health care providers to share timelier, standardized, and accurate clinical data with public health agencies—and they have a diverse set of outcomes to show for it. There is no one-size-fits-all solution, but that does not mean 50-plus different solutions are necessary; even within this variability, jurisdictions can adopt common approaches and solutions to modernize data.

States and the federal government do more by enabling and encouraging data reporters to meet requirements than by enforcing them. To compel reporting, nearly every state has enforcement mechanisms that range from fines to imprisonment. However, none of the interviewees could recall an instance in which any sanction was imposed on a noncompliant reporter. Instead, they said that informing providers about data quality, timeliness, and completeness—and sometimes escalating communications to more senior officials—often resolved the issues without stronger action. Interviewees also reported that incentives, such as Medicare’s Promoting Interoperability Program, effectively improved reporting.

Automated electronic reporting has been shown to generate timelier, more complete, and more accurate data than manual reporting.13 However, the functionality and performance of the EHR systems and the implementation of technological data-sharing standards also significantly affect data quality. States do not necessarily collect better data if they require automated electronic reporting without also investing substantial time and effort into working with data reporters to evaluate and improve data quality.

Case reporting made strides during the COVID-19 pandemic but still needs urgent improvement. Although every state requires health care providers to notify public health agencies about infectious diseases, environmental illnesses, and other threats via case reports, officials in many jurisdictions said that cases were underreported and were most likely to be reported for rare diseases or immediately reportable conditions (such as measles or meningococcal disease). As a result, officials said they tend to rely on lab reports, rather than case reports, as their primary source of information about emerging diseases. However, health departments need both case and lab data to have the most comprehensive and timeliest possible view of their communities’ health.

Further, when this research was conducted, health care providers and public health agencies widely used electronic case reporting (eCR) systems to report and collect COVID-19 data but continued to use fax, mail, and other reporting methods for other diseases. These older technologies require providers and practitioners to manually enter, transmit, and synthesize data, which slows analysis and introduces human error. A few jurisdictions were using eCR for some additional conditions, but none was using it for all reportable conditions. Officials in about half of the jurisdictions Pew interviewed said that, although their state health departments receive some eCR, they do not incorporate the data into their surveillance systems. And even though most state health agencies and providers use technological standards such as Health Level Seven (HL7) to help automate data sharing, several officials said data quality still suffers and additional national standards are still needed.

By comparison, interviewees from about two-thirds of jurisdictions estimated that at least 90% of lab reports were transmitted via electronic lab reporting (ELR). Only two jurisdictions’ officials estimated the figure was less than 70%. Officials from a few jurisdictions did not provide estimates. Almost all interviewees estimated that they received most vaccination reports from health care providers—ranging from 50% to 95%—via automated electronic means. By design, virtually all syndromic surveillance data is reported by health care facilities via automated electronic systems.14

The rapid adoption of eCR during the pandemic for reporting COVID-19 and the increasing use of ELR over the past 20 years or so indicate that case reporting can shift from manual to automated electronic systems given enough time, funding, staff, training, and political will.

To ensure that reporting requirements do not exacerbate health inequities, public health officials want to accommodate providers with fewer resources. In interviews, officials in many jurisdictions said it is cost prohibitive for small and often rural providers to use their IT systems for automated electronic public health data reporting. They also said staffing shortages made it difficult for providers to adopt them. And officials in several jurisdictions said that some providers lacked adequate internet access to report data electronically. As a result, interviewees stressed the need for flexible reporting policies that can still improve data without setting unachievable requirements.

For example, during the pandemic, Oklahoma shifted away from receiving faxed lab reports by establishing a laboratory reporting system that allows health care providers without ELR capabilities to submit COVID-19 lab results via spreadsheet-style .CSV files or manual entry.15 The system was designed to check for missing data fields across all reporting methods. The state’s health department reported that this flexibility has significantly improved data completeness, and it is considering expanding the approach to other conditions.
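
As an illustration of the kind of completeness check described above, the following sketch validates a spreadsheet-style .CSV submission against a list of required fields before it is accepted into a surveillance database. It is a minimal, hypothetical Python example; the field names and file layout are assumptions for illustration, not Oklahoma’s actual specification.

    import csv

    # Hypothetical required fields; actual requirements vary by jurisdiction and condition.
    REQUIRED_FIELDS = ["patient_name", "date_of_birth", "test_name", "result", "collection_date"]

    def validate_csv_submission(path):
        """Return (row_number, missing_fields) pairs for an uploaded .CSV lab report file."""
        problems = []
        with open(path, newline="") as f:
            reader = csv.DictReader(f)
            for row_number, row in enumerate(reader, start=2):  # row 1 is the header
                missing = [field for field in REQUIRED_FIELDS if not (row.get(field) or "").strip()]
                if missing:
                    problems.append((row_number, missing))
        return problems

    if __name__ == "__main__":
        for row_number, missing in validate_csv_submission("covid_lab_results.csv"):
            print(f"Row {row_number} is missing: {', '.join(missing)}")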

Most jurisdictions cited lack of fundamental resources to improve reporting. When asked to identify the biggest barriers to improving public health data exchange, most respondents said limited capacity, citing staffing shortages, outdated IT infrastructure, and inadequate funding. Without sustained and predictable funding, public health officials said they are reluctant to invest in the IT systems and expert staff that they need to collect, analyze, and use electronic data.

Because of the qualitative nature of the interview data presented throughout this report, specific terms are often used to refer to numerical ranges of jurisdictions. A “few” jurisdictions means three or four; “several” means five to 10; “many” means 11 to 30; “most” means 31 to 37; and “almost all” means 38 or more.

Background

Why does data matter to public health?

Just as doctors need data to diagnose and treat their patients, public health agencies rely on data to measure and improve the well-being of their communities.

Data enables public health agencies to provide essential services such as:16

  • Detecting and investigating infectious and environmental health threats.
  • Monitoring the health of the population.
  • Identifying health inequities, which the World Health Organization defines as “differences in health status or in the distribution of health resources between different population groups, arising from the social conditions in which people are born, grow, live, work, and age.”17
  • Designing and implementing evidence-based policies and practices to prevent and manage illness, improve health, and reduce inequities.
  • Allocating funds and workforce more efficiently and effectively.
  • Evaluating the impact of their interventions and adjusting as needed.

In turn, individuals and organizations rely on information from public health agencies, including health care providers who use it to guide clinical practice, scientists to conduct research, elected officials to develop policy, and the general public to guide their daily behavior.

Where do public health agencies get their data?

Public health agencies receive data from a variety of sources within and outside the health sector, such as emergency departments, pharmacies, and schools. Doctors’ offices, hospitals, health systems, and labs generate troves of data from millions of daily patient interactions.18 This report examines the state-level policies and practices that govern how clinical data flows to state public health agencies via four streams: case reports, lab reports, syndromic surveillance, and immunization information systems.

Case reports provide state and local health departments with information about individual patients with communicable diseases, environmental illnesses, or other conditions of public health importance (e.g., cancer, lead poisoning).19 Generated and shared by health care providers, these reports help agencies detect threats, conduct surveillance, and design strategies to reduce their spread and impact. Public health agencies can use demographic information in case reports (e.g., age, sex, race, ethnicity, ZIP code) to identify communities and populations at heightened risk of different diseases and most urgently in need of public health intervention.

Lab reports are based on tests that a lab performs in response to orders from health care providers. Labs send results to state and local health departments when patients’ results are positive for conditions of public health importance. In some cases, negative test results are also reported for certain conditions, such as hepatitis C. Positive lab reports help public health agencies confirm the presence of a reportable disease, initiate investigations, and conduct surveillance of reportable diseases, particularly when agencies receive them before case reports. Public health agencies can identify emerging threats more quickly by analyzing lab test orders, which reflect preliminary diagnoses and can be reported days before the results are in.

Syndromic surveillance is often anonymous, derived from de-identified data captured during visits, admissions, discharges, and transfers in health care facilities, primarily emergency departments. This data includes patients’ chief complaints, such as shortness of breath or vomiting, and diagnosis codes. The system is designed for speed; it can enable officials to perform real-time monitoring of known health threats and detect and identify emerging threats earlier than they can with confirmed case and lab reports.

Immunization information systems collect data on individual vaccinations, especially for childhood vaccines. These systems are managed by states and local jurisdictions and help providers and health agencies track immunization rates, identify communities that may lack access to vaccines, and investigate cases and outbreaks of vaccine-preventable diseases. Patients and providers also use the systems to keep track of their immunization records.

Automation: A Key Factor in Electronic Data Reporting

Electronic technologies can improve the flow of timely, standardized, and complete data, but not always. It depends on how states define “electronic,” if they do so at all. Some count fax and email as electronic technologies, even when these kinds of reporting require significant manual input. Automation is the key factor that makes electronic reporting optimally effective.

  • Often piggybacking on certified electronic health record systems, automated electronic data reporting requires limited to no manual input or reformatting from the health care entity sending the data. This can help ensure that the data is timely, standardized, and complete.
  • Some electronic technologies are partially automated. Web-based portals and digital spreadsheets (e.g., .CSV files), for example, may require providers to enter data manually but then allow public health agencies to receive it automatically in standardized formats for rapid processing and analysis.
  • Some electronic reporting processes feature no automation at all. This can include secure fax, email, or file transfers that require health care providers to manually input and transmit data to public health agencies, which must then manually reformat and reenter the data in their database. Manual input can introduce human error and delay analysis.

The researchers defined automated electronic reporting for this report as an electronic data transmission from a reporter’s system, such as an EHR or a laboratory information management system, that integrates into a public health data system with minimal manual intervention.
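
To make the contrast concrete, the brief Python sketch below shows one way the automated path can work: a pipe-delimited, HL7 v2-style lab message is parsed and loaded into a database without anyone rekeying the data. This is a simplified, hypothetical illustration; the message content and table layout are invented for the example, and production systems rely on complete HL7 tooling, validation, and security controls.

    import sqlite3

    # Illustrative, simplified HL7 v2-style message (not a complete, standards-conformant report).
    MESSAGE = (
        "MSH|^~\\&|LabSystem|ExampleLab|SurvSystem|StateHealthDept|202301151230||ORU^R01|123|P|2.5.1\r"
        "PID|1||MRN12345||DOE^JANE||19800101|F\r"
        "OBX|1|CWE|COVID19PCR^SARS-CoV-2 PCR||Positive|||A\r"
    )

    def parse_message(raw):
        """Split an HL7 v2-style message into segments and pipe-delimited fields."""
        segments = [segment.split("|") for segment in raw.strip().split("\r") if segment]
        return {segment[0]: segment for segment in segments}

    def load_report(db, raw):
        """Automated path: parse the message and insert a record with no manual rekeying."""
        seg = parse_message(raw)
        patient_id, name, dob, sex = seg["PID"][3], seg["PID"][5], seg["PID"][7], seg["PID"][8]
        test, result = seg["OBX"][3], seg["OBX"][5]
        db.execute(
            "INSERT INTO lab_reports VALUES (?, ?, ?, ?, ?, ?)",
            (patient_id, name, dob, sex, test, result),
        )

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE lab_reports (patient_id, name, dob, sex, test, result)")
    load_report(db, MESSAGE)
    print(db.execute("SELECT * FROM lab_reports").fetchall())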

Many jurisdictions specify that automated electronic methods may be used to share data with public health agencies, but most still allow manual transmission. Some jurisdictions require automated electronic reporting for lab, syndromic surveillance, and/or immunization data; none mandates it for case reports. The findings in this report detail these differences.

How can public health agencies get the most value out of data?

To effectively detect, monitor, and prevent disease, public health agencies need data that is timely, standardized, and complete.20

  • Timely. Infectious and environmental threats can spread quickly. Public health agencies need data in real time (or as close to it as possible) to effectively trace the spread of diseases, limit the population’s exposure to them, and anticipate surges in illnesses and deploy resources (e.g., ventilators, masks, tests, vaccines, medication) where they are needed most.
  • Standardized. To receive and analyze data quickly, public health agencies need it to be formatted consistently. This requires both sender and receiver to use interoperable technology (meaning different systems can exchange information seamlessly) and uniform conventions for labeling and recording data (an illustrative sketch follows this list). When public health agencies receive information from multiple sources via different methods in different formats, it can take significant time, staff, and money to prepare the data for analysis and use.
  • Complete. Public health agencies can detect and prevent diseases more effectively if they have comprehensive and granular data (e.g., sex, race and ethnicity, vaccination status, ZIP code) that enables them to identify communities disproportionately threatened by a disease or most in need of care. More narrowly and immediately, public health agencies also need accurate contact information to conduct case investigations, confirm illnesses, and identify risk factors; perform contact tracing; and connect people to treatment or prevention measures.
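
The Python sketch below, referenced in the “Standardized” item above, illustrates why uniform conventions matter: the same test can arrive labeled three different ways, and a receiving agency must map each variant to a single standard code before reports can be combined and analyzed. The local labels and the standard code shown are placeholders invented for this example, not actual LOINC or SNOMED CT values.

    # Hypothetical mapping from senders' local test labels to one standard code.
    # Real systems use standard vocabularies such as LOINC for tests and SNOMED CT for results.
    LOCAL_TO_STANDARD = {
        "COVID PCR": "SARS-CoV-2-NAAT",
        "SARS-CoV-2 RT-PCR": "SARS-CoV-2-NAAT",
        "Coronavirus nucleic acid test": "SARS-CoV-2-NAAT",
    }

    def standardize(report):
        """Replace a sender-specific test label with the agreed-upon standard code, if known."""
        standardized = dict(report)
        standardized["test_code"] = LOCAL_TO_STANDARD.get(report["test_name"], "UNMAPPED")
        return standardized

    incoming = [
        {"sender": "Hospital A", "test_name": "COVID PCR", "result": "Positive"},
        {"sender": "Lab B", "test_name": "SARS-CoV-2 RT-PCR", "result": "Detected"},
    ]

    for report in incoming:
        print(standardize(report))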

Automated electronic reporting helps ensure that data meets these criteria. For example, a North Carolina study found that, compared with manually submitted reports, ELR produced more accurate information and allowed the recipients to process the data more efficiently.21 A study in Florida found that ELR nearly halved the time from symptom onset to health department reporting for salmonellosis and shigellosis, two infections once commonly reported by postal mail.22 And researchers found that New York City’s immunization data was significantly more timely and complete when it was exchanged electronically using a common technical standard (HL7) compared with manual transmission.23

How is health data regulated?

State oversight

States set most policies related to the reporting of public health data. A mix of statutes, regulations, and agency policies dictates:

  • Who must report data (e.g., physicians, hospitals, doctors’ offices, urgent care providers, labs, veterinarians, schools).
  • For which diseases or conditions reporters must share data (e.g., communicable diseases, environmental illnesses).
  • The time frame for reporting (e.g., immediately, within 24 hours, within seven days).
  • What types of data must be reported.
  • To whom reporters must report data.
  • The systems, methods, standards, and formats that must be used to report.
  • Patient privacy and confidentiality protections.
  • Penalties for failure to comply with reporting requirements.

That said, states do not always specify each of these elements. And further, some local governments layer additional reporting requirements over state policies.

Federal oversight

The federal government has significant influence but limited regulatory authority over the management of health data. While health care providers are required to report certain diseases to state public health agencies, notification from state to federal public health agencies is voluntary. Three agencies within the U.S. Department of Health and Human Services play a direct role in national efforts to modernize public health data reporting.

The Office of the Assistant Secretary for Technology Policy (ASTP), formerly known as the ONC, influences the development of health IT tools and standards.

  • As a condition of certification, it requires EHR products to allow their users to electronically share standardized health data with public health agencies.
  • It leads the development of the U.S. Core Data for Interoperability, which standardizes how data elements (e.g., name, address, diagnosis, treatment) are formatted, cataloged, and exchanged.
  • The office also oversees the Trusted Exchange Framework and Common Agreement, which provides a common set of legal agreements, infrastructure models, and governance structures that health care providers, health information exchanges, public health agencies, and other entities can use to share data.24

The Centers for Medicare & Medicaid Services (CMS) sets Medicare and Medicaid reimbursement policies that require participating health care facilities and providers (e.g., hospitals, physicians) to send clinical data to public health agencies.

  • Through its Promoting Interoperability Program and Merit-Based Incentive Payment System, Medicare provides participating hospitals and physicians with financial incentives to report case, lab, syndromic, and immunization data to state and local public health agencies.
  • During the pandemic, CMS required hospitals participating in Medicare and Medicaid to report COVID-19 case and hospitalization data (e.g., admissions, occupied and available beds) to the CDC, either directly or through their state’s health department.25

The CDC has broad responsibilities to protect the public’s health, including by collecting disease surveillance data from a wide variety of sources and analyzing it to inform public health decision-making. For example:

  • The agency maintains national disease reporting systems populated by data from local and state health departments and provides those agencies with funding and technical assistance to improve their collection, analysis, and use of the information.
  • It provides funding and grants to state and local health departments for data modernization as well as disease surveillance, prevention, and control activities for a wide range of conditions; it may require data reporting as a condition of funding.26
  • The CDC has developed a Public Health Data Strategy and is facilitating data sharing between health care providers and public health agencies via the Data Modernization Initiative, a multiyear effort started in 2019 “to get better, faster, actionable insights for decision-making at all levels of public health.”27 It also develops and promotes data-sharing standards, such as HL7, which enables health care providers, health information exchanges, public health agencies, payers, and other organizations to securely share clinical data with each other even if they are using different electronic systems to manage and store the information.

Federal laws also set a floor for health privacy, chiefly via the Health Insurance Portability and Accountability Act of 1996 (HIPAA), which prohibits the disclosure of patients’ health information without their knowledge or consent. Balancing individual and societal needs, Congress designed the law with exemptions that allow health care providers, hospitals, and labs to share some patient information with public health agencies to prevent the spread of communicable diseases. Similarly, the Family Educational Rights and Privacy Act of 1974 protects the privacy of students’ education records, though it allows schools to share some data with health departments in cases of health and safety emergencies.28

Findings by data source

The next three sections present the findings of the legal analysis and interviews by data source: case and lab reporting, syndromic surveillance, and immunization reporting. Each section opens with brief background information about the nature and value of the data source and presents key takeaways and action items. The complete findings follow, including jurisdictions’ general data-reporting policies and practices, their required reporters, required and acceptable reporting methods, how data flows from health care to public health, and how jurisdictions assess and improve data quality.

Case and lab reporting

When physicians first systematically reported diseases in the United States in 1874, they mailed postcards.29 Since then, doctors, hospitals, labs, and other health care providers have used telegrams, phone, and fax to report data.

Today, for labs, automated digital systems are mature and used widely, providing lessons and inspiration for the future of case reporting, which still relies heavily on fax and phone.

Compliant with HIPAA privacy protections, case reports from health care providers can include a patient’s name, contact information, race and ethnicity, symptoms, diagnoses, treatments, and outcomes.30 Lab reports come from labs and are usually generated when people test positive for a reportable illness; they typically include less information than a case report. As happened during the COVID-19 pandemic, labs are sometimes required to report negative results.31

Case and lab reports are inherently delayed as it takes time for people to be exposed to an infectious or environmental threat, develop symptoms, see a provider, get diagnosed and tested, and then have their case reported. Manual reporting causes further delays as the paper-based reports often lack key information and require people to transfer the data into public health databases by hand, which also introduces human error.32

One promising solution is eCR, which automates the transmission of reportable data from EHRs to public health agencies. (See “Automation: A Key Factor in Electronic Data Reporting” on p. 10 for examples of what is automated and what is simply electronic.) Although most health care providers and organizations already use EHR systems to digitize their patients’ medical charts, eCR adoption is not as widespread, and barriers are slowing its adoption.33

Like eCR, ELR automates the reporting process, allowing labs to consistently and automatically transmit detailed, standardized data to public health agencies. Compared with eCR, however, ELR is more widely used. Although case reporting made strides during the COVID-19 pandemic, eCR efforts have only recently gained momentum, whereas the CDC has been supporting ELR in state and local health departments since 2010.34 All state health agencies have reported to the CDC that they receive at least some laboratory data via ELR.35

Key takeaways

  • Every jurisdiction requires case and lab reporting, but none requires eCR, three require generic (that is, not necessarily automated) electronic case reporting, only three jurisdictions require ELR, and seven require generic electronic reporting for lab results. Automated electronic reporting is an optional method specified in statutes or regulations in two jurisdictions for case and in seven jurisdictions for lab. Almost every jurisdiction specifies in statute or regulation when data must be reported (e.g., within 24 hours of seeing a patient) and what data elements must be included (e.g., date of birth, sex, age, race/ethnicity).
  • While the pandemic effectively forced health care providers and public health agencies to adopt eCR for reporting and collecting COVID-19 data, these automated electronic systems are still not widely used for other reportable diseases.36 (See “Lessons Learned From the COVID-19 Pandemic” on p. 42 for more information about how states modernized eCR during the pandemic and are applying the lessons for other diseases.) Worse still, public health officials in many jurisdictions said that cases are underreported overall, with providers more likely to report cases of rare diseases or immediately reportable conditions. As a result, public health agencies rely more on lab reports than on case reports to create their internal case records. However, public health agencies that rely on labs for data may be missing patients who have a reportable illness but did not take a lab test. In addition, lab reports often lack key details (e.g., race and ethnicity, onset date, outcomes) when providers do not share that information with labs or when labs cannot get it directly from patients.
  • Public health officials expressed concern that small and under-resourced providers often cannot afford to transition from manual to eCR-capable systems.
  • There are challenges with eCR on the receiving end as well. Officials in about half of the jurisdictions Pew interviewed said that, although health departments receive some eCR, they were not incorporating the data into their disease surveillance systems when the research for this report was conducted. A number of these jurisdictions described plans or current activities to use the data and eventually incorporate it into their disease surveillance systems.
  • Few respondents could specify or even estimate the proportion of case reports they receive electronically. Without this baseline data, health departments cannot know definitively whether their efforts to promote eCR—through requirements, investments, and/or incentives—are having the intended effect.
  • No jurisdiction requires eCR; three require ELR, and seven more broadly require electronic lab reporting. Officials in about two-thirds of jurisdictions said they received 90% or more of their lab reports via ELR. Many respondents reported that timeliness of ELR was generally acceptable. Several jurisdictions reported receiving ELR in near-real time or daily from most labs. Only two reported receiving less than 70% of their lab reports via ELR.
  • Officials in many jurisdictions described technological challenges with eCR, including duplicate reporting, large file sizes, and lack of storage.

Action items

  • Jurisdictions should inventory their reporting capacities, statutes, regulations, and agency policies to identify opportunities to improve the quality of case data.
  • Federal and state policymakers should consider flexible electronic reporting policies for under-resourced or low-volume reporters.
  • Jurisdictions should build on the progress made during the pandemic to use eCR for COVID and expand it for reporting a broader range of diseases. The CDC reports that eCRs are being used to report a wider range of conditions beyond COVID-19 and is helping facilities implement eCR for more than 170 reportable conditions.37

Complete findings

How do states govern case and lab reporting?
  • Case and lab reporting is conducted in all 50 states and the District of Columbia, and every jurisdiction has statutes or regulations that require reporting of infectious and environmental diseases of public health concern. It is not optional in any jurisdiction.
  • Every jurisdiction maintains a list of reportable conditions that must be submitted to the state health department via case reports or lab reports, or both. Forty-five set their lists in formal regulations, while the rest maintain their lists online or in other published documents.
  • Almost every jurisdiction—49 for case and 45 for lab—specifies timeliness requirements for reports. In many jurisdictions, those requirements relate to specific conditions (e.g., immediately for anthrax, within 24 hours for measles, within a week for flu, quarterly for methicillin-resistant Staphylococcus aureus).
  • Almost every jurisdiction—46 for case and 45 for lab—requires certain data elements such as name, address, date of birth, sex, or race and ethnicity in statutes or regulations. (See “Race and Ethnicity Data Across Use Cases” on page 41 for more information.)
  • Forty-one jurisdictions have both statutes and regulations pertaining to case reporting. The remaining 10 have regulations only. Fifty have regulations related to lab reporting, and 36 also have relevant statutes.
    • In interviews, officials in 23 jurisdictions said that changing case and lab reporting requirements could be accomplished through the regulatory administrative process. Officials in five indicated that changing regulations in their state requires legislative approval or involvement.
  • Forty jurisdictions have enforcement mechanisms to penalize reporters for failing to comply with case and lab reporting requirements and related provisions (e.g., privacy and confidentiality). Two have enforcement mechanisms for not complying with case reporting requirements only. Penalties include fines, reports to medical licensing boards, or short-term imprisonment. However, no one we interviewed knew of any instances in which required reporters were penalized for noncompliance.
    • Respondents from several jurisdictions indicated that compliance with reporting requirements generally relies on strong relationships and communication with reporters. They said that informing health care facilities about data quality or reporting issues often resolved the issues without any stronger action.
    • At least two jurisdictions reportedly work with the Clinical Laboratory Improvement Amendments program to address reporting issues via lab certification inspections and audits.
    • A few respondents described escalating reporting issues to agency leaders or sending letters to reporters to address reporting problems.
    • A few respondents indicated that incentives were preferable to penalties.

Who is required to report case and lab data?

Providers
  • Every state and the District of Columbia require providers (physicians, nurses, and other clinical care professionals) to submit case reports.
  • Only two jurisdictions require providers to submit lab reports. Providers are not required to report lab results in the other 49.
Laboratories
  • Twenty-one jurisdictions require labs to submit case reports; 30 do not.
  • All 51 jurisdictions require labs, including those within hospitals, to submit reportable test results.
Hospitals
  • Almost all jurisdictions require hospitals to submit case reports, with 45 listing hospitals as required reporters in statutes and regulations. Six do not list hospitals as required case reporters.
Other reporters
Forty-seven jurisdictions require other individuals and organizations to report case data, including:
  • Schools and school officials in 37.
  • Child or day care centers in 30.
  • Dentists in 21.
  • Correctional facilities in 14.
  • Pharmacists in 13.
  • Veterinarians in 10.

Other required reporters include paramedics, emergency medical services, or emergency medical technicians; medical examiners or coroners; local health departments, officers, or boards of health; food or beverage establishments or handlers; and even broad categories using language such as “any persons with knowledge of disease.” States vary in where and how they specify these reporter requirements. For example, states may include paramedics, emergency medical personnel, and medical examiners within their definition of health care providers or list those entities separately from health care providers. States may require groups such as food or beverage establishments to report foodborne outbreaks specifically, while other states may require those same groups to report a broader set of illnesses.

How are case and lab data reported?

In policy

Automated electronic reporting
  • Of the four data sources covered in this research, eCR is in the earliest stages of widespread adoption both in policy and in practice. However, automated electronic reporting for lab reports (i.e., ELR) is required in state policy and used in practice most often.
  • For case reporting, no jurisdiction requires eCR and only two reference automated electronic reporting as a method in statute or regulation; the other 49 jurisdictions do not specifically include or exclude it. Thirty-four also allow electronic methods of reporting more broadly.
  • In interviews, officials in 15 jurisdictions offered that mandating eCR would not improve reporting within their state, compared with officials in 10 who said it would. Instead of requirements, officials described how enabling and encouraging more automated electronic reporting—for example, through improvements to standards and EHR systems, and by offering financial incentives through Medicare—would more effectively improve reporting.
  • For lab reporting, three states require ELR and seven specify that it is an optional method. Seven jurisdictions require that lab results be reported electronically without specifying automation, while 27 refer to electronic reporting as an option. Twenty-four jurisdictions allow lab reporting via fax (20), web portals (6), and/or email (2). This reflects how ELR has progressed compared with eCR and provides a model for state policymakers and health officials as they look to expand eCR.
    • In interviews, officials in 20 jurisdictions said that mandating ELR would improve reporting within their state, compared with officials in 10 who said it would not.
    • In states that require ELR, officials said it increased the flow of electronic data. One interviewee from a state with large commercial labs described how requirements there prompted those facilities to send ELRs to “a lot of other states as well.”
    • However, other officials were also cautious about broad mandates for all reporters and preferred a tailored mandate that accommodates smaller and rural reporters. Indeed, a few jurisdictions require ELRs only from labs that exceed a certain threshold, such as 30 reports a month in one state.
    • As one official said, “In order to have a provider or an organization to report electronically, they first need to have the capacity. And if we make that regulation or make that rule [requiring ELR], what will happen to the small entity that doesn’t have the capacity to do so?”
Web portal
  • Web portals are an option for reporting case data in three jurisdictions. They are not specified in the other 48.
  • Six jurisdictions, including the three that allow web portals for case reporting, specify that labs can report via web portal.

Email
  • Reporters can use email to report case data in three jurisdictions and lab results in two. The remaining jurisdictions do not specify.
Fax
  • Twenty-five jurisdictions allow case data to be reported via fax; 20 accept lab results via fax. The rest do not specify.
Mail
  • Mail is an option for reporting case data in 19 jurisdictions and for reporting lab results in 16. The rest do not specify.
Telephone
  • Forty-six jurisdictions list telephone as a method for case reporting; 38 list it as a method for lab reporting. The rest do not specify.
  • Jurisdictions also require providers and labs to report suspicions of certain diseases (e.g., anthrax, measles, tuberculosis) to their state health department by phone within 24 hours. Confirmed cases of other diseases may be required to be reported within 24 hours or other time frames, but not necessarily by phone.

Under-Resourced Providers Benefit From Flexible Data-Reporting Requirements

Certified EHR systems that enable eCR and ELR are complicated and expensive to purchase, install, and maintain. Some health care providers can invest in this technology and the workforce to manage it more easily than others.

As one official said, “Smaller providers do not have the capacity for HL7 reporting or for establishing electronic case reporting … so we try to meet them in the middle. … We could accept, for example, comma delimited files [such as .CSV spreadsheets] or we set up a system where they could just enter the data directly to the system and it comes directly to us. We try to find ways to make it easier for them.”

Another official voiced a concern shared by others that rigid requirements could shut out some providers: “The concern with … mandating [ELR] from the smaller reference labs would be that we would essentially not get that data.”

At the same time, health departments may lack the infrastructure to securely receive, process, and store electronic data from every reporter in their state. As a result, taking a stepwise approach to increase capacity gradually, as budgets and workforce allow, may be the most pragmatic course.

During the COVID-19 pandemic, the Texas Department of State Health Services started receiving a significant number of lab reports via fax, which normally requires staff to manually process and input the data from paper into databases.38 This was common across the country during the pandemic, as new, temporary facilities, such as pop-up sites, were reporting results but lacked the IT infrastructure of conventional labs to accommodate ELR.39 To meet reporters where they were, the agency developed an “eFaxing” system that automatically converted faxes into data that could go into the disease surveillance system. This ensured that every lab could report test results while also reducing the burden on public health agencies to manually review and input data from paper-based faxes. These intermediate improvements in reporting are valuable in special situations, but because eFaxing still requires labs to take additional administrative steps, facilities that report regularly at higher volumes would still benefit more from the faster, more automated data transmission that ELR offers.
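
The paragraph above describes the general idea rather than Texas’ actual implementation. As a rough, hypothetical sketch of how an eFax intake step could work, the Python below converts a faxed report image to text with optical character recognition and pulls out a few labeled fields with simple patterns. It assumes the Pillow and pytesseract libraries and a predictable report layout, which real faxes rarely offer.

    import re

    from PIL import Image   # Pillow, for loading the faxed page image
    import pytesseract      # wrapper around the Tesseract OCR engine

    # Hypothetical field labels; real faxed forms vary widely and need far more robust extraction.
    FIELD_PATTERNS = {
        "patient_name": re.compile(r"Patient Name:\s*(.+)"),
        "date_of_birth": re.compile(r"DOB:\s*([\d/]+)"),
        "test_result": re.compile(r"Result:\s*(\w+)"),
    }

    def extract_fields_from_fax(image_path):
        """OCR a faxed lab report and return whichever labeled fields can be recognized."""
        text = pytesseract.image_to_string(Image.open(image_path))
        extracted = {}
        for field, pattern in FIELD_PATTERNS.items():
            match = pattern.search(text)
            extracted[field] = match.group(1).strip() if match else None
        return extracted

    if __name__ == "__main__":
        print(extract_fields_from_fax("faxed_lab_report.png"))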

In practice

Although statutes and regulations require providers to report case data for all reportable diseases, many interviewees said that providers substantially underreport cases and are likely to report only cases of rare diseases or immediately reportable conditions. When public health agencies do receive case reports, providers commonly wait until after they get positive results from the lab to share case data, unless they are otherwise required to report when they reasonably suspect a patient has a reportable disease. As a result, case reports often arrive after lab results and with no new information. So, officials said that lab reports were their primary source of initial information about cases of disease.

Because eCR is new in most states and providers report case data inconsistently by any means, few interview respondents were able to estimate the proportion of case reports they receive electronically. However, respondents said fax and telephone were the most common methods for case reporting, followed by web entry, and, rarely, email or paper mail.

Officials in almost all of the jurisdictions Pew interviewed said that they receive some data via eCR, but many do not fully use it. As a result, they are likely missing opportunities to use that data to improve disease surveillance, prevention, and control.

  • At the time of these interviews, about half of the officials said they were not incorporating eCR data into their surveillance systems. Only one-third said they bring eCR data into their disease surveillance systems and process that data for epidemiology programs to use for analysis. Expanding the use of eCR data into surveillance systems across all conditions requires significant investments of time and resources.
  • Officials in several jurisdictions said they stored eCR data outside of their surveillance systems, such as in other databases, where staff epidemiologists can access the data if needed to fill in additional information (e.g., patient demographics) and advance case investigations.
  • Of the jurisdictions whose officials said they received eCRs for specific conditions, all received COVID-19 reports. Many also received eCRs for mpox, a viral infection that the World Health Organization declared a public health emergency of international concern in July 2022.40 A few jurisdictions received eCRs for additional conditions, but no jurisdiction was receiving eCRs for all reportable conditions when this research was conducted.
  • Several jurisdictions described the status of their eCR implementation as a test and validation or “parallel production” phase. They continue to use legacy reporting systems while they receive data via eCR in a test environment, evaluate data quality, and work out how that data might be integrated with existing case information.
  • Several jurisdictions said their surveillance systems were simply not technologically capable of receiving and processing eCR data. Some are waiting to implement new systems rather than incorporate eCR capabilities into their existing platforms.

By contrast, officials in every jurisdiction interviewed reported receiving and using ELRs frequently, whether jurisdictions required it or not.

  • Officials representing about two-thirds of the jurisdictions Pew interviewed—with and without ELR requirements—estimated receiving 90% or more of their laboratory reports via ELR.
  • Ten jurisdictions estimated receiving at least 70% but less than 90% via ELR.
  • Two estimated receiving less than 70% via ELR.

National Platforms Facilitate Sharing of Case Reports Across State Lines

Because case reporting requirements differ from state to state, health care providers who treat patients from multiple states—for instance, a doctor in a tristate area—may be required to comply with multiple policies using different mechanisms to report different diseases under different timelines to different health departments. To help providers overcome these challenges, the CDC has partnered with the Association of Public Health Laboratories (APHL) and the Council of State and Territorial Epidemiologists to establish two platforms, the APHL Informatics Messaging Services (AIMS) platform and the Reportable Conditions Knowledge Management System, which together route data to the appropriate jurisdiction.41

This relieves providers of the burden of navigating multiple jurisdictions’ policies and ensures that those jurisdictions receive the data they need to detect, prevent, and control disease. Many officials reported to Pew that they rely on AIMS’ centralized eCR reporting infrastructure.

How does data flow to public health agencies?

  • Many jurisdictions require case and lab reports to go straight to their state health department. Many other jurisdictions require case and lab reports to go to local health departments first, which then submit them to the state health department. Two jurisdictions allow case and lab reports to go to local and state health departments concurrently. In some jurisdictions, providers can report to state or local agencies.
  • Many jurisdictions that described their reporting structure said that, regardless of whether eCRs and ELRs must go to state or local health departments first, the data can be made available to both at the same time. With simultaneous access for state and local public health entities, health care providers can fulfill their legal requirement to report diseases to either jurisdiction by sending data to the centralized system. Several interview respondents also noted that, although eCRs were submitted to state agencies, manual reports were more likely to go to local health departments first, especially where local health departments play a key role in case investigation and might have strong relationships with local providers. One benefit of automated electronic systems is that they can transmit data to state and local departments simultaneously without increasing the burden on health care providers; this ensures that agencies at both levels have timelier, more complete, and more accurate data to better detect, prevent, and control diseases in their communities.
  • In many jurisdictions, reporters can connect to a health information exchange (HIE) to facilitate case and lab reporting. Compared with eCR, this practice is more common for lab reports, which is consistent with their higher proportion of ELR reporting.

What data quality issues challenge health departments?

Interview respondents described a variety of issues that undermine the quality of electronic case and lab data they receive and their ability to analyze and use it.

  • eCR
    • Duplicate and erroneous reports are often triggered by outdated information in patients’ histories (an illustrative deduplication approach is sketched after this list).
      • One official estimated that up to 40% of their state’s eCR data were duplicate records.
      • Erroneous reports might be triggered by updates or changes to the health record that prompt EHR systems to automatically generate new reports for the same illness. As one official described it, “The disease of interest was diagnosed four years ago and yet it’s still [marked as] an active problem in the person’s health record. … And you’re like, whoa I need to look at this. And then you spend time looking through this like massive PDF that they sent you and you realize, oh, they had TB in 1998. Great. … And then they come back in for a recheck and you see their chart again because, guess what, their TB from 1998 is still showing up.”
    • Officials also noted that EHRs may send patient information unrelated to the reportable condition, raising privacy concerns.
    • Because eCRs are large files, duplicate and erroneous reports increase data storage costs, which can be prohibitive for under-resourced health departments.
    • Respondents noted that a lack of national standards often drives data quality issues. This is consistent with recommendations from numerous entities working on data modernization at the national level, such as the Health Information Technology Advisory Committee and the CDC Advisory Committee to the Director Data and Surveillance Workgroup.42
  • ELR
    • Many respondents reported issues with completeness and accuracy of key data elements, including patient contact information, addresses, race, and ethnicity, and identified these elements as active areas for improvement. Patients’ phone numbers and addresses are critical for contact tracing or further investigation and for accurately determining jurisdiction, which is usually based on where a patient lives.
      • Several respondents also reported issues with appropriate use of standardized codes for reporting lab tests and results. These codes are key for identifying the disease or condition the report is about, determining the geographic location of outbreaks or clusters, accurately and automatically processing reports into state surveillance systems, and ensuring the data can be analyzed and used by epidemiologists.
      • Many respondents reported that timeliness of ELR was generally acceptable. Several jurisdictions reported receiving ELR in near-real time or daily from most labs.
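There is no single prescribed fix for the eCR duplication problem noted in the list above. One illustrative approach, sketched below under assumed field names (patient_id, condition, report_time are placeholders, not actual eICR element names), is to keep only the earliest report per patient and condition within a rolling window and treat later repeats as duplicates triggered by record updates.

```python
from datetime import timedelta

def deduplicate_ecr(reports, window_days=90):
    """Keep the earliest report per (patient, condition); drop repeats within a window."""
    kept = []
    last_seen = {}  # (patient_id, condition) -> report_time of the last kept report
    for report in sorted(reports, key=lambda r: r["report_time"]):
        key = (report["patient_id"], report["condition"])
        previous = last_seen.get(key)
        if previous is None or report["report_time"] - previous > timedelta(days=window_days):
            kept.append(report)
            last_seen[key] = report["report_time"]
        # Otherwise treat it as a duplicate triggered by a record update and discard it.
    return kept
```

The window length is a policy choice: too short and genuine new episodes are discarded, too long and reinfections are missed.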

During the COVID-19 pandemic, Michigan’s Department of Health and Human Services observed that lab reports were often missing race and ethnicity data and other key details necessary to track the disease in different communities and populations.43 By publishing report cards that compared the completeness of lab reports, the agency stoked “friendly competition” that incentivized labs to submit more complete reports and freed up staff to focus on core public health functions rather than assessing and improving data quality.
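A report card of this kind can be approximated by scoring each lab’s submissions on how often key fields are populated. The sketch below is a simplified illustration under assumed field names, not the agency’s actual methodology; it also treats values such as “unknown” as unpopulated, reflecting the concern raised elsewhere in this report about technically complete but uninformative entries.

```python
KEY_FIELDS = ["race", "ethnicity", "phone", "address"]  # fields to score

def completeness_report_card(reports):
    """Return, per lab, the percentage of reports with each key field populated."""
    totals, populated = {}, {}
    for report in reports:
        lab = report.get("lab", "UNKNOWN_LAB")
        totals[lab] = totals.get(lab, 0) + 1
        counts = populated.setdefault(lab, {field: 0 for field in KEY_FIELDS})
        for field in KEY_FIELDS:
            value = (report.get(field) or "").strip().lower()
            if value and value not in {"unknown", "other", "u"}:
                counts[field] += 1
    return {
        lab: {field: round(100 * counts[field] / totals[lab], 1) for field in KEY_FIELDS}
        for lab, counts in populated.items()
    }
```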

What are states doing to ensure data quality for eCR and ELR?

Existing messaging standards
  • Many interviewees reported using the HL7 standard for eCR, known as eICR (electronic initial case report).
  • For ELR, most respondents reported using HL7 standards, with many specifically citing the HL7 Version 2.5.1 Implementation Guide for ELR. Several said they also accepted the older 2.3.1 standard, converting to the current version in their own systems.
  • For ELR, respondents also reported different approaches to promoting standards. Several jurisdictions described their approach as strict, adhering closely to the requirements or applying additional requirements beyond the national HL7 implementation guide. The same number of jurisdictions described their approach as more permissive, enforcing fewer requirements than the national standard. Interviewees described a tension between strict adherence to the standard to prevent data quality issues downstream and being permissive to avoid missing reports and discouraging reporting.
    • One example of different approaches relates to the use of standard terminology for lab tests and results. The HL7 implementation guide calls for the use of two common sets of codes to standardize the way specific lab tests and results are reported. Labs, however, often use their own local codes in their systems, which they must manually translate to those common codes before reporting them to their health departments. A few officials said that their states strictly require labs to adhere to those common codes, whereas officials in two jurisdictions said health agencies accepted local lab codes and performed the manual translation themselves.
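At its core, the translation step described above is a lookup from a lab’s local codes to the standard vocabularies the implementation guide calls for. A minimal sketch follows, assuming the jurisdiction maintains its own crosswalk table per lab; the codes shown are placeholders, not a real crosswalk.

```python
# Hypothetical crosswalk from a lab's local test codes to standard codes.
# Real crosswalks are maintained per lab and reviewed by terminology staff.
LOCAL_TO_STANDARD = {
    ("LAB_A", "FLUA-PCR"): "STD-0001",   # placeholder standard code
    ("LAB_A", "COV19-AG"): "STD-0002",   # placeholder standard code
}

def translate_test_code(lab_id, local_code):
    """Map a lab's local test code to the standard code, or flag it for review."""
    standard = LOCAL_TO_STANDARD.get((lab_id, local_code))
    if standard is None:
        # Unmapped codes are queued for manual review rather than guessed at.
        return {"status": "needs_review", "lab": lab_id, "local_code": local_code}
    return {"status": "mapped", "standard_code": standard}
```

Whether this mapping happens at the lab or at the health department is exactly the strict-versus-permissive choice the interviewees described.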
Onboarding new reporters

For eCR and ELR, the process of onboarding new reporters is a critical point for testing initial connections between providers’ and state health departments’ systems, assessing data quality, and preventing issues once reporting goes live.

  • Many jurisdictions reported that the CDC onboarding team handled setup for connections to AIMS for eCR as well as initial validation. A few jurisdictions described performing additional validation for reports after onboarding.
  • For ELR, many jurisdictions described extensive validation checks and testing during onboarding to ensure that messages adhere to the HL7 standard and key data elements are populated correctly (a simplified validation sketch follows this list).
  • At least two respondents reported having disease domain experts check the data to ensure accuracy and usability.
  • A few jurisdictions described a period of parallel production after going live. If they find that a reporter’s manual data matches its ELR data, the lab can stop sending manual reports and rely solely on ELR.
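The onboarding checks mentioned above can be partially automated. The sketch below is a simplified illustration rather than any jurisdiction’s actual validator: it checks that a pipe-delimited HL7 v2 message contains the segments typical of an ELR message and that a couple of commonly required fields are non-empty.

```python
REQUIRED_SEGMENTS = {"MSH", "PID", "OBR", "OBX"}  # segments typical of an ELR message

def validate_hl7_message(message):
    """Return a list of problems found in a pipe-delimited HL7 v2 message string."""
    problems = []
    segments = {}
    for line in filter(None, message.replace("\r", "\n").split("\n")):
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields)
    missing = REQUIRED_SEGMENTS - segments.keys()
    if missing:
        problems.append(f"Missing segments: {sorted(missing)}")
    # Spot-check a few key fields; real validators check many more, per the implementation guide.
    for pid in segments.get("PID", []):
        if len(pid) <= 5 or not pid[5].strip():
            problems.append("PID segment is missing the patient name field")
    for obx in segments.get("OBX", []):
        if len(obx) <= 5 or not obx[5].strip():
            problems.append("OBX segment is missing an observation value")
    return problems
```
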
Monitoring data quality

Interview respondents described different approaches to monitoring ongoing data quality.

  • Data quality practices for ELR were well established. Many respondents reported using routine validation checks, often automated, to ensure that incoming messages conform to HL7 standards and that essential data elements are complete before incorporating data into their surveillance systems.
  • Several jurisdictions described manual elements to ELR data quality processes, particularly for handling errors identified by validation checks. Staff review messages that fail these checks and can manually correct them or follow up with labs to correct them. Reports submitted in a .CSV format often require more manual review and cleaning than HL7 messages.
  • Many respondents reported monitoring ELR data feeds for reporting volumes and frequencies, looking for any drops or changes in established patterns according to reporting facilities and reportable conditions. Several of these jurisdictions also reported using dashboards or reports to assist with this monitoring and notify staff about changes. (A minimal volume-monitoring sketch follows this list.)
  • Several respondents said that their jurisdictions are still working out data quality processes for eCR. Officials also described data quality processes for eCR that were similar to those used for ELR, including validation checks for conformance to standards and completeness of critical data elements and monitoring of reporting volumes.
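Volume monitoring of the kind described in the list above can be as simple as comparing each facility’s latest daily message count with its own recent baseline. The sketch below is illustrative only; the baseline length and alert threshold are arbitrary choices, not values reported by any jurisdiction.

```python
from statistics import mean

def flag_volume_drops(daily_counts, baseline_days=14, drop_threshold=0.5):
    """Flag facilities whose latest daily count falls below a fraction of their baseline.

    daily_counts maps facility -> list of daily message counts, oldest first.
    """
    alerts = []
    for facility, counts in daily_counts.items():
        if len(counts) <= baseline_days:
            continue  # not enough history to establish a baseline
        baseline = mean(counts[-(baseline_days + 1):-1])
        latest = counts[-1]
        if baseline > 0 and latest < drop_threshold * baseline:
            alerts.append((facility, latest, round(baseline, 1)))
    return alerts
```
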
Improving data quality

Most respondents described working with providers and facilities to improve data quality, stressing the importance of communication and strong relationships with partners to facilitate better reporting.

  • Officials in several jurisdictions said they provided feedback to reporters on their own data quality via report cards or dashboards, comparing their performance with other facilities or state averages.
  • A few respondents said they tried to fill missing data elements (e.g., contact information, race) by matching to other available data sources, such as an HIE with patient demographic data, state drivers’ license databases, and online address verification services (see the matching sketch after this list).
  • States are working to improve automated integration of ELR into disease surveillance systems through technical workflows and validation rules. Although states manage most infectious disease data in integrated disease surveillance systems, some also have separate, siloed systems for specific conditions, such as sexually transmitted infections, that may require manual steps to incorporate ELR data.
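Filling gaps by matching to other data sources, as described above, is essentially record linkage. The sketch below joins on an assumed name-plus-date-of-birth key purely for illustration; production systems typically use probabilistic matching and far more careful identity resolution.

```python
def fill_missing_fields(reports, reference_records, fields=("race", "ethnicity", "phone")):
    """Backfill missing fields in lab reports from a reference source (e.g., an HIE extract)."""
    # Index reference records by a simple assumed key: lowercased name plus date of birth.
    index = {
        (ref["name"].strip().lower(), ref["dob"]): ref
        for ref in reference_records
    }
    for report in reports:
        key = (report.get("name", "").strip().lower(), report.get("dob"))
        match = index.get(key)
        if not match:
            continue
        for field in fields:
            if not report.get(field):
                # Only fill fields that are empty; never overwrite reported values.
                report[field] = match.get(field)
    return reports
```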

Syndromic surveillance

Syndromic surveillance is designed to be an early-warning system for urgent threats. It received renewed attention in the wake of the 9/11 and anthrax attacks of 2001.44 Because its growth coincided with the development of electronic data systems, syndromic surveillance benefits from automated, real-time data reporting with little to no work required from health care providers once they connect to the system.

Local, state, or national syndromic surveillance systems collect patient data when providers—mostly doctors and nurses in emergency departments* —see, admit, transfer, or discharge patients.45 The information is anonymized and includes some combination of patients’ chief complaints, symptoms (e.g., sore throat), signs (e.g., temperature), diagnoses, demographic details, and location.46

The system is designed for speed. It can often take days or weeks to test and diagnose a patient, and longer still for the related case and lab data to reach a public health agency. By comparison, because syndromic data is based on symptoms and signs that define a syndrome (e.g., influenza-like illness), data can be generated and reported more quickly than other types of public health reporting.47

State and local health departments expanded their syndromic surveillance capacity after the CDC began redesigning its BioSense program in 2010.48 BioSense has since evolved into the National Syndromic Surveillance Program (NSSP). Many jurisdictions use the analytic tools and services within NSSP’s cloud-based BioSense Platform to track seasonal influenza-like illnesses and were able to build on this experience to provide critical information about COVID-19 symptoms during the pandemic.49 States have used the data for a wide range of purposes, including to improve support for people experiencing homelessness in the state of Washington, to understand and mitigate the impact of wildfire smoke on the public’s health in Oregon, and to manage a dengue outbreak in Florida with smarter mosquito-control strategies.50

By monitoring for spikes in syndromes or other trends, officials have been able to identify and respond to threats quickly and develop targeted interventions and messaging to address the health conditions facing their communities (e.g., wildfire smoke, exposure to chemical hazards in occupational settings). One official described syndromic surveillance as “a very robust tool for a lot of nontraditional public health issues—drug overdoses, suicide, heat stroke—things where there’s no case-based reporting. There’s no lab test to do and so it’s getting utilized more and more beyond traditional situational awareness.”

* Syndromic surveillance can also include data from urgent care centers, poison control centers, emergency medical service agencies, and other providers.
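The spike monitoring described above is often done with simple statistical baselines. The sketch below flags days on which a syndrome’s count exceeds the trailing mean by a chosen number of standard deviations; it mirrors the general idea behind common aberration-detection methods but is not the NSSP’s actual algorithm, and the baseline length and threshold are arbitrary.

```python
from statistics import mean, stdev

def detect_spikes(daily_counts, baseline_days=7, threshold_sd=3.0):
    """Return indexes of days whose count exceeds the trailing baseline by threshold_sd."""
    spikes = []
    for i in range(baseline_days, len(daily_counts)):
        baseline = daily_counts[i - baseline_days:i]
        mu = mean(baseline)
        sd = stdev(baseline) if len(set(baseline)) > 1 else 0.0
        if daily_counts[i] > mu + threshold_sd * max(sd, 1.0):
            spikes.append(i)
    return spikes

# Example: a week of stable influenza-like-illness visits followed by a jump.
print(detect_spikes([12, 10, 11, 13, 12, 11, 12, 30]))  # -> [7]
```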

Key takeaways

  • Compared with case, lab, and immunization reporting, syndromic surveillance comes with the fewest reporting requirements; only 13 jurisdictions mandate syndromic surveillance reporting (mostly by emergency departments). This excludes jurisdictions that collect syndromic surveillance data under authorities established through broader statutes or regulations. Participation is higher: Of the jurisdictions whose public health officials Pew interviewed, about two-thirds reported receiving syndromic data from 75% or more of the emergency departments in their state.
  • Officials in many jurisdictions said that the Centers for Medicare & Medicaid Services’ Promoting Interoperability Program effectively incentivized hospitals to report syndromic surveillance data.
  • Even though patients are increasingly visiting urgent care clinics instead of emergency departments, only three jurisdictions require these facilities to report syndromic surveillance data.51 Without broader reporting requirements, state health departments may be missing opportunities to detect and respond quickly to emerging threats. Interviewees said that a main barrier to expanding urgent care reporting is the cost to the state for establishing connections and onboarding these facilities.

Action items

  • States’ public health agencies should measure the participation of urgent care centers in syndromic surveillance reporting and consider, to the extent these facilities provide a significant and growing proportion of emergency care, whether new policies or initiatives are needed to encourage or require more urgent care centers to report syndromes.
  • State public health agencies should take an inventory of their reporting capacities, statutes, regulations, and policies to identify opportunities to increase reporting among eligible health care facilities that have not recently sent data to the NSSP BioSense Platform.52

Complete findings

How do states govern syndromic surveillance reporting?
  • While almost all jurisdictions have active statewide syndromic surveillance programs, two state public health agencies interviewed had limited programs, one that covered symptoms for a few conditions only (e.g., COVID-19) and another in which data was reported in only some counties.
  • Most syndromic surveillance data is provided voluntarily. Only 13 jurisdictions require statewide syndromic surveillance reporting. Three have statutes or regulations that enable their state health department to establish a syndromic surveillance system, but these laws either do not mandate reporting or only specify that the health authority may require reporting. Pew’s count excludes jurisdictions that collect syndromic surveillance data under authorities established through broader statutes or regulations.
    • Four jurisdictions explicitly require providers to report syndromic surveillance data via automated electronic means, while four require generic electronic reporting.
  • Officials in 10 jurisdictions that did not already require syndromic surveillance reporting said that instituting a reporting requirement would improve reporting within their state, whereas officials in eight believed that it would not, in part because participation was high already and incentivized by CMS’ Promoting Interoperability Program.
  • In interviews, officials in five jurisdictions said that requiring syndromic surveillance reporting could be accomplished through the regulatory process. Officials in two indicated that they would probably need legislation to require syndromic reporting.
  • One state includes a list of specific syndromes in regulation. Others provide a list of syndromes on their websites or in nonbinding guidance. Limiting the symptoms, signs, and syndromes that should be reported may prevent public health departments from tracking conditions of interest and identifying unexpected threats.
  • Five of the 13 jurisdictions that require syndromic surveillance reporting specify data standards in statute or regulation. It is more common for states to communicate required standards via onboarding and implementation guides. States may benefit from the more common approach, which enables public health agencies to adapt to new standards and technologies.
  • Five of the 13 jurisdictions that require syndromic surveillance have an enforcement provision in statute or regulation. However, no official interviewed shared instances in which reporters were penalized for not complying with data-reporting requirements.

In practice, state public health officials reported that they receive syndromic surveillance data most often from emergency departments, less often from urgent care centers, and least often from outpatient clinics.

  • Emergency departments
    • Of the 43 jurisdictions whose public health officials Pew interviewed, about two-thirds reported receiving data from 75% or more of the emergency departments in their state.
    • Only one state official reported receiving less data than that; the remaining responses were not specific enough to further categorize.
    • Several jurisdictions did not specifically comment on the presence of emergency department data, though the majority of these reported receiving hospital data that likely encompassed emergency as well as inpatient data.
  • Urgent care
    • At the time of these interviews, only one jurisdiction reported receiving data from 75% or more of the available urgent care centers.
    • Several jurisdictions reported receiving limited urgent care data. Some had invested in building electronic connections to urgent care centers in communities that lack an emergency department, or to urgent care centers affiliated with hospitals or health systems that were also providing emergency department data.
    • Urgent care participation may increase in the coming years, as several jurisdictions reported that they are working to expand relationships with and connections to these facilities.
    • Two jurisdictions, however, explicitly reported difficulty in previous expansion attempts, including legal limits to current authorizations. Officials said cost may be a barrier to getting small and independent urgent care clinics to connect with syndromic surveillance systems. It is important to note that CMS’ Promoting Interoperability Program does not incentivize urgent care centers to report syndromic surveillance data.

Urgent Care Centers Can Be a Vital Source of Syndromic Surveillance Data

Urgent care centers could be a growing source of syndromic data. A 2022 literature review conducted by researchers at the University of California-San Francisco found that urgent care visits increased substantially from the late 2000s to the late 2010s and that the increase was associated with a drop in emergency department visits.53 A 2021 study found that “having an open urgent care center in a ZIP code reduced the total number of ED [emergency department] visits by residents in that ZIP code by 17.2%.”54 If states have limited authority to compel urgent care facilities to provide this data, they may miss opportunities to detect emerging threats and estimate the burden of illnesses.

  • Outpatient clinics
    • Several jurisdictions reported receiving data from a subset of outpatient clinics, particularly those that are part of or affiliated with hospitals or health systems.
    • One agency reported that it lacked authorization to require outpatient clinics to report syndromic surveillance data.
How does data flow from reporters to public health?

Syndromic surveillance data usually ends up in the NSSP BioSense Platform and with state health departments, but the data takes different paths depending on how states govern the reporting process.

  • Of the 13 jurisdictions that require syndromic surveillance reporting, about half use their own platform, while in the other half, reporters submit data directly to the NSSP BioSense Platform, which states can then access and analyze.
  • Officials in 28 jurisdictions, including those with and without reporting requirements, described their respective state’s NSSP BioSense reporting structure. Consistent with the legal analysis, slightly more than half said they collect and manage syndromic surveillance data before routing it to the platform, while the others reported that they established direct connections to that platform.

States Can Benefit if They Collect Syndromic Surveillance Data Directly From Hospitals

States that rely on data reported directly to the NSSP BioSense Platform may have access to less data than states that use their own platform, which can require or encourage the reporting of data elements relevant to each jurisdiction’s particular health issues. Officials from one state agency noted that collecting data directly from providers rather than from BioSense helped them tackle specific health issues that required more detailed data than the platform could provide, including special studies related to injury and violence prevention programs.

  • Facilities may report data first to an HIE, a third-party platform that serves as an electronic hub for different organizations to share data, including with the NSSP BioSense Platform. States employ HIEs in a variety of ways: for example, to supplement data that public health agencies need from health care providers, send clinical information securely across providers who see the same patients (such as primary care providers and specialists), or enable an emergency department to quickly access patient medical records.55
    • Respondents in many jurisdictions said that HIEs supported reporting, including several that did so for a substantial proportion of data.
    • Several jurisdictions explicitly reported an absence of HIE involvement in syndromic surveillance.
How do states ensure data quality?

Respondents described a variety of methods to monitor and improve the quality of syndromic surveillance data.

  • Data quality efforts typically begin with the onboarding process, when health departments help health care providers connect their EHR platform to the state’s syndromic surveillance system and ensure that the automated messages are coming through consistently, quickly, and accurately.
    • As one respondent noted, establishing syndromic surveillance connections is the first opportunity for health care facilities to demonstrate that their data conforms to technological and formatting standards.
    • Seven jurisdictions identified challenges related to onboarding: the process is often lengthy, time-consuming, and burdensome for both the department and the provider or facility reporting data; switching vendors sometimes complicates the process; and the required processes and standards are not always clear and can be difficult to communicate to providers.
      • As one official said, onboarding “can take about one to two years sometimes. So, it would be nice if you could have some kind of requirement or … time constraint about how long an onboarding process or vendor switch can take, because a lot of times during these vendor switches, we will experience the syndromic outage [a period during which a facility is not reporting syndromic data]. So, we don’t have complete coverage over all of our counties right now or all of our facility region.”
      • Another official said onboarding required more funding: “With COVID, [funding] got better but prior to COVID it was very minimal. … It takes a lot of training and time to get someone to the point where they can even be comfortable doing these sorts of activities."
    • To support participation in the NSSP, state agencies rely on national messaging guidelines developed by the CDC’s Public Health Information Network (PHIN).56 Used during onboarding, this PHIN messaging guide incorporates standard HL7 message formats.
      • A few respondents acknowledged receiving syndromic surveillance data in an older version of the HL7 standard or otherwise not adhering as strictly to the standard. For example, one respondent described giving less experienced facilities “leeway because, as the state, we have to collect the data.” As the respondent explained, they “accept some data that is not up to par, per se” while they work with facilities to upgrade and improve their submissions.
  • After onboarding, state agencies use a mix of ad hoc and systematic methods to ensure that facilities continue to report data of sufficient quality.
    • For example, one respondent mentioned reaching out “if a facility stops sending a variable that I find useful” so the facility knows it needs to reconnect and send that data.
    • Another respondent described an automated method that uses a computer program to make sure required data is present (a simplified sketch of such a check follows this list).
    • Many respondents found NSSP tools useful and reported reviewing their data summary dashboards to ensure they were receiving sufficiently detailed data from all facilities with which they were connected. However, the frequency with which respondents assess data quality varied dramatically, with some monitoring daily and others quarterly.
    • Respondents also reported that they use dashboards from HIEs and other vendors involved in data transmission to assess data quality.
    • One respondent felt confident that their general monitoring could ensure the timeliness of everything that arrived and the inclusion of the chief complaint but did not think the state agency had sufficient insight into information that might be missing. As this respondent explained, “If they don’t send it, we don’t know.”
    • Another respondent said staff were so focused on onboarding new facilities that they did not have time to work with existing reporters to improve data quality.
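Both the ad hoc outreach and the automated check described in the list above amount to comparing each facility’s recent field-population rate against its own history. The sketch below is a simplified illustration under assumed inputs, not a description of any state’s monitoring tool.

```python
def facilities_dropping_field(current_rates, historical_rates, min_drop=0.3):
    """Flag facilities whose field-population rate fell sharply versus their own history.

    Both arguments map facility -> fraction of messages (0.0 to 1.0) with the field populated.
    """
    flagged = []
    for facility, past_rate in historical_rates.items():
        now_rate = current_rates.get(facility, 0.0)
        if past_rate - now_rate >= min_drop:
            flagged.append((facility, round(past_rate, 2), round(now_rate, 2)))
    return flagged
```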

Immunization reporting

After a measles outbreak claimed dozens of children’s lives and sickened more than 27,600 others from 1989 to 1991, public health leaders concluded that more illnesses might have been prevented if doctors could have more easily determined when their patients were missing vaccinations.57 Within a few years, state and local health departments began establishing Immunization Information Systems (IIS), originally termed immunization registries, to confidentially and electronically record vaccinations administered by health care providers within their jurisdiction.58

In addition to helping providers identify which vaccines their patients have received and still need, the data helps public health officials identify communities and populations with inadequate vaccine access, determine if vaccination rates for certain diseases are low, and plan interventions and allocate resources to increase immunizations. By the time COVID-19 vaccines became available, many IIS enabled providers and public health officials to manage vaccine administration and supply.59

These systems are now used in every U.S. state and territory and some cities. The CDC required providers participating in the agency’s COVID-19 vaccination program to report the vaccinations to their jurisdiction’s IIS or another CDC-designated system.60 Under normal circumstances and for all other vaccinations, each IIS operates according to a unique set of its jurisdiction’s policies, differing in terms of whether reporting is required or voluntary, who must report, which vaccinations they must report and what details they must include, how quickly and by what method they must report, and how health departments manage the immunization data and ensure its quality, among other variables.

Key takeaways

  • Officials in almost every state estimated that most immunizations are electronically reported—50% to 95% via automated electronic systems and 8% to 50% via web portals. Unlike with case reports, only a small portion of immunization data is transmitted via other, more labor-intensive means such as fax. This tracks with policy: State statutes and regulations most commonly cite electronic reporting as an acceptable method for submitting immunization data. In addition, several jurisdictions reference automated electronic reporting and web portals as acceptable reporting methods. However, automated electronic reporting is optional and not required by statute or regulation.
  • Policies may not reflect the full range of vaccination providers or vaccinated individuals. However, practices focused on enabling or encouraging electronic immunization reporting and data exchange across states, along with decades of efforts among practitioners to standardize IIS reporting, have contributed to high percentages of children, adolescents, and adults whose data is in their jurisdiction’s IIS.61
  • The HL7 standard is designed to facilitate data reporting, but its implementation can affect data quality. As a result, respondents had mixed opinions about its impact. Roughly a third of the officials who expressed an opinion regarding HL7 said it produced better data, a third said it undermined data quality, and a third said it had no impact.

Action items

  • Because immunization data comes from multiple sources, jurisdictions with narrow reporting requirements, such as those that specify certain types of health care providers and settings or reporting for some ages, may not capture all immunizations being administered in the state, despite robust reporting in their IIS. Jurisdictions should examine how their immunization reporting requirements and activities to improve electronic reporting align with where and from whom people are receiving vaccinations. For example, if schools and workplaces are administering a significant proportion of vaccines but represent a smaller proportion of electronic reports, jurisdictions could consider what balance of policies and practices can most effectively increase the quantity and improve the quality of immunization data they receive.
  • Jurisdictions should assess the extent to which vaccine providers are using manual tools to report immunizations and identify opportunities to help providers—particularly low-volume and under-resourced providers—go digital.

Complete findings

How do states govern immunization reporting?
  • Immunization reporting is required in some form in 45 jurisdictions and voluntary in three. Seven jurisdictions specify automated electronic reporting as an option, but none requires reporting via that method. Five require reporting via generic electronic methods.
    • While almost all jurisdictions enable adults to determine whether their data is reported to the IIS—either by allowing them to opt out of data sharing or by requiring them to explicitly opt in—several mandate reporting without giving adults the option.
  • Of the 45 jurisdictions with immunization reporting requirements, 34 have requirements for all ages; 11 require reporting for children’s immunizations alone.
    • States that require reporting for children’s immunizations alone vary in how they define ages of children.
    • In interviews, a few respondents identified newborns as a particularly difficult population in which to track vaccinations because the child might not have a name when vaccines are administered (e.g., when newborns are immunized against hepatitis B). Hospital staff sometimes enter generic names (for example, “Baby boy”) into the hospital record and never update the record to reflect the child’s given name before the EHR closes the health encounter and submits the vaccination record to the IIS. Because these records are designed to help patients and providers keep track of immunizations, the records have little to no value if they lack a valid or correct name.
  • Several jurisdictions have requirements for specific immunizations. Examples of this include reporting requirements specifically for COVID-19 vaccines, publicly funded vaccines, or vaccines administered during a public health emergency.
  • Nearly every state has statutes or regulations that address immunization reporting; 30 address it in both statute and regulation, 13 in statute only, and five in regulation only. Only three have neither.
    • In interviews, officials in 13 jurisdictions said that changing immunization policies would require legislative approval or involvement to change statutes. Officials in 10 jurisdictions said they could shift policy through the regulatory process, though two indicated that their states’ regulatory process would require legislative approval or involvement.
  • In almost all jurisdictions, immunization data is reported straight to the state department of health. One state allows immunization data to be reported to local health departments concurrently with the state department, and a few jurisdictions have unclear reporting structures.
  • Forty jurisdictions have immunization statutes and regulations that specify confidentiality or privacy of data shared, though jurisdictions also have policies that apply to medical and public health data broadly, along with laws governing privacy and confidentiality that may affect immunization data exchange. Seven do not list policies that address the confidentiality of immunization data.
  • Eighteen jurisdictions have enforcement mechanisms for providers who are not in compliance with reporting requirements. These most often consist of fines and fees, with some jurisdictions specifying that noncompliance could result in a misdemeanor. No one Pew interviewed reported issuing fines or other penalties.
How is immunization data reported?
  • Fourteen jurisdictions’ statutes or regulations refer to electronic reporting as an acceptable method for submitting immunization data, and five require it. Seven other jurisdictions describe automated electronic methods as an option for reporting in statute or regulation. As previously stated, none requires it. Electronic reporting methods are not mentioned at all in 24.
    • Officials in 15 jurisdictions said that requiring electronic reporting for immunizations would not improve data within their state. Officials in five believed that it would. Officials who did not want requirements pointed to capacity issues for small and rural providers. As one said, “We have a couple of rurally located public health facilities that serve a very small population of patients, and they only give a few vaccinations a month. And for them to purchase an EHR that would allow them to do electronic documentation and electronic data reporting really just … doesn’t make sense for their facility.”
  • Seven jurisdictions cite web portals as an option for reporting. These systems require manual data entry by the reporting vaccine provider.
  • One jurisdiction’s statute refers to mail as an acceptable method of reporting immunization data, one state refers to fax, one refers to telephone, and none refers to email.
  • Twenty-two jurisdictions do not specify any method of transmission. This includes jurisdictions where reporting is required and where it is voluntary.
  • In practice, officials in almost every jurisdiction shared that they most often received data via automated electronic reports and other electronic reporting methods such as web portals. They estimated that:
    • Automated electronic reports accounted for about 50% to 95% of all immunization submissions from health care providers.
    • Web portals accounted for an estimated 8% to 50% of all immunization reports received from providers.
    • Faxes, spreadsheets, and batch uploads made up 5% or less of the total volume of immunization submissions for any one state’s IIS.
  • Many jurisdictions specify in guidance that health departments can receive immunization data via an HIE. Three jurisdictions specify that in statute or regulation. Other jurisdictions do not specify that HIEs can play this role but also do not disallow it.
    • In practice, the officials Pew interviewed said that reporters often use HIEs, though responses ranged from estimating less than 5% of all electronic submissions in one jurisdiction to 100% in another.
    • In interviews, officials from several jurisdictions stated that the HIE only transported the message. In other jurisdictions, however, HIEs may store the data, giving public health agencies access to technology tools readily available in an HIE, such as business intelligence software programs to help analyze or visualize immunization data. Although their capabilities differ, during the pandemic, some HIEs augmented the data by using their master patient indexes (comprehensive, accurate lists of people who receive health care services in their geographic areas) to prevent duplicate records in the IIS and to add new data elements, such as race and ethnicity, to the immunization record.62

Health care providers submit vaccination data directly to the IIS via an HL7 messaging standard connection or through an intermediary, such as an HIE or EHR vendor transmission hub (that is, a technical solution offered by an EHR vendor to collect data from various users of the EHR, combine that data, and report via a single data connection). Providers can also use a web portal administered by the public health agency to directly enter immunization data or submit batch files.

How do states govern access?

Bidirectional access to data

A concept known as bidirectionality means that data is available to health care providers who administer and report vaccinations as well as to local or state health departments that receive the data. If doctors can only submit immunization data to an IIS but cannot query the system for their patients’ vaccine records, then they cannot identify which of their patients need vaccines—a core problem that IIS were designed to solve.63

  • Of the officials in 41 jurisdictions that participated in interviews about immunizations, 35 confirmed that doctors can query their IIS to see patients’ history of vaccinations, but providers cannot do so in six jurisdictions.
  • Many jurisdictions have statutes or regulations that grant people access to their own immunization data or prescribe methods for them to access it. One specifies that patients can access their immunization data via the IIS. Other jurisdictions do not allow patients to access the IIS specifically, but do provide methods for them to get their own data sent to them.
  • Officials in several jurisdictions described successes with new consumer access portals that allow individuals to access their own or their children’s immunization records and determine when they are due for a vaccine.
  • One official described the value of bidirectionality: “It’s always a burden on our health care providers in July and August trying to pull kids’ records. I think this really allows parents to take that step and to see ahead of time what kids are due for and what their school reporting requirements are. … That’s great for our health care providers that maybe can have an easier time this summer and also great for [patients] who can see what they’re due for and what their vaccine record looks like.”
Interstate data exchange

When Americans receive vaccinations in more than one state, their records may be stored in multiple IIS. Patients, providers, and public health officials can benefit from accessing out-of-state data to assess their own, their patients’, or their populations’ vaccination status.

Exchanging data across state lines can be technologically challenging, however. IIS may use different hardware and software, collect different data elements, and record and format data differently. One solution is IZ Gateway, the CDC’s centralized data hub that enables state health departments, federal agencies, health care providers, and patients to share immunization data.64

  • In interviews, respondents from a few jurisdictions said they use the IZ Gateway, although the CDC reported in May 2023 that 60% of all jurisdictions were using the tool to exchange data with each other.65
  • One state limits the immunization data it shares with the federal government, allowing IIS data to be shared only for specific purposes.
    • As one official described their state statutes: “It talks about sharing, using information in that system for the benefit of public health, for the benefit of the registry, for the benefit of [the state]. And so, we interpret that very literally, and if we cannot see a clear, straight line [that] sharing this information with the CDC is going to … enhance and improve our service … we’ve denied requests from our federal partners for that information.”
  • To supplement the immunization records of the veterans in their state, a few state officials reported that they had recently connected to the Department of Veterans Affairs (VA), which administered about 7.4 million COVID-19 vaccines, according to a January 2023 report from the Office of the Inspector General of the U.S. Department of Health and Human Services.66 However, these states are in the minority: According to the same report, as of January 2021, only 12 of 56 state and local immunization programs reported that they received individual-level vaccination data from the VA.
    • An official from one of the states said, “This collaboration with the national VA gives us daily data in real time. That helps with the continuum of care, and we can provide a better service to providers.”
    • Another official described the role that the CDC played: “I will give kudos to CDC and their IZ Gateway project because, for the first time ever starting just a couple of months ago, we are now getting data from the Veterans Health Administration. We have 10 or 11 facilities that serve our VA population that have never reported to the IIS before, and we are getting their data now through the IZ Gateway and with CDC’s provider jurisdictional data exchange initiative.”

One example of interstate cooperation, from Massachusetts and Rhode Island, has been ongoing since the spring of 2022.67 Because they share a border, they also have significant populations that live in one state but receive care in the other. And like many states, their IIS had gaps in patients’ vaccination histories. They started the partnership by assessing the extent to which they could fill in the COVID-19 vaccination records of people tracked in Massachusetts’ immunization system using information in Rhode Island’s immunization system. The initial analysis found that the data exchange could increase the completeness of COVID-19 vaccination records for 62,782 patients.

How do states ensure data quality?

Messaging standards
  • In interviews, respondents from almost all jurisdictions reported using HL7 as their electronic messaging standard. As several respondents reported, this allowed them to use an HL7 implementation guide to maintain quality by setting rules for required or optional data elements.
  • Respondents in many jurisdictions said their systems sent error reports to health care providers when submitted messages contained errors of varying degrees of severity. For example, “fatal” errors result in the rejection of the immunization message, while “informational” errors notify the submitter about the absence of optional data elements after the IIS accepts a message. Health departments could also share summaries of data quality results with immunization message submitters via a web page or other feedback mechanism.
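The severity tiers described above can be modeled as simple rules over validation findings: fatal findings reject the message, informational findings are returned to the submitter without blocking acceptance. The specific rules in the sketch below are invented for illustration; real IIS validation rule sets are far more extensive.

```python
# Illustrative rules only; field names and rule choices are assumptions.
FATAL_CHECKS = {
    "missing_patient_name": lambda m: not m.get("patient_name"),
    "missing_vaccine_code": lambda m: not m.get("vaccine_code"),
}
INFORMATIONAL_CHECKS = {
    "missing_race": lambda m: not m.get("race"),
    "missing_lot_number": lambda m: not m.get("lot_number"),
}

def evaluate_message(message):
    """Classify an immunization message as rejected or accepted, with error lists."""
    fatal = [name for name, check in FATAL_CHECKS.items() if check(message)]
    informational = [name for name, check in INFORMATIONAL_CHECKS.items() if check(message)]
    return {
        "status": "rejected" if fatal else "accepted",
        "fatal_errors": fatal,
        "informational_errors": informational,
    }
```
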
Timeliness
  • Respondents in many jurisdictions reported that they monitor the time between when a vaccine was administered and when it was reported to the IIS; these include jurisdictions without timeliness requirements. (A minimal lag-measurement sketch follows this list.)
  • Respondents in many jurisdictions reported that they cross-referenced the inventory of vaccines released to a particular provider and the doses that provider recorded in the IIS. Health departments can use that information to determine whether providers are sitting on unused products and to identify communities with disproportionately low rates of access to vaccines. However, at least two respondents said that they could not perform inventory analysis for vaccines that are purchased and distributed commercially, as opposed to those purchased and distributed by the government, such as immunizations covered under the Vaccines for Children program.
  • Beyond this research, studies indicate that immunization data was more likely to be accurate and timely if it was reported electronically using the HL7 standard.68
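Timeliness monitoring of the kind described at the top of this list amounts to measuring the gap between administration and receipt. A minimal sketch, assuming each record carries those two dates and a provider name:

```python
from datetime import date

def reporting_lags(records):
    """Return the lag in days between vaccine administration and IIS receipt, per record."""
    return [
        (rec["provider"], (rec["received_date"] - rec["administered_date"]).days)
        for rec in records
    ]

# Example: one provider reporting a day later, another after a week.
sample = [
    {"provider": "Clinic A", "administered_date": date(2024, 3, 1), "received_date": date(2024, 3, 2)},
    {"provider": "Clinic B", "administered_date": date(2024, 3, 1), "received_date": date(2024, 3, 8)},
]
print(reporting_lags(sample))  # -> [('Clinic A', 1), ('Clinic B', 7)]
```
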
Variation in data quality

Respondents had divergent opinions regarding the value of receiving data via HL7 feeds:

  • Respondents in several jurisdictions found that HL7 electronic feeds from providers (for example, EHR or HIE) produced better data because, as intended, these feeds use a standardized format that allows data to be more easily processed and analyzed.
  • Several others, however, reported that HL7 feeds produced lower-quality data or required more time to correct errors or erroneous data feeds. For example, HL7 messaging feeds often come from health care providers that administer high volumes of vaccinations, and the data from these vaccinations often comes from a single EHR system. When the EHR undergoes routine upgrades, it can introduce errors that affect millions of immunization messages.
  • A similar number of respondents found that HL7 feeds and other reporting methods, such as web entry or batch files, produced equivalent levels of data quality.

Even With Standards, Data Quality Remains a Challenge

Past evaluations of IIS statutes and regulations across the nation have highlighted how varying requirements pose challenges for the exchange of high-quality data.69 Efforts to improve data quality have largely centered on practices to standardize and monitor incoming data and resolve issues that arise.70 Automated electronic reporting is a solution to some but not all data problems. As many interviewees expressed, data quality remains an issue, even when the information is transmitted electronically in compliance with HL7. Indeed, the American Immunization Registry Association wrote in an August 2022 paper: “It is possible for a message to meet HL7 standards while having low data quality. Likewise, it is possible for a message to have high data quality without fully meeting HL7 standards.”71

Improvements in data quality
  • Many jurisdictions have data quality teams that actively monitor and assess vaccine data using dashboards or scorecards or by manually examining submissions.
  • One respondent identified the American Immunization Registry Association’s Data at Rest program as a means to measure the completeness, validity, and timeliness of immunization data. The program allows state health agencies to submit de-identified vaccination data to the American Immunization Registry Association for data scoring, the results of which the IIS can use to improve data feeds.72

Additional findings by topic area

Race and Ethnicity Data Across Use Cases

The U.S. health system does not serve all Americans equitably. Indeed, biased systems and policies deny people access to clean air and water, livable wages, health care, and other basic needs—and, as a result, health inequities emerge.73 That’s why public health agencies collect race and ethnicity data: to identify these inequities and design interventions to narrow them. Federal agencies, including the ASTP, CDC, and CMS, are taking steps to address them as well.74 Indeed, equity is central to the 10 essential services of public health.75 As a result, public health organizations have published recommendations to improve the collection of race and ethnicity data.76

In policy, this means:

  • Forty jurisdictions require race or ethnicity data be included in case reports and 34 require it in lab reports.
  • Many require it for immunization reports.
  • Several require it for syndromic surveillance.

In practice, among the jurisdictions whose officials answered questions about race and ethnicity data:

  • All reported collecting data on race and ethnicity through case and lab reporting.
  • Almost all reported collecting race and ethnicity data for immunizations.
  • Several reported collecting race and ethnicity data for syndromic surveillance.

Despite these requirements, however, respondents shared concerns about the quality of race and ethnicity data.

  • Officials in many jurisdictions reported issues with incomplete data, particularly from lab reports.
    • Race and ethnicity data might not be present on lab requisition orders from providers, and labs do not have direct access to patients and cannot ask them for race or ethnicity information. In the HL7 implementation guide for ELR, race and ethnicity fields are required but can be empty, meaning that reporters must send relevant data if they have it but can omit the field and still conform to the standard.
    • A related issue: Officials reported that providers can report race and ethnicity as “unknown” or “other,” resulting in data that is technically complete but not informative. Interview respondents did not quantify the magnitude of missing information, but a 2022 Council of State and Territorial Epidemiologists study estimated 74% completeness for race and 65% completeness for ethnicity in COVID-19 case data and 29% completeness for race and ethnicity in ELR for COVID-19.77 Officials in a few jurisdictions reported that they are working with labs to encourage reporting and limiting the use of hard-coded options such as “unknown” in the race/ethnicity field that allow reporters to submit incomplete data.
    • One respondent said eCRs could be used as a potential source of improved race and ethnicity data in lab reports because this information is more likely to be recorded in a patient’s EHR than in a lab information system.
    • One respondent said that the HL7 implementation guides limit valid values for race and ethnicity fields in immunization reports, which makes it difficult for states to collect race and ethnicity values with a level of granularity that might better represent the vaccinated population.
    • Officials from at least two jurisdictions said they worked with reporters to increase completeness and standardize the race and ethnicity data they receive in syndromic surveillance reports.

Adapting quickly is good, but proactively anticipating is better. Rather than reacting to the next pandemic after it hits, state public health agencies should take steps to update statutes, regulations, agency policies, practices, infrastructure, and expertise today.

Lessons Learned From the COVID-19 Pandemic

The COVID-19 pandemic created many challenges for public health data reporting. First, the proliferation of COVID testing and immunization sites increased the number of labs and health care providers that needed to report test results and vaccinations, including entities such as long-term care facilities, correctional facilities, schools, and mobile sites that did not have the technical capacity to send standardized data in electronic formats. Additionally, many state public health agencies struggled to keep up with the unprecedented volume of reports, which was exacerbated by national requirements to report all test results, not just the positive ones. (Understanding the proportion of negative and positive tests was an important public health indicator during the pandemic.78) On top of that, public health agencies in general have not had access to the millions of results from point-of-care tests that people took—and continue to take—at home.

The pandemic drove a significant increase in the use of eCR for reporting COVID-19. Fewer than 200 health care facilities used the technology in March 2020; as of Sept. 25, 2024, that number had grown to more than 41,700 across all 50 states and two territories.79 In addition, the widespread use of ELR during the pandemic enabled public health agencies to count and report millions of test results per day.80 These experiences provide lessons for states to improve automated electronic reporting overall.

One example of progress during the pandemic comes from Idaho, where the Division of Public Health is using eCR to improve the detection and treatment of multisystem inflammatory syndrome in children (MIS-C).81 This condition is associated with COVID-19 and may not show up in lab results and, consequently, in the lab data reported to public health agencies. Idaho children are often treated in the state’s sole children’s hospital in Boise or across state lines in Salt Lake City, where eCR is widely used. The interstate exchange is facilitated by nationwide eCR infrastructure. As a result, Idaho’s public health agency receives 85% of its notifications about MIS-C via eCR.

  • Batch uploads and web portals can serve as effective interim solutions for providers who do not have access to automated electronic reporting systems.
    • Many jurisdictions created an option for reporters to submit lab results via .CSV spreadsheets, using the national flat file template or their own formats so that reporters could fill in a spreadsheet that could then be converted into ELR messages the state could accept (a rough sketch of such a conversion appears after this list).82
    • Many jurisdictions included .CSV reporting in their definition of ELR because the health department receives the data electronically, even if the process of reporting remains manual on the provider side.
    • Several jurisdictions also created specialized COVID reporting mechanisms, such as web portals for direct data entry or uploading results in bulk for point-of-care testing or facilities unable to create HL7 messages.
    • Several jurisdictions mentioned using SimpleReport, a free, web-based tool developed by the CDC and the United States Digital Service and launched in early 2021, which lets reporters manually enter test results and send them to public health agencies as HL7 ELR messages.
    • State officials described how they changed their reporting protocols to accommodate mass vaccination clinics, long-term care facilities, and other nontraditional reporters that do not have the infrastructure to report immunizations via automated electronic means. For example:
      • One jurisdiction allowed reporters to share large files with multiple records (e.g., via Microsoft Excel spreadsheets) rather than submitting individual reports for each immunization. As the official said, “Our flat file module during COVID really helped with the timeliness of all the vaccinations getting into the system. … I think that was an extremely great thing for mass clinics and providers [that they] did not have to immediately add 1,000 shots into the system. They could just upload the file.”
      • Another state tailored systems for day care centers and school nurses that enabled providers to not only share data more easily but also to monitor population-level trends, such as the student body’s immunization rates.
      • One official described how they worked with community nursing facilities—which lacked the IT infrastructure and staff to buy expensive electronic reporting systems—to develop a system that scanned their paper records and auto-generated and transmitted HL7 messages. “It’s weird, but it works,” the official said.
  • The collection of race and ethnicity data can be improved.
    • Several jurisdictions changed their implementation guides (that is, instructions for submitting immunization data to the IIS) to require race and ethnicity data in immunization reports.
    • One official described how they harnessed social and demographic data from the state’s health information exchange to identify “geographic and racial and ethnic disparities in COVID and COVID vaccination.”
    • Another official reported some progress: “We were able to dramatically improve our race data for COVID. … I think it’s hopefully helping to … serve as a guiding light for how we’re trying to work with data-modernization efforts and to improve reporting and data completeness overall.”
  • States can respond quickly to emerging threats when necessary.
    • Several jurisdictions changed existing reporting rules or issued emergency orders to require electronic reporting of COVID-19 lab results and vaccinations. Several other jurisdictions began mandating submission of syndromic surveillance data through emergency regulations, launched new surveillance focused on COVID-19 symptoms or related ICD codes (a standardized set of codes that health care providers, insurers, and other health care stakeholders use to classify diseases, symptoms, and other health-related issues), added new data to their feeds, or onboarded additional reporting facilities.
    • Some officials said that COVID-19 accelerated their onboarding of facilities or expansion of activities for syndromic surveillance, but a few others said that the pandemic kept staff too busy to onboard new reporters and undertake other efforts related to improving syndromic surveillance for their state.
    • A few jurisdictions reported setting up separate surveillance systems to handle the high volume and unique reporting requirements for COVID data.
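
As a rough illustration of the batch-upload approach described in the first set of bullets above, this Python sketch converts rows from a simple .CSV file of test results into skeletal HL7 v2 ELR (ORU^R01) messages. The column names, facility identifiers, and field mappings are placeholders invented for the example; they do not reproduce the national flat file template or any state’s actual conversion logic.

    # Minimal sketch: turn a flat .CSV of test results into skeletal HL7 v2 ELR messages.

    import csv
    import io
    from datetime import datetime

    CSV_SAMPLE = "\n".join([
        "patient_id,last_name,first_name,dob,test_code,test_name,result,collected",
        "1001,DOE,JANE,1980-01-01,94500-6,SARS-CoV-2 RNA,Detected,2024-01-15",
        "1002,ROE,RICHARD,1975-12-31,94500-6,SARS-CoV-2 RNA,Not detected,2024-01-16",
    ])

    def row_to_hl7(row: dict) -> str:
        """Build a minimal ORU^R01 message (MSH, PID, and OBX segments) from one CSV row."""
        now = datetime.now().strftime("%Y%m%d%H%M%S")
        msh = (f"MSH|^~\\&|CSV_CONVERTER|REPORTING_FACILITY|ELR_RECEIVER|STATE_PH|{now}||"
               f"ORU^R01|MSG{row['patient_id']}|P|2.5.1")
        pid = (f"PID|1||{row['patient_id']}||{row['last_name']}^{row['first_name']}||"
               f"{row['dob'].replace('-', '')}")
        obx = (f"OBX|1|ST|{row['test_code']}^{row['test_name']}^LN||{row['result']}"
               f"||||||F|||{row['collected'].replace('-', '')}")
        return "\r".join([msh, pid, obx])  # HL7 v2 segments are separated by carriage returns

    messages = [row_to_hl7(row) for row in csv.DictReader(io.StringIO(CSV_SAMPLE))]
    print(messages[0])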

Adapting quickly is good, but proactively anticipating is better. Rather than reacting to the next pandemic after it hits, state public health agencies should take steps to update statutes, regulations, agency policies, practices, infrastructure, and expertise today. Building on this progress now will enable agencies to keep advancing data modernization, not only during the next pandemic but also on a day-to-day basis for more routine disease detection, prevention, and response.

Opportunities for Action

Based on its findings, Pew has identified the following opportunities for federal and state policymakers, health care providers, and public health practitioners to improve data sharing.

States should assess their baseline reporting capacities and policies. While some data-sharing barriers are common across jurisdictions, each state faces its own particular balance of challenges and opportunities. To improve the quantity and quality of data they receive from health care providers, state public health agencies first need to measure their own baseline performance and then prioritize areas that need the most improvement, develop and implement evidence-based policies and strategies, and track progress. State public health agencies should also review the improvements they made to electronic reporting during the COVID-19 pandemic and identify actions for sustaining that progress. This is especially important for case reporting, which needs more overall improvement than other data streams in most states. The CDC and groups such as the Public Health Informatics Institute, as part of Data Modernization Initiative activities, have made tools available for such assessments.83

Sustaining data exchange and quality requires ongoing maintenance and validation; ideally, state officials should be able to regularly evaluate the capacity and capability of the providers that report data and the public health agencies that receive it. Metrics can include the volume and quality of data being reported, such as timeliness, standardization, and completeness (especially inclusion of the patient’s contact information and race/ethnicity data). States can also measure the extent to which under-resourced and low-volume reporting providers can use automated electronic reporting systems or other options, and the capacity of public health agencies to receive, store, and analyze data. With this information, states can develop more tailored and effective strategies for increasing, improving, and sustaining the data that flows from health care providers to public health agencies. Officials would benefit from comparing the quality of their state’s data, the maturity of their reporting infrastructure, funding and staffing levels, and data-sharing and related privacy policies to those of peer states, and then connecting with and learning from peers that are either experiencing or have already addressed similar issues. Clearly understanding their state’s particular strengths and needs is also critical for officials as they invest public funding in public health data modernization. States can also share their baselines with federal agencies to inform national data-modernization efforts.

The experience of the Minnesota Department of Health demonstrates the value of assessing the volume and quality of automated electronic reports. During certain months of the COVID-19 pandemic, the department recorded the intake of more than 20,000 electronic initial case reports per month.84 It further determined that the reports it received and evaluated included a high proportion of complete data on gender, race, and phone number.
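
The sketch below illustrates, under assumed field names and a made-up record layout, how an agency might compute a few baseline metrics like these from its own intake records: monthly report volume, the median lag between specimen collection and receipt, and the share of reports with a populated race field. It is not a depiction of Minnesota’s or any other jurisdiction’s actual system.

    # Minimal sketch: monthly volume, collection-to-receipt lag, and race completeness
    # from a log of received reports. Field names are assumptions for this example.

    from collections import defaultdict
    from datetime import date
    from statistics import median

    reports = [
        {"collected": date(2024, 1, 2), "received": date(2024, 1, 3), "race": "2106-3"},
        {"collected": date(2024, 1, 5), "received": date(2024, 1, 9), "race": ""},
        {"collected": date(2024, 2, 1), "received": date(2024, 2, 2), "race": "2054-5"},
    ]

    by_month = defaultdict(list)
    for r in reports:
        by_month[r["received"].strftime("%Y-%m")].append(r)

    for month, batch in sorted(by_month.items()):
        lags = [(r["received"] - r["collected"]).days for r in batch]   # days from collection to receipt
        race_done = sum(1 for r in batch if r["race"]) / len(batch)     # share with race populated
        print(f"{month}: volume={len(batch)}, median lag={median(lags)} days, "
              f"race completeness={race_done:.0%}")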

Federal and state policymakers should consider flexible electronic reporting policies for under-resourced or low-volume reporters. Rather than requiring providers to use tools they cannot afford, some jurisdictions allow providers that fall below reporting-volume thresholds (e.g., fewer than 30 lab reports per month in one state) to use alternative technologies such as web portals and batch uploads (e.g., .CSV files). These options are not fully automated like eCR and ELR, but they still enable public health agencies to receive the data in a standardized format that requires less manual processing than data submitted via fax, email, or phone. Web portals and batch uploads also require less investment and technical expertise from providers.

As they promote the development and adoption of automated electronic reporting systems, federal policymakers at the CDC, CMS, and ASTP and state policymakers should continue to examine the extent to which under-resourced and low-volume reporting providers can comply with reporting requirements and consider how blanket reporting requirements can exacerbate health inequities. For example, if under-resourced providers in rural areas cannot report data, public health agencies may not be able to design programs that effectively address health concerns in those communities. Federal and state policymakers should also work with providers, public health agencies, and other stakeholders to set more flexible policies that may not make automated electronic reporting universal but nevertheless incrementally improve reporting from all providers. Particularly as they move toward certifying aspects of public health systems, federal agencies should also examine how some state public health agencies themselves are under-resourced and lack access to the technological infrastructure and staffing to collect, analyze, and store more electronic data. Over the long term, policymakers should also address the underlying inequities that deny health care providers access to the resources they need to invest in electronic reporting systems.

The federal government and public health agencies should collaborate closely to ensure that data modernization accommodates as many different agencies and providers as possible. Health care providers and public health agencies are modernizing systems to exchange data in different ways. During the pandemic, local health departments reported using a staggering array of systems, of varying sophistication, to manage and exchange data.85 State health departments also have methods for exchanging data that range from nascent to highly mature.86 Even when providers and health agencies use the same standards, the quality and completeness of the data they exchange can vary significantly. This can happen when health care providers and health agencies implement standards into different legacy systems—akin to the variations in performance that may emerge if two people try to run the same new operating system on two different 25-year-old computers. These circumstances contribute to a wide range in how states and localities electronically exchange data.

As the federal government leads efforts to facilitate, streamline, and further standardize public health data exchange, the CDC and ASTP need to ensure that improvements are based on what health care providers and public health agencies can viably achieve. Yet providers and public health agencies have limited capacity to modernize their own data exchange systems, let alone to participate proactively and robustly in these federal efforts to strengthen data-sharing policies, processes, and standards. Federal, state, and local governments should help these stakeholders participate meaningfully in efforts such as the Data Modernization Initiative. In addition, data exchange partners should take advantage of those opportunities to evolve and build on existing standards that facilitate data exchange through interoperable systems.

Conclusion

Without timely, complete, and standardized clinical data from health care providers, public health practitioners are struggling to detect, prevent, and treat some of America’s most pressing health threats. New technologies can improve this flow of information, but practical, financial, legal, and technological barriers are standing in the way.

Fortunately, there is cause for optimism. For example, although case data is still shared too often via fax and phone, the widespread use of ELR and the rapid adoption of eCR for COVID-19 show that it is possible to automate and digitize case reporting broadly. States also developed novel solutions during the pandemic to improve the collection of case, lab, and immunization data. And even though syndromic surveillance reporting is required in only 13 jurisdictions, participation is high nationwide—and public health agencies are using it in innovative ways to address a wide variety of diseases and conditions for which the system was never designed.

Health care providers and public health agencies are modernizing their data infrastructure and reporting and collecting more data using automated electronic systems, and they must keep this momentum going. While each state has particular strengths and needs, and there are few one-size-fits-all solutions, the following actions could help states determine which next steps are right for them:

  • Taking an inventory of their data-reporting statutes, regulations, agency policies, and practices to identify where they are robust and where they need to improve and using that information to drive change.
  • Ensuring that under-resourced health care providers who cannot afford automated electronic reporting systems can still use other tools that allow them to share timely, complete, and standardized data, while, over the longer term, helping those providers adopt modernized data systems.
  • Identifying the extent to which nontraditional health care providers such as urgent care centers, pharmacies, and schools are generating valuable data but are not reporting it.
  • Actively participating in federal data modernization initiatives—and encouraging local health departments to do the same—to ensure that these efforts meet the diverse needs of each state and community.

When Benjamin Franklin said an ounce of prevention is worth a pound of cure, he wasn’t far off. By some estimates, every dollar spent on public health yields about $14 in health care savings—not to mention the incalculable benefits for people who stay well. By dedicating time and resources to modernize their public health data infrastructure, states can harness the power of health information to design even more informed, effective, and equitable interventions that bolster public health’s already significant return on investment.87

Methodology

State policy scan

Pew contracted with Mathematica to review and catalog policies (both administrative and statutory) in all 50 states and Washington, D.C., regarding the use of electronic data exchange in case reporting, lab reporting, syndromic surveillance, and immunization information systems.

From May through August 2021, researchers examined three aspects of public health data reporting:

  1. Statutes, regulations, and agency policies that govern reporting and set forth requirements for who, what, when, and how information should be reported.
  2. Technical requirements and methods for public health data exchange.
  3. Information about the systems used by public health agencies to receive, track, and analyze data.

Mathematica established a search strategy based on keywords for each use case and a hierarchy of sources, prioritizing state and local health agency websites and using national sources and other general literature available via public search engines to fill gaps as needed.

One member of an 11-person research team conducted a primary search for the keywords for each use case in each state or local jurisdiction, and another member of the team performed a secondary review to confirm findings and fill gaps. A senior member of the team reviewed for quality assurance. Two subject matter experts who were not embedded in the project team conducted an additional quality assurance review of the first 13 state or local jurisdictions to evaluate the effectiveness of the search strategy.

Mathematica staff analyzed findings by documenting and categorizing pertinent details from sources based on research questions developed for the three aspects of public health reporting described above. Mathematica subsequently tabulated categories and produced reports summarizing state policy findings for Pew.

Public health agency interviews

To validate the findings of the policy scan and gain practitioners’ perspectives on challenges and opportunities related to improving public health data reporting, Mathematica interviewed 266 state epidemiologists, immunization registry managers, informatics managers, health department legal counsels, and other public health officials in 48 jurisdictions between October 2022 and April 2023. Mathematica conducted interviews with legal counsels in 22 jurisdictions, with surveillance program staff in 43 jurisdictions, and with immunization program staff in 41 jurisdictions. Mathematica developed a structured interview guide based on research questions developed for the three aspects of public health reporting described above. Questions for legal counsel were focused on verifying findings from the state policy scan while questions for surveillance and immunization program staff centered on current practices in receiving public health data. Core components of questions about case, lab, syndromic, and immunization data were largely the same, such as those that asked respondents to describe their state’s organizational structure for reporting or efforts to monitor compliance. However, the interview guide was also tailored to reflect each state’s laws and type of reporting. Pew, the Council of State and Territorial Epidemiologists, and the American Immunization Registry Association reviewed and provided feedback on the interview protocol before interviews began.

A team of eight interviewers, initially working in pairs, conducted the interviews. Most interviews were recorded and professionally transcribed; in the rare instances when respondents preferred not to be recorded, a second interviewer took long-form notes instead. Researchers who did not conduct the interviews then coded relevant passages in the transcriptions or notes based on a pre-defined coding scheme established to identify the type of data (e.g., case, lab) and the specific topic (e.g., data management, reporting methods requirement and experience) addressed in each passage. A separate team of three Mathematica and three Pew staffers reviewed the coded passages to identify and summarize themes using NVivo qualitative analysis software.

Because follow-up questions differed between interviews, this report generally describes the relative prevalence of observed themes rather than a precise numerical count of how often specific phenomena occurred (see Table 2).

Table 2

Common Terms for Measuring Prevalence of Themes From Interviews

Term         Prevalence (number of jurisdictions)
Few          3-4
Several      5-10
Many         11-30
Most         31-37
Almost all   38 or more

Verification

Pew provided officials opportunities to verify the content and interpretation of their jurisdiction’s policies.

Mathematica developed jurisdiction-specific documents (policy summaries) with information it obtained during the policy scan. Each policy summary contained details on (1) laws and regulations governing reporting; (2) required reporting entities; (3) reporting methods specified by law; (4) whether automated electronic reporting is required by law; (5) privacy and consent requirements; and (6) enforcement mechanisms specified in law. Mathematica verified policy summaries either via email exchange with legal counsel or in discussion with them during the public health agency interviews.

Following verification, Pew and Mathematica additionally reviewed all policy summaries and made revisions where inconsistencies existed; for example, some summaries identified emergency departments as required reporters for syndromic surveillance despite the absence of any syndromic surveillance requirement in that state’s statutes or regulations. Revisions were supported by published policies, interview findings, or both. Mathematica then contacted all jurisdictions with previously verified summaries, either to provide an opportunity to review any revisions or to reconfirm the summary where no revisions were made.

Pew also conducted additional rounds of fact and data checks of all report content between 2023 and 2024 to ensure that findings from the policy scan and interviews were supported by relevant language contained in statutes, regulations, and transcripts. Finally, two external reviewers—one with expertise in epidemiology practices at health departments and one with expertise in public health law—provided feedback on this report.

Limitations

Policy findings in this report reflect relevant language specified in statutes or regulations pertaining to case, lab, syndromic, and immunization reporting.

While policy scans included statutes, regulations, and agency policies focused on case, lab, syndromic, and immunization reporting, the research did not comprehensively capture relevant or applicable mentions of public health reporting in, for example, broader hospital licensing and data privacy laws or other authorities not further specified in regulation.

Internal standard operating procedures were established to consistently search keywords and apply inclusion and exclusion criteria for relevant language. However, variation in where keywords appeared in the text, along with occasional ambiguity in the language surrounding them, may still have led to inconsistencies in applying those criteria.

Although lawyers were involved in verifying state policy summaries and reviewing report content, they were not involved in developing the research questions or in conducting the policy scan and final fact and data checks. Therefore, policy findings in this report do not necessarily reflect the full range of requirements states may establish based on (1) legal interpretations of laws; (2) how laws are implemented in jurisdictions; (3) laws beyond those that specifically pertain to case, lab, syndromic surveillance, and immunization reporting; or (4) other administrative policies or guidance. Also, while Pew staff made every effort to ensure that the policies described in this report were current as of spring 2023, jurisdictions may have implemented new policies or practices that are not reflected here.

Pew and Mathematica asked all jurisdictions to verify our interpretation of relevant statutes and regulations. We were not able to obtain state policy verification from the following jurisdictions: Arizona, the District of Columbia, Florida, Idaho, Illinois, Iowa, Kansas, Maine, Massachusetts, Nebraska, New Hampshire, New Jersey, North Dakota, Oklahoma, South Carolina, Texas, Wisconsin, and Wyoming. Mathematica pursued interviews with health officials in all 50 states and Washington, D.C., but we were not able to conduct any interviews in Maine, Wyoming, or Kansas.

Key informant interview findings may be subject to biases (e.g., recall, information) that could have been present during the public health agency interviews or data analysis, despite efforts to standardize the approach and processes for each stage. Although our interviews covered topics related to jurisdictions’ exchange of data with local, territorial, and tribal governments, we interviewed only state and District of Columbia officials and thus this report does not reflect the perspectives of local, territorial, or tribal institutions.

Endnotes

  1. “Improved State and Federal Data Policies Needed to Strengthen Public Health Systems,” Molly Murray, The Pew Charitable Trusts, Oct. 26, 2021, https://www.pewtrusts.org/en/research-and-analysis/articles/2021/10/26/improved-state-and-federal-data-policies-needed-to-strengthen-public-health-systems. Sharon LaFraniere, “‘Very Harmful’ Lack of Data Blunts U.S. Response to Outbreaks,” The New York Times, Sept. 20, 2022, https://www.nytimes.com/2022/09/20/us/politics/covid-data-outbreaks.html.
  2. “Surveillance Case Definitions for Current and Historical Conditions,” Centers for Disease Control and Prevention, https://ndc.services.cdc.gov/.
  3. “BioSense Platform,” Centers for Disease Control and Prevention, April 22, 2024, https://www.cdc.gov/nssp/php/about/about-nssp-and-the-biosense-platform.html.
  4. “Data Modernization Initiative,” Centers for Disease Control and Prevention, https://www.cdc.gov/surveillance/data-modernization/index.html.
  5. “What Is eCR?” Centers for Disease Control and Prevention, Sept. 9, 2024, https://www.cdc.gov/ecr/php/about/index.html.
  6. “What Are Electronic Health Records (EHRs)?” Office of the National Coordinator for Health Information Technology, https://www.healthit.gov/topic/health-it-and-health-information-exchange-basics/what-are-electronic-health-records-ehrs.
  7. “Electronic Laboratory Reporting (ELR),” Centers for Disease Control and Prevention, April 11, 2024, https://www.cdc.gov/electronic-lab-reporting/php/about/index.html.
  8. “Summary of the HIPAA Privacy Rule,” U.S. Department of Health and Human Services, https://www.hhs.gov/hipaa/for-professionals/privacy/laws-regulations/index.html.
  9. “HL7 International,” Health Level Seven International, https://www.hl7.org/.
  10. “About AIRA,” American Immunization Registry Association, https://www.immregistries.org/about-aira.
  11. “National Syndromic Surveillance Program (NSSP),” Centers for Disease Control and Prevention, https://www.cdc.gov/nssp/index.html.
  12. “The CDC Policy Process,” Centers for Disease Control and Prevention, March 4, 2021, https://www.cdc.gov/polaris/php/policy-resources-trainings/definition-of-policy.html.
  13. Guthrie S. Birkhead, Michael Klompas, and Nirav S. Shah, “Public Health Surveillance Using Electronic Health Records: Rising Potential to Advance Public Health,” Frontiers in Public Health Services and Systems Research 4, no. 5 (2015): 25-32.
  14. Kenneth D. Mandl et al., “Implementing Syndromic Surveillance: A Practical Guide Informed by the Early Experience,” Journal of the American Medical Informatics Association 11, no. 2 (2004): 141-50, https://academic.oup.com/jamia/article/11/2/141/883007?login=false.
  15. Lori Walker, “COVID-19 Electronic Reporting and Data Quality in Oklahoma,” Oklahoma State Department of Health, 2023.
  16. “10 Essential Public Health Services,” Centers for Disease Control and Prevention, May 16, 2024, https://www.cdc.gov/public-health-gateway/php/about/?CDC_AAref_Val=https://www.cdc.gov/publichealthgateway/publichealthservices/essentialhealthservices.html.
  17. “Health Inequities and Their Causes,” World Health Organization, Feb. 22, 2018, https://www.who.int/news-room/facts-in-pictures/detail/health-inequities-and-their-causes.
  18. “Characteristics of Office-Based Physician Visits, 2016,” Centers for Disease Control and Prevention, https://www.cdc.gov/nchs/products/databriefs/db331.htm. “Fast Facts on U.S. Hospitals, 2024,” American Hospital Association, https://www.aha.org/statistics/fast-facts-us-hospitals.
  19. “Surveillance Case Definitions for Current and Historical Conditions,” Centers for Disease Control and Prevention.
  20. Celia Hagan, Emily Holubowich, and Tamara Criss, “Driving Public Health in the Fast Lane,” Council of State and Territorial Epidemiologists, 2019, https://resources.cste.org/data-superhighway/mobile/index.html.
  21. Erika Samoff et al., “Improvements in Timeliness Resulting From Implementation of Electronic Laboratory Reporting and an Electronic Disease Surveillance System,” Public Health Reports 128, no. 5 (2013): 393-98, https://doi.org/10.1177/003335491312800510.
  22. “Potential Effects of Electronic Laboratory Reporting on Improving Timeliness of Infectious Disease Notification—Florida, 2002-2006,” Centers for Disease Control and Prevention, 2008, https://www.cdc.gov/mmwr/preview/mmwrhtml/mm5749a2.htm.
  23. Amy E. Metroka et al., “Effects of Health Level 7 Messaging on Data Quality in New York City’s Immunization Information System, 2014,” Public Health Reports 131, no. 4 (2016): 583-7, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4937120/.
  24. “Trusted Exchange Framework and Common Agreement (TEFCA),” Office of the National Coordinator for Health Information Technology, https://www.healthit.gov/topic/interoperability/policy/trusted-exchange-framework-and-common-agreement-tefca.
  25. “Guidance for Hospitals and Acute Care Facilities Reporting of Respiratory Pathogen, Bed Capacity, and Supply Data to CDC’s National Healthcare Safety Network (NHSN),” U.S. Department of Health and Human Services, 2023.
  26. “Public Health Infrastructure Grant Overview,” Centers for Disease Control and Prevention, March 14, 2024, https://www.cdc.gov/infrastructure-phig/about/?CDC_AAref_Val=https://www.cdc.gov/infrastructure/phig/program-overview.html. “Programs & Projects: Structure of ELC,” Centers for Disease Control and Prevention, April 11, 2024, https://www.cdc.gov/epidemiology-laboratory-capacity/php/our-work/.
  27. “Public Health Data Strategy,” Centers for Disease Control and Prevention, https://www.cdc.gov/public-health-data-strategy/php/index.html. “Data Modernization Initiative (DMI),” Centers for Disease Control and Prevention, May 15, 2024, https://www.cdc.gov/data-modernization/php/about/dmi.html?CDC_AAref_Val=https://www.cdc.gov/surveillance/data-modernization/basics/what-is-dmi.html.
  28. “34 CFR Part 99—Family Educational Rights and Privacy,” U.S. Department of Education, https://studentprivacy.ed.gov/ferpa.
  29. Stephen B. Thacker, Judith R. Qualters, and Lisa M. Lee, “Public Health Surveillance in the United States: Evolution and Challenges,” Centers for Disease Control and Prevention, 2012, https://www.cdc.gov/mmwr/preview/mmwrhtml/su6103a2.htm.
  30. “Health Information Privacy: Public Health,” U.S. Department of Health and Human Services, https://www.hhs.gov/hipaa/for-professionals/special-topics/public-health/index.html.
  31. “COVID-19 Pandemic Response, Laboratory Data Reporting: CARES Act Section 18115,” U.S. Centers for Disease Control and Prevention, 2022, https://archive.cdc.gov/#/details?url=https://www.cdc.gov/coronavirus/2019-ncov/downloads/lab/HHS-Laboratory-Reporting-Guidance-508.pdf.
  32. Sarah Kliff and Margot Sanger-Katz, “Bottleneck for U.S. Coronavirus Response: The Fax Machine,” The New York Times, July 13, 2020, https://www.nytimes.com/2020/07/13/upshot/coronavirus-response-fax-machines.html. Sharon LaFraniere, “‘Very Harmful’ Lack of Data.”
  33. “National Trends in Hospital and Physician Adoption of Electronic Health Records,” Office of the National Coordinator for Health Information Technology, 2023, https://www.healthit.gov/data/quickstats/national-trends-hospital-and-physician-adoption-electronic-health-records.
  34. “COVID-19 Electronic Case Reporting for Public Health Agencies,” Centers for Disease Control and Prevention, Nov. 12, 2021, https://archive.cdc.gov/www_cdc_gov/coronavirus/2019-ncov/php/electronic-case-reporting.html. Kimberly Knicely et al., “Electronic Case Reporting Development, Implementation, and Expansion in the United States,” Public Health Reports 139, no. 4 (2024): 432-42, https://doi.org/10.1177/00333549241227160. Emilie Lamb et al., “Update on Progress in Electronic Reporting of Laboratory Results to Public Health Agencies—United States, 2014,” Morbidity and Mortality Weekly Report 64, no. 12 (2015): 328-30, https://pubmed.ncbi.nlm.nih.gov/25837244/.
  35. Catherine J. Staes et al., “Response to Authors of ‘Barriers to Hospital Electronic Public Health Reporting and Implications for the COVID-19 Pandemic,’” Journal of the American Medical Informatics Association 27, no. 11 (2020): 1821-22, https://doi.org/10.1093/jamia/ocaa191.
  36. “COVID-19 Electronic Case Reporting for Public Health Agencies,” Centers for Disease Control and Prevention.
  37. “eCR in Action,” Centers for Disease Control and Prevention, Aug. 13, 2024, https://www.cdc.gov/ecr/php/stories/; “Building the Right Foundation,” Centers for Disease Control and Prevention, https://archive.cdc.gov/www_cdc_gov/surveillance/data-modernization/priorities/building-right-foundation.html.
  38. Monica Gamez, “Electronic Faxing (eFaxing) in Texas,” Office of Public Health Data Strategy and Modernization, Texas Department of State Health Services, 2023.
  39. “Laboratory Data: Blazing New Pathways for Connection,” Centers for Disease Control and Prevention, https://www.cdc.gov/surveillance/data-modernization/snapshot/2022-snapshot/stories/laboratory-data-connection.html.
  40. “About Mpox,” Centers for Disease Control and Prevention, Aug. 6, 2024, https://www.cdc.gov/mpox/about/. “WHO Director-General’s Statement at the Press Conference Following IHR Emergency Committee Regarding the Multi-Country Outbreak of Monkeypox,” news release, July 23, 2022, https://www.who.int/director-general/speeches/detail/who-director-general-s-statement-on-the-press-conference-following-IHR-emergency-committee-regarding-the-multi--country-outbreak-of-monkeypox--23-july-2022.
  41. “eCR General Information,” Association of Public Health Laboratories, https://ecr.aimsplatform.org/general/.
  42. “Final Report of the Health Information Technology Advisory Committee on Public Health Data Systems,” U.S. Department of Health and Human Services’ Health Information Technology Advisory Committee (HITAC), 2022. “CDC Advisory Committee to the Director (ACD) Data and Surveillance Workgroup (DSW),” Centers for Disease Control and Prevention, 2022, https://www.cdc.gov/about/advisory-committee-director/pdf/February23-2024-ACD-DSW.pdf.
  43. Jim Collins, “Michigan Uses Report Cards to Enhance Electronic Laboratory Reporting Data Completeness,” Michigan Department of Health and Human Services, 2023.
  44. Nicholas E. Kman and Daniel J. Bachmann, “Biosurveillance: A Review and Update,” Advances in Preventive Medicine (2012): 301408, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3254002/.
  45. Nicholas E. Kman and Daniel J. Bachmann, “Biosurveillance: A Review and Update.”
  46. “About NSSP,” Centers for Disease Control and Prevention, https://www.cdc.gov/nssp/php/about/.
  47. “About NSSP,” Centers for Disease Control and Prevention.
  48. Deborah Gould, David Walker, and Paula Yoon, “The Evolution of BioSense: Lessons Learned and Future Directions,” Public Health Reports 132, no. 1 (2017): 7S-11S, https://www.researchgate.net/publication/318948726_The_Evolution_of_BioSense_Lessons_Learned_and_Future_Directions.
  49. Kavya Sekar and Angela Napili, “Tracking COVID-19: U.S. Public Health Surveillance and Data,” Congressional Research Service, 2020, https://crsreports.congress.gov/product/pdf/R/R46588.
  50. Kacey Potis and Amanda Dylina Morse, “RHINO Community of Practice: Using Syndromic Data to Monitor Visits for Patients Experiencing Homelessness,” Washington State Department of Health, 2023, https://doh.wa.gov/sites/default/files/legacy/Documents/5230//420-258-RHINOCoP-Homelessness.pdf; “Oregon Wildfire Health Impact Tracked,” Centers for Disease Control and Prevention National Syndromic Surveillance Program (NSSP), April 4, 2024, https://www.cdc.gov/nssp/php/story/oregon-wildfire-health-impact-tracked.html; “Florida Uses Emergency Department and Urgent Care Data to Strengthen Case-Based Surveillance of Dengue,” Centers for Disease Control and Prevention, https://archive.cdc.gov/www_cdc_gov/nssp/success-stories/Florida_Uses_Emergency_and_Urgent_Care_Data_to_Strengthen_Surveillance_of_Dengue.html.
  51. Lindsay Allen, Janet R. Cummings, and Jason M. Hockenberry, “The Impact of Urgent Care Centers on Nonemergent Emergency Department Visits,” Health Services Research 56, no. 4 (2021): 721-30, https://pubmed.ncbi.nlm.nih.gov/33559261/.
  52. “National Emergency Department Visits for COVID-19, Influenza, and Respiratory Syncytial Virus,” Centers for Disease Control and Prevention, https://archive.cdc.gov/#/details?q=https://www.cdc.gov/nssp/participation-coverage-map.html&start=0&rows=10&url=https://www.cdc.gov/ncird/surveillance/respiratory-illnesses/index.html. “About NSSP,” Centers for Disease Control and Prevention.
  53. “Annual Report: Trends in Utilization of Urgent Care and Telehealth Services,” Healthforce Center, University of California-San Francisco, 2022, https://www.ucop.edu/uc-health/_files/prop-56/2022-prop-56-gme-report-on-urgent-care-telehealth-final.pdf.
  54. Lindsay Allen, Janet R. Cummings, and Jason M. Hockenberry, “The Impact of Urgent Care Centers on Nonemergent Emergency Department Visits.”
  55. “Methods States Use to Organize and Promote Health Information Exchange,” Civitas Networks for Health, 2022, https://www.civitasforhealth.org/wp-content/uploads/2023/01/01-Methods-States-Use-to-Promote-HIEs-FINAL.pdf.
  56. “PHIN Tools and Resources,” Centers for Disease Control and Prevention, https://www.cdc.gov/phin/php/index.html.
  57. “Origin Story: Creating a Culture of Collaboration,” Public Health Informatics Institute, 2021, https://phii.org/wp-content/uploads/2021/07/iis_history_spotlight-_origin_story.pdf.
  58. “Funding: The Pursuit of Sustainability for IIS,” Public Health Informatics Institute, https://phii.org/wp-content/uploads/2021/07/iis_history_spotlight-_funding.pdf.
  59. Andrew B. Trotter et al., “Preparing for COVID-19 Vaccination: A Call to Action for Clinicians on Immunization Information Systems,” Annals of Internal Medicine 174, no. 5 (2021): 695-97, https://doi.org/10.7326/M20-7725.
  60. Rebecca Coyle, Mary Beth Kurilo, and Miriam Muscoplat, “Discovery Session: IIS 101: An Introduction/Refresher” (presentation, AIRA Discovery Session, Feb. 27, 2023).
  61. “Origin Story: Creating a Culture of Collaboration.” “IISAR Data Participation Rates and Maps,” Centers for Disease Control and Prevention, July 10, 2024, https://www.cdc.gov/iis/annual-report-iisar/rates-maps-table.html?CDC_AAref_Val=https://www.cdc.gov/vaccines/programs/iis/annual-report-iisar/rates-maps-table.html.
  62. Rebecca Coyle et al., “Immunization Information Systems and Health Information Exchanges,” Association of State and Territorial Health Officials, 2023, https://www.astho.org/globalassets/report/immunization-information-systems-and-health-information-exchanges.pdf.
  63. “IIS Frequently Asked Questions,” Centers for Disease Control and Prevention, Aug. 1, 2024, https://www.cdc.gov/iis/resources-refs/faq.html.
  64. “Immunization Information Systems Resources,” Centers for Disease Control and Prevention, May 17, 2024, https://www.cdc.gov/iis/about/index.html?CDC_AAref_Val=https://www.cdc.gov/vaccines/programs/iis/iz-gateway/information-sheet.html.
  65. Jennifer Ratliff et al., “Public Health Impact of Interjurisdictional Immunization Data Exchange” (presentation, 2023 AIRA National Meeting, May 4, 2023), https://repository.immregistries.org/files/resources/6464099a9db64/6b__public_health_impact_of_interjurisdictional_immunization_exchange.pdf.
  66. Ann Maxwell, “Challenges With Vaccination Data Hinder State and Local Immunization Program Efforts to Combat COVID-19,” U.S. Department of Health and Human Services Office of Inspector General, 2023, https://oig.hhs.gov/oei/reports/OEI-05-22-00010.pdf.
  67. Madison Lyman, “COVID-19 Immunization Status Among Massachusetts (MA) Residents After Data Exchange With Rhode Island (RI)” (Bureau of Infectious Disease and Laboratory Sciences, Massachusetts Department of Public Health, May 4, 2023).
  68. Amy E. Metroka et al., “Effects of Health Level 7 Messaging on Data Quality in New York City’s Immunization Information System, 2014.” Kellyn Engstrom et al., “Timeliness of Data Entry in Wisconsin Immunization Registry by Wisconsin Pharmacies,” Journal of the American Pharmacists Association 60, no. 4 (2020): 618-23, https://www.sciencedirect.com/science/article/pii/S1544319119305394.
  69. “State Immunization Information System Laws—Demographic Data Collection,” Centers for Disease Control and Prevention, https://www.cdc.gov/phlp/docs/IIS_Sociodemo.pdf. Lara A. Heersema et al., “Intersection of Policy and Immunization Information Systems (IIS),” BMC Public Health 23, no. 1 (2023): 1828, https://doi.org/10.1186/s12889-023-16457-2. Lynn G. Scharf et al., “Current Challenges and Future Possibilities for Immunization Information Systems,” Academic Pediatrics 21, no. 4s (2021): S57-s64, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8112731/.
  70. “IIS Operational Best Practices (Formerly MIROW),” Centers for Disease Control and Prevention, July 10, 2024, https://www.cdc.gov/iis/activities/mirow.html?CDC_AAref_Val=https://www.cdc.gov/vaccines/programs/iis/activities/mirow.html.
  71. “Data Quality Assurance in Immunization Information Systems,” American Immunization Registry Association, 2022, https://repository.immregistries.org/files/resources/62fbd92c465fe/data_quality_assurance_8_2022.pdf.
  72. Rebecca Coyle, Mary Beth Kurilo, and Miriam Muscoplat, “Discovery Session: IIS 101: An Introduction/Refresher.”
  73. “Media Statement From CDC Director Rochelle P. Walensky, MD, MPH, on Racism and Health,” news release, April 8, 2021, https://archive.cdc.gov/#/details?url=https://www.cdc.gov/media/releases/2021/s0408-racism-health.html. Latoya Hill, Samantha Artiga, and Anthony Damico, “Health Coverage by Race and Ethnicity, 2010-2022,” Kaiser Family Foundation, 2024, https://www.kff.org/racialequity-and-health-policy/issue-brief/health-coverage-by-race-and-ethnicity/. “Disparities in the Impact of Air Pollution,” American Lung Association, Nov. 2, 2023, https://www.lung.org/clean-air/outdoors/who-is-at-risk/disparities. Michigan Civil Rights Commission, “The Flint Water Crisis: Systemic Racism Through the Lens of Flint,” 2017, https://www.michigan.gov/-/media/Project/Websites/mdcr/mcrc/reports/2017/flint-crisis-report-edited.pdf?rev=4601519b3af345cfb9d468ae6ece9141. Ruqaiijah Yearby, “The Impact of Structural Racism in Employment and Wages on Minority Women’s Health,” Human Rights Magazine, https://www.americanbar.org/groups/crsj/publications/human_rights_magazine_home/the-state-of-healthcare-in-the-united-states/minority-womens-health/; “Disparities in Health and Health Care: 5 Key Questions and Answers,” Nambi Ndugga, Drishti Pillai, and Samantha Artiga, KFF, Aug. 14, 2024, https://www.kff.org/racial-equity-and-health-policy/issue-brief/disparities-in-health-and-health-care-5-key-question-and-answers/.
  74. “Health Equity,” Assistant Secretary for Technology and Office of the National Coordinator for Health Information Technology, https://www.healthit.gov/topic/health-equity. “CMS Framework for Health Equity,” Centers for Medicare & Medicaid Services, https://www.cms.gov/priorities/health-equity/minority-health/equity-programs/framework. “Health Equity,” Centers for Disease Control and Prevention, https://www.cdc.gov/health-equity/index.html.
  75. “The 10 Essential Public Health Services,” Public Health Accreditation Board, https://phaboard.org/center-for-innovation/public-healthframeworks/the-10-essential-public-health-services/.
  76. “Transforming Public Health Data Systems,” Robert Wood Johnson Foundation, https://www.rwjf.org/en/insights/our-research/2021/09/transforming-public-health-data-systems.html. Aila Hoss et al., “Disaggregation of Public Health Data by Race & Ethnicity,” The Network for Public Health Law, 2022. Revisions to OMB’s Statistical Policy Directive No. 15: Standards for Maintaining, Collecting, and Presenting Federal Data on Race and Ethnicity, 2024, https://www.federalregister.gov/documents/2024/03/29/2024-06469/revisions-to-ombs-statistical-policy-directive-no-15-standards-for-maintaining-collecting-and.
  77. Brooke Beaulieu, “Addressing Gaps in Public Health Reporting of Race and Ethnicity Data for COVID-19: Findings and Recommendations Among 45 State and Local Health Departments,” Council of State and Territorial Epidemiologists, https://preparedness.cste.org/wp-content/uploads/2022/04/RaceEthnicityData_FINAL.pdf.
  78. “COVID-19 Testing: Understanding the ‘Percent Positive,’” David Dowdy and Gypsyamber D’Souza, Johns Hopkins Bloomberg School of Public Health, Aug. 10, 2020, https://publichealth.jhu.edu/2020/covid-19-testing-understanding-the-percent-positive.
  79. “Healthcare Facilities Live for eCR,” Centers for Disease Control and Prevention, Oct. 15, 2024, https://www.cdc.gov/ecr/php/healthcare-facilities/?CDC_AAref_Val=https://www.cdc.gov/ecr/facilities-map.html.
  80. “Laboratory Data: Blazing New Pathways for Connection,” Centers for Disease Control and Prevention, https://www.cdc.gov/surveillance/data-modernization/snapshot/2022-snapshot/stories/laboratory-data-connection.html.
  81. Kathryn Turner, “Adoption of Electronic Case Reporting for MIS-C Surveillance in Idaho,” Idaho Department of Health and Welfare, Council of State and Territorial Epidemiologists.
  82. “09/19/2020: Lab Advisory: COVID-19 ELR Flat File for Laboratory Data Reporting,” Centers for Disease Control and Prevention, https://www.cdc.gov/locs/2020/covid-19_elr_flat_file_for_lab_data_reporting.html.
  83. “Data Modernization Initiative Planning Toolkit,” Public Health Informatics Institute, https://phii.org/course/dmitoolkit/. “Assessment and Evaluation of Data and Information Technology Systems,” Centers for Disease Control and Prevention, https://www.cdc.gov/csels/dmi-support/guidance-portal/assessment-and-evaluation.html.
  84. Sripriya Rajamani et al., “Electronic Case Reporting (eCR) of COVID-19 to Public Health: Implementation Perspectives from the Minnesota Department of Health,” Journal of the American Medical Informatics Association 29, no. 11 (2022): 1958-66, https://doi.org/10.1093/jamia/ocac133.
  85. Laura J. Bosco, Aaron A. Alford, and Karla Feeser, “Heterogeneity and Interoperability in Local Public Health Information Systems,” Journal of Public Health Management and Practice 27, no. 5 (2021): 529-33, https://journals.lww.com/jphmp/fulltext/2021/09000/heterogeneity_and_interoperability_in_local_public.15.aspx.
  86. “Public Health Data Systems Task Force 2022,” Office of the National Coordinator for Health Information Technology, Aug. 24, 2022, https://www.healthit.gov/hitac/events/public-health-data-systems-task-force-2022. Daniel Weber, “Preliminary Findings From the DMI Assessment,” Center for Surveillance, Epidemiology, and Laboratory Services, 2022, https://www.healthit.gov/sites/default/files/facas/2022-08-24_PDHS_TF_Meeting_Slides_Daniel_Weber.pdf.
  87. Rebecca Masters et al., “Return on Investment of Public Health Interventions: A Systematic Review,” Journal of Epidemiology and Community Health 71, no. 8 (2017): 827-34, https://pubmed.ncbi.nlm.nih.gov/28356325/.