Accurate, readily accessible data is essential for public health. During the COVID-19 pandemic, we recognized the truth of this statement more than ever as we gathered and shared information between healthcare providers, researchers, registries, and beyond. We also realized all too well the limitations of our IT systems. Too many health professionals struggled to access the data they needed in a timely fashion and analyze it for life-saving insights.
This is a problem we need to fix, both to prepare for the next public health emergency and to strengthen our response to other acute and chronic conditions. The solution will be complex. Healthcare inherently deals with multiple types and sources of information—research findings, lab results, anonymized data from hospital EHRs, the list goes on—and inclusive public health decision-making demands even more data.
Take COVID-19 testing and vaccines for example. Which communities are most vulnerable and therefore most in need of resources? What financial, logistical, technological, and social barriers hinder access? With this equity piece added in, the roster of systems and data types one must work with expands exponentially, as do the complexities of data collection and analysis. Infrastructure now must be able to accommodate data in areas like employment and housing and from organizations like retail pharmacies and social services agencies.
Fortunately, we have a path forward. This path rethinks how we collect, manage, and share information. It’s collaborative, innovative, and offers a model for all of health IT.
Moving from aggregation to interoperability
What would a modern public health system look like? I worked on a task force for Executives for Health Innovation (EHI) to define it in a recent report. A holistic solution would enable:
· Rapid, automated submissions for lab results, immunizations, and other vital public health information
· A timely, automated bidirectional flow of data between clinical and public health entities
· Machine learning and artificial intelligence tools to support public health surveillance and surface insights at a population level
These capabilities require aggregated data and interoperable systems, which in turn require significant effort by both the public and private sectors. At the March 2022 HIMSS conference, National Coordinator for Health IT Micky Tripathi described our current health IT infrastructure as a “loosely cobbled constellation of systems fragmented in a number of ways” across 3,000 local health departments, 59 state and territorial health departments, numerous tribal health departments, and a growing number of other organizations.
The healthcare sector has been making progress on the consolidation front. Yet the data lakes organizations often use to aggregate data pose challenges as well. As volumes of structured and unstructured data increase, the processes for getting this information into and out of data lakes, and for analyzing it in a timely, meaningful fashion, bog down.
“Healthcare organizations need to overcome the limitations of legacy systems, and they need to make sense of a lot of very complex data,” Jeff Needum of MongoDB explained in a recent HL7® International blog. “A lift-and-shift approach migrating data into a data lake will not solve these problems.”
One solution gaining adoption across the public and private sectors is the data fabric, such as the operational data layer (ODL) powering many of the apps on your phone. An ODL enables otherwise incompatible systems to seamlessly share, secure, and synchronize data across organizational boundaries.
“An ODL lets you build new features without existing system limitations,” Needum wrote. “It lets you summarize, analyze and respond to data events, in real-time and helps you migrate from legacy systems, without incurring the cost and complexity of replacing legacy systems.”
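To make the ODL idea concrete, here is a minimal, hypothetical sketch: a layer that ingests records from legacy systems with incompatible formats, normalizes them into one common schema via per-source adapters, and serves unified queries. The system names, field names, and mappings are illustrative assumptions, not any vendor's actual API.

```python
from typing import Callable


class OperationalDataLayer:
    """Toy operational data layer: adapters normalize each legacy
    source's native format into one shared schema for querying."""

    def __init__(self) -> None:
        self._records: list[dict] = []
        self._adapters: dict[str, Callable[[dict], dict]] = {}

    def register_adapter(self, source: str, adapter: Callable[[dict], dict]) -> None:
        # Each legacy system supplies a function mapping its raw
        # records into the ODL's common schema.
        self._adapters[source] = adapter

    def ingest(self, source: str, raw: dict) -> None:
        record = self._adapters[source](raw)
        record["source"] = source  # retain the record's origin
        self._records.append(record)

    def query(self, **filters) -> list[dict]:
        # One query spans every synchronized source at once.
        return [r for r in self._records
                if all(r.get(k) == v for k, v in filters.items())]


odl = OperationalDataLayer()
# A lab system reports results under its own field names...
odl.register_adapter(
    "lab", lambda r: {"patient": r["pid"], "event": "lab_result", "value": r["res"]})
# ...while an immunization registry uses different ones.
odl.register_adapter(
    "registry", lambda r: {"patient": r["name"], "event": "immunization", "value": r["vaccine"]})

odl.ingest("lab", {"pid": "p1", "res": "positive"})
odl.ingest("registry", {"name": "p1", "vaccine": "COVID-19"})
print(len(odl.query(patient="p1")))  # prints 2: one unified view of both sources
```

The point of the sketch is that neither legacy system had to change: the adapter layer absorbs the incompatibility, which is the core appeal of the data-fabric approach over lift-and-shift migration.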
Achieving a shared understanding
As proposals and budget requests progress for funding health IT modernization, it’s essential that organizations move forward in the same direction.
The EHI Task Force recommends a baseline of proven, industry-developed standards and technologies, and I support this approach. Standardization would not only guide organizations in the development of their data lakes and layers, but it would also strengthen data collection, the bedrock of analysis and insight.
Right now, information used in public health research and analysis is collected under different standards and systems, with variance across states, regions, and organizations. The resulting mélange is a problem when using data from one system and purpose, like anonymized electronic health record (EHR) information from patient visits, for another, like disease surveillance.
Shared standards can help public health teams establish data provenance, revealing how each piece of information was originally captured, for what purpose, and any characteristics that would limit its applicability to a new application. By increasing the utility of raw information, such a standardized approach to data collection and management can lead to a more holistic understanding of public health.
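As a hypothetical illustration of what standardized provenance makes possible, consider a record tagged with where it came from, why it was collected, and how it was captured; a second use, like disease surveillance, can then check applicability programmatically. The field names and the reuse policy below are assumptions for the sketch, not any published standard.

```python
# A clinical record carrying provenance metadata (illustrative schema).
ehr_record = {
    "condition": "influenza",
    "provenance": {
        "source": "hospital_ehr",       # originating system
        "purpose": "patient_care",      # original purpose of collection
        "capture": "clinician_coded",   # how the value was recorded
        "anonymized": True,             # de-identified before sharing
    },
}


def usable_for_surveillance(record: dict) -> bool:
    """Under this illustrative policy, only anonymized,
    clinician-coded data qualifies for reuse in surveillance."""
    p = record.get("provenance", {})
    return p.get("anonymized", False) and p.get("capture") == "clinician_coded"


print(usable_for_surveillance(ehr_record))  # prints True
```

Without shared standards, each source would encode these facts differently (or not at all), and the applicability check above could not be automated across systems.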
Equipped with these modernized health IT systems, a health commissioner or municipal decision-maker can better see where social determinants of health put populations at risk, which community partners to tap for additional education or support, and how to reallocate and reprioritize services and resources for greater access, inclusivity, and equity.
Any standards the health sector develops should support both security and bidirectional flows of data from a variety of federal, state, and local sources. To encourage adoption, the Task Force recommends a voluntary certification program, similar to what’s in place for EHRs.
“Technology has the power to make our healthcare systems more innovative and inclusive than ever before,” Booz Allen COO and EHI Board member Kristine Martin Anderson said in the launch of EHI’s Digital Health Equity Pledge. Modern health IT infrastructure, collaboratively developed, will help us achieve this vision, enabling us to better leverage our vast stores of data to get ahead of the next threat and achieve stronger, more equitable health outcomes.