The pandemic revealed the importance of interoperability in healthcare. Learn how the right digital foundation is essential to streamlining communication and handling complex analytics to ensure better outcomes, efficacy, and equity.
Technologists often talk about delivering the right information to the right people in the right context—but rarely does this idea matter more than in healthcare, as the COVID-19 pandemic has made abundantly clear.
Doctors need full medical histories to treat patients, whether the appointment is in a facility or conducted online, and whether that history involves records from different hospitals, urgent care facilities, lab tests, or devices used in the home.
Streamlined information flow is vital well beyond primary care, too. Patients might use a web portal to confirm their vaccination status, for example, or a patient with a cough might interact with a bot to reach the proper clinician. Researchers and public health officials need to collaborate across geographies and analyze data collected and stored by multiple parties. Hospital administrators need to understand not only how many beds are full today, but how many are likely to be full next week or next month.
The examples are numerous, but they all circle back to the same point: so much in healthcare depends on the secure and timely exchange of the right information. Consequently, the meaningful transformation of the industry depends on how healthcare organizations build an underlying digital foundation to support data interoperability.
Though COVID-19 has demonstrated that IT is essential to delivering care and researching diseases, the stakes are not just managing the pandemic but cultivating a foundation for using technology to redefine care for years to come. From making care available to more people in more contexts to leveraging AI for new diagnosis techniques, a modern approach to a digital foundation can unlock many paths to improved outcomes and equity.
In this article, we’ll explore what a transformative digital foundation in healthcare entails and how it sets the stage for sustained advances.
Redefining interoperability in the digital age
Interoperability has been an overarching goal in healthcare for over a decade. Put simply, patient records are scattered across systems that were never meant to communicate. Rather than creating a custom schema each time a data source needs to be integrated, healthcare organizations can use interoperability principles to repeatedly and efficiently connect data sources, all while maintaining patient privacy and honoring patient consent over their records.
The 21st Century Cures Act was designed to address this lack of seamless data exchange. The regulation focuses particularly on interoperability practices, encouraging the use of standards-based application programming interfaces (APIs) for data access and exchange, while putting patients at the center.
At their broadest, these principles involve abstracting the complexity of legacy applications and data formats behind APIs. APIs provide a consistent interface with which developers can access data and functionality in different systems, combine these assets to create new digital services, and augment them with AI for new insights and capabilities. Federal regulations specify standards such as FHIR (Fast Healthcare Interoperability Resources) to define how APIs should be built and interact—so competency in APIs and API management is a core interoperability skill for any organization in this industry.
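To make this concrete, the sketch below shows what standards-based data exchange might look like in practice: a small Python client retrieving a patient record and associated lab results over a FHIR R4 REST API. The server address and patient identifier are hypothetical placeholders, and a real integration would also handle authentication (for example, SMART on FHIR with OAuth 2.0) and error cases.

```python
import requests

# Hypothetical FHIR server and patient identifier -- placeholders only.
FHIR_BASE = "https://fhir.example-hospital.org/r4"
PATIENT_ID = "example-patient-id"
HEADERS = {"Accept": "application/fhir+json"}

# Retrieve the Patient resource: GET [base]/Patient/[id]
patient = requests.get(f"{FHIR_BASE}/Patient/{PATIENT_ID}", headers=HEADERS).json()
print("Patient:", (patient.get("name") or [{}])[0].get("family", "unknown"))

# Search for the patient's lab results: GET [base]/Observation?patient=[id]&category=laboratory
bundle = requests.get(
    f"{FHIR_BASE}/Observation",
    params={"patient": PATIENT_ID, "category": "laboratory"},
    headers=HEADERS,
).json()

# FHIR search results come back as a Bundle; each entry wraps one Observation resource.
for entry in bundle.get("entry", []):
    obs = entry["resource"]
    code = obs.get("code", {}).get("text", "unknown test")
    value = obs.get("valueQuantity", {})
    print(code, value.get("value"), value.get("unit"))
```

Because every conformant server exposes the same resource types and search semantics, roughly the same few lines work whether the data lives in a hospital EHR, a lab system, or a cloud data store—which is precisely the point of standards-based APIs.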
Why does this matter? According to research conducted in 2021 by Google Cloud and the Harris Poll, 96% of physicians agree that easier access to critical information may save lives. Some 95% believe increased data interoperability will improve patient outcomes, and 86% say that improved interoperability will meaningfully cut time to diagnosis.
Beyond interoperability to the “digital front door”: how IT architecture makes care available when and where it is needed
Stepping back, interoperability rules are more than a regulation: they are mechanisms for extending care beyond clinical settings—for creating a “digital front door” to health and wellness services that, rather than being defined solely by a doctor’s office, can be available when and where a patient needs it. As such, they constitute a blueprint for empowering patients with access to their own health records through apps they choose or consent to.
Wearable devices used for continuous monitoring of patient conditions; apps that connect patients directly to doctors or resources; collaboration tools used by doctors in the field to confer with colleagues in hospitals—virtually all use cases that are poised to transform healthcare involve not only access to data but also the ability to freely, securely, and repeatedly leverage it (all with patient consent) for different applications. This is one reason API-first architectures are so important: developer ecosystems are powered by the APIs underneath, which let developers modularly combine data and functionality for new use cases.
Digital front door initiatives that are tasked with providing seamless access to care across the continuum depend heavily on these types of architectures. Suppose a patient needs to find and schedule an appointment with the right clinician, so they visit a self-service web portal to report symptoms. The patient may give some information to a bot. The bot might call APIs so it can combine this information with the patient’s previous health records. The bot might leverage machine learning (ML) to be more engaging and human-like in its interaction with the patient, and to intelligently generate suggestions for both the doctor and patient, based on insights from combined data sources. This complex intersection of data, systems, and services is only possible via a loosely coupled, cloud-native architecture.
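As an illustration, the following sketch (with a hypothetical endpoint and invented routing logic, assuming a FHIR R4 backend) shows roughly how a triage bot might combine patient-reported symptoms with active conditions fetched over an API before directing the patient to the right queue. A production system would add authentication, consent checks, and most likely an ML triage model in place of the hand-written rule.

```python
import requests

FHIR_BASE = "https://fhir.example-hospital.org/r4"  # hypothetical endpoint
HEADERS = {"Accept": "application/fhir+json"}

def route_patient(patient_id: str, reported_symptoms: list[str]) -> str:
    """Combine bot-collected symptoms with the patient's active conditions,
    pulled via a FHIR search, and pick a care queue (illustrative logic only)."""
    bundle = requests.get(
        f"{FHIR_BASE}/Condition",
        params={"patient": patient_id, "clinical-status": "active"},
        headers=HEADERS,
    ).json()
    conditions = [
        entry["resource"].get("code", {}).get("text", "")
        for entry in bundle.get("entry", [])
    ]

    # Placeholder rule; a real deployment might call an ML triage service here.
    if "cough" in reported_symptoms and any("asthma" in c.lower() for c in conditions):
        return "respiratory-clinic"
    return "general-practice"

print(route_patient("example-patient-id", ["cough", "fatigue"]))
```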
Related: Learn how healthcare and life science leaders are using the Google Cloud Healthcare Data Engine to make smart real-time decisions through groundbreaking, scientific insights.
The right IT architecture and the vast, complex electronic health datasets it connects have the potential to provide groundbreaking health insights. For example, researchers may want to build an app that surveys data from many sources and applies ML to discern patterns, whether that involves mining unstructured text in medical studies or comparing images of a patient’s eye or skin to known visual indicators of disease.
Researchers may also want to leverage AI tools in the cloud, data from electronic health records (EHR) documented in hospitals, and data from wearable devices used at home—and they may want all of it to scale to dashboards available to all patients, all while maintaining patient privacy and security. Such use cases are key to making access to care more equitable and helping patients interact more meaningfully with providers—and they all stem from a cloud-native architecture built for interoperability.
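As a simplified illustration of that kind of combination, the sketch below joins lab values exported from an EHR with readings from a home wearable on a shared patient identifier. The column names and values are invented for illustration, and a real pipeline would de-identify this data or tightly control access to it.

```python
import pandas as pd

# Illustrative, invented data: lab values from an EHR export and step counts
# from a consumer wearable, keyed by a shared (pseudonymous) patient ID.
ehr = pd.DataFrame({"patient_id": ["p1", "p2", "p3"], "hba1c_pct": [5.4, 6.1, 7.8]})
wearable = pd.DataFrame({"patient_id": ["p1", "p2", "p3"], "avg_daily_steps": [9800, 8200, 3100]})

# Join the clinical and at-home signals so downstream ML models or dashboards
# can analyze them side by side.
combined = ehr.merge(wearable, on="patient_id")
print(combined)
```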
“By moving to the cloud, we’re able to build tools more easily, at scale, in a way that takes advantage of technological advancements in security and privacy to remain at the forefront in data protection,” said Jim Buntrock, vice chair of information technology at Mayo Clinic.
“There are so many applications of this,” he continued. “For example, building a ‘heads up display’ for the ICU—where moments matter—to help care teams direct their attention when and where it’s needed most. From creating better ways to care for patients remotely even after they leave the hospital to making it easier for patients to interact with us via mobile app, we’re working alongside Google Cloud to build a platform for healthcare transformation.”
Related: Visit your hub for everything that’s cutting edge in healthcare & life sciences at Google Cloud.
A checklist for transformative digital architecture in healthcare
Based on the preceding, we can identify several core ingredients that should be part of future-looking IT architectures in healthcare:
1. API-first: An API-first strategy treats the API not as middleware but as a software product that empowers developers, enables partnerships, and accelerates innovation—a big shift from integration-first operations in which APIs are typically created and then forgotten. In legacy architectures, data and functionality from one application could not be easily re-used or connected to another application, and updates to one part of an application risked breaking data or functionality somewhere else. Decoupled systems have supplanted these legacy approaches, enabling digital services to be modularly assembled from digital assets across a variety of sources—some data from one place, some data from another place, some functionality from yet another place, and so on. APIs enable these kinds of distributed architectures because they abstract backend complexity from frontend development. This decoupling is how interoperability occurs—and it’s how data from one organization might be combined with data, platforms, or AI services from another. This kind of granularity is core to a “cloud-native” architecture, and to healthcare organizations applying cutting-edge cloud services to different kinds of data. If an organization embraces cloud-native architectural principles, it can not only connect a legacy, on-premises system to a cloud-hosted service but also more impactfully move from on-premises systems to the cloud in order to take advantage of better agility, reliability, scalability, and security.
2. Interoperability stack: Interoperability starts with APIs and FHIR compliance, but excellence in this area goes much deeper. The ability to harmonize structured and unstructured data into standard formats such as FHIR, for example, is a significant step beyond simply creating APIs (a minimal harmonization sketch follows this checklist). Likewise, having APIs means being able to connect data to analytics and AI tools—and robust interoperability means these connections are reliable and scalable, rather than bespoke, with the ability to consistently apply analysis and visualization tools across datasets.
3. Protection of patient data: Interoperability has to balance keeping private patient data private with allowing that data to be shared across healthcare professionals and institutions. Hospitals often use an extract-based approach that scatters many copies of the data outside the source database—a potential security risk. Instead, developers and healthcare professionals should be able to analyze data where it resides, all while observing Zero Trust principles and HIPAA compliance.
4. Health equity by design: We can’t improve what we can’t measure. IT teams must take a proactive approach to building systems that address disparities in how we collect data, measure and analyze the findings, and tailor interventions. For example, the clinical data that exists within the EHR alone is not sufficient to determine health equity needs. Building a data ecosystem that offers a deeper perspective—incorporating social needs data, public health datasets, self-reported experiences and outcomes data, and race, ethnicity, and language (REaL) data—is a substantial effort. Data and analytics are key pillars in identifying disparities for marginalized, minority, and underserved communities.
5. Minimizing latency: Hospitals frequently rely on faxes and weekly reports for analytics, but waiting a week for data to aggregate is often far too long. When making life-and-death decisions, access to real-time information is critical. As soon as information is recorded in a system, it should be transformed into a consumable format such as FHIR so that it can be combined with other sources, analyzed with ML tools, and so on. Updates should propagate to users in seconds, minutes, or hours—not weeks. And the number of sources will only grow as more data originates outside clinical settings, such as from in-home monitoring.
6. Building for scale: The scale and scope of hospital data are rapidly increasing: hospitals are tracking more clinical events; clinicians are incorporating data not only from the lab but also from devices used in homes; individual hospitals are consolidating into larger systems; and more patients are using technology to mediate their interactions with clinicians. Accommodating all of these requirements at scale demands not only cloud-native approaches but also elastic compute and storage services, cloud-hosted ML services, and more.
7. Continuous availability: When an organization relies on staffed, on-premises data centers, it necessarily incurs significant expense and ties up IT resources to keep everything running. Managed cloud infrastructure and services, in contrast, let organizations meet uptime requirements while allocating their in-house IT talent more flexibly.
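To ground the harmonization point from item 2, here is a minimal sketch of mapping a flat legacy lab record—say, a row parsed from a CSV export—into a FHIR R4 Observation resource. The source field names are invented; the target structure follows the FHIR specification, with a LOINC code identifying the test and a UCUM code identifying the unit.

```python
import json

# Invented legacy field names; the structure below follows the FHIR R4 Observation resource.
legacy_row = {
    "mrn": "12345",
    "test_name": "Hemoglobin A1c",
    "loinc": "4548-4",
    "result": 6.1,
    "unit": "%",
    "collected": "2021-06-01T08:30:00Z",
}

fhir_observation = {
    "resourceType": "Observation",
    "status": "final",
    "category": [{
        "coding": [{
            "system": "http://terminology.hl7.org/CodeSystem/observation-category",
            "code": "laboratory",
        }]
    }],
    # LOINC coding identifies which test was performed.
    "code": {
        "coding": [{"system": "http://loinc.org", "code": legacy_row["loinc"]}],
        "text": legacy_row["test_name"],
    },
    # In practice, the MRN would be resolved to a server-assigned Patient resource ID.
    "subject": {"reference": f"Patient/{legacy_row['mrn']}"},
    "effectiveDateTime": legacy_row["collected"],
    "valueQuantity": {
        "value": legacy_row["result"],
        "unit": legacy_row["unit"],
        "system": "http://unitsofmeasure.org",
        "code": legacy_row["unit"],
    },
}

print(json.dumps(fhir_observation, indent=2))
```

Once records from disparate sources are expressed in this common shape, the same analytics, ML, and visualization tooling can be applied across all of them—which is what makes harmonization more than a box-checking exercise.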
Google Cloud is working with healthcare providers to create a more interoperable, intelligent, and collaborative future in healthcare. Learn more about our Google Cloud Healthcare Data Engine.