Patient medical data in the form of Electronic Health Records (EHR) is undergoing its own digital shift, as Big Tech partners up to bring about more advanced levels of connected care.
While players like Amazon and Apple move into everything from clinics and telehealth to wearables, Google Health is expanding its partnership with EHR platform MEDITECH to pilot Google natural language processing (NLP) technology to better understand patient data.
It’s the latest development in the reinvention of the EHR for a rapidly changing healthcare system.
Announcing the pact Tuesday (Mar. 15), Paul Muret, vice president and general manager of Care Studio at Google Health, said in a statement, “Google Health and MEDITECH have a shared goal of supporting care teams with a complete view of the patient record. By combining our complementary areas of expertise, we can help health systems overcome challenges associated with data silos and enable care teams with the tools they need to provide the best possible care and outcomes for patients.”
MEDITECH Executive Vice President and COO Helen Waters said, “By augmenting the power of Expanse with Google’s search and summarization abilities, we’re advancing interoperable healthcare data exchange, building an EHR platform for the future, and continuing our mission to propel data liquidity and support the future of digital health ecosystems.”
See also: Oracle and Cerner Deal Another Sign of Connected Healthcare via Deep Data Analytics
On Tuesday, news site Fierce Healthcare cited a “to-be-published blog post” from Google saying “Using Google Health’s tools, Meditech will form a longitudinal health data layer, bringing together data from different sources into a standard format and offering clinicians a full view of patient records.”
By embedding Google Health search functionality into EHR, the blog reportedly said “clinicians can find salient information faster for a more frictionless experience and the intelligent summarization can highlight critical information directly in the Expanse workflow.”
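Neither company has published implementation details, but the idea behind a “longitudinal health data layer” — pulling records from different source systems into one standard format and ordering them into a single patient timeline — can be sketched in miniature. The field names, unit conversion, and schema below are illustrative assumptions, not MEDITECH’s or Google’s actual data model:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical normalized record; real systems would use a standard such as FHIR.
@dataclass
class NormalizedObservation:
    patient_id: str
    code: str        # standardized observation code
    value: float
    unit: str
    recorded: date

def from_lab_system(row: dict) -> NormalizedObservation:
    # One (made-up) source stores dates as ISO strings and glucose in mg/dL.
    return NormalizedObservation(
        patient_id=row["mrn"],
        code="glucose",
        value=float(row["result"]),
        unit="mg/dL",
        recorded=date.fromisoformat(row["collected"]),
    )

def from_clinic_system(row: dict) -> NormalizedObservation:
    # Another (made-up) source reports mmol/L; convert so the shared layer has one unit.
    return NormalizedObservation(
        patient_id=row["patient"],
        code="glucose",
        value=round(float(row["val"]) * 18.0, 1),  # mmol/L -> mg/dL
        unit="mg/dL",
        recorded=date.fromisoformat(row["date"]),
    )

# Merge both feeds into one chronological, patient-level view.
records = [
    from_lab_system({"mrn": "p1", "result": "110", "collected": "2022-03-01"}),
    from_clinic_system({"patient": "p1", "val": "5.5", "date": "2022-02-15"}),
]
timeline = sorted(records, key=lambda r: r.recorded)
```

The point of the sketch is only that once every source is mapped into the same schema and units, a clinician-facing view can be assembled by a simple sort over the merged records.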
Connecting Healthcare Ecosystems
It’s consistent with EHR trends of the past year, and with the broader industry focus on connected care.
In an Axios interview in January, Google Chief Health Officer Karen DeSalvo said, “When we think about how we’re going to support the health ecosystem, we’ve got three big buckets. Is it going to support consumers in their health journey? Is it going to support caregivers who are providing the services on the front lines…or is it going to support community context? Those are our three C’s: consumer, caregiver and community context.”
See also: Zoom, Cerner Beta-Testing Integration of Zoom With Electronic Health Records
Google also used last week’s ViVE healthcare conference in Miami Beach to preview its new Conditions capability.
In a March 8 Google blog post, Muret wrote, “Healthcare data is structured in numerous ways, making it difficult to organize. Clinical notes may be written differently and stored across different systems. Clinician notes also differ based on if content is meant for clinical decision making, billing or regulatory uses.”
“Further, when it comes to writing notes, clinicians use different abbreviations or acronyms depending on their personal preference, what health system they’re a part of, their region and other factors. All of this has made it difficult to synthesize clinical data — until now.”
Adding Conditions to Google Health enables algorithms to pull meaningful detail from physician and clinician notes that are often written in partial sentences or use non-standardized terms to describe diagnostic findings, even ferreting out misspellings that can cause problems.
Muret said, “We use Google’s advances in AI in an area called natural language processing (NLP) to understand the actual context in which a condition is mentioned and map these concepts to a vocabulary of tens of thousands of medical conditions.”
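Muret doesn’t describe the model internals, but the basic task — mapping free-text mentions, abbreviations and misspellings included, onto a canonical vocabulary — can be illustrated with a toy fuzzy matcher. The vocabulary, abbreviation table, and similarity cutoff here are made-up examples, and real clinical NLP also weighs the surrounding context rather than matching terms in isolation:

```python
from difflib import get_close_matches

# Toy vocabulary standing in for a real one with tens of thousands of conditions.
VOCABULARY = ["hypertension", "hyperlipidemia", "type 2 diabetes", "asthma"]

# Hypothetical abbreviation table; clinicians' shorthand varies by region and system.
ABBREVIATIONS = {"htn": "hypertension", "t2dm": "type 2 diabetes", "hld": "hyperlipidemia"}

def map_condition(mention: str):
    """Map a free-text mention (abbreviation or misspelling) to a canonical term."""
    term = mention.strip().lower()
    if term in ABBREVIATIONS:
        return ABBREVIATIONS[term]
    # Tolerate misspellings such as "hypertenssion".
    matches = get_close_matches(term, VOCABULARY, n=1, cutoff=0.8)
    return matches[0] if matches else None

print(map_condition("HTN"))            # hypertension
print(map_condition("hypertenssion"))  # hypertension (misspelling)
print(map_condition("asthma"))         # asthma
```

A mention that falls below the similarity cutoff simply goes unmapped, which stands in for the harder cases a production system would hand to a trained model.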
See also: Teladoc Health Launches Amazon Collaboration