Giving nurses (and even novices) the ability to perform a cardiac ultrasound with the precision that only artificial intelligence (AI) can provide is opening a world of new use cases that can save lives, transform patient experience, and create greater connected healthcare potential.
With heart disease the leading cause of death in the U.S., Caption Health CEO Steve Cashman said he wants to know “Why can’t we make 4 million nurses have the ability to use ultrasound … in every care setting and session to understand more about the body, and particularly the heart?”
The immediate opportunity is obviously to save lives. But AI-guided ultrasounds also make doctors more relevant to patients at a time when practices need that relevance. Having nurses perform more cardiac ultrasounds in offices (and other settings) builds a revenue stream for physicians.
It also affords patients a lifesaving service while creating a connected healthcare experience that produces better outcomes for people and promising new prospects for the entire sector.
“They always say a picture’s worth a thousand words, and when it comes to healthcare, it truly is,” Cashman said. “Caption is an AI software company that is empowering every clinician with the ability to do a limited echocardiogram and will go well beyond that in the future. So, the one thing we want [to know] is why don’t you have a picture of your heart right now?”
He called getting that ultrasound image “the magic moment when we know what’s going on,” adding the caveat that there are only about 74,000 trained sonographers at present who are “very, very busy, and unfortunately [echocardiograms are typically done] after you’ve had chest pains and you’re in an ambulance and headed into a hospital.”
A shortage of trained sonographers, the drop in doctor visits since the pandemic began, and a large aging population combine to make conditions ideal for such new connected solutions.
Of course, it’s only relevant if results can be trusted. Independent tests indicate they can.
Cashman said tests show that a nurse using AI-guided tech “versus a sonographer” gets images as good or better “90% of the time,” adding that “the opportunity to get a limited echo from somebody in … minutes by a novice really could have a huge impact on healthcare.”
See also: AI Moves Into Homes, Hospitals as Connected Home Care Grows
Seeing a Connected Future in AI Imaging
By creating software that powers connected devices and, in a sense, turns novices into sonographers, Caption AI is a solution with applications that extend to every corner of connected healthcare.
“Think of what happened to analog cameras, to digital cameras, and how that just reinvented imagery [and] media,” Cashman said. “Everything that we do is now kind of connected via image on [social media]. You’re going to see ultrasound really become prolific throughout healthcare.”
Cashman said he has, surprisingly, found cardiologists the easiest to convince once they see the AI at work.
“It’s really about putting this into the pathway of care, whether it’s in an ambulance, whether it’s in an ED, whether it’s in a GP setting,” he said.
One way of spreading the word — and kickstarting more innovation — is via partners.
In August, Caption Health announced in a press release its pact with Butterfly Network, maker of the Butterfly iQ+ handheld whole-body ultrasound system. By integrating Caption’s AI-guided ultrasound software, the partners said, they believe they have a scalable new solution with wider applications.
Caption Health said Tuesday (Nov. 9) in an announcement that it’s also partnering with U.K.-based Ultromics, maker of the EchoGo deep ultrasound analytics system. Per a statement, linking the platforms takes “advanced diagnostic capabilities that had been limited to experts in specialty care settings and [expands] their access to more doctors and patients in more places.”
Caption Health has also received a $4.9 million grant from The Gates Foundation “to develop an AI-based software product to help guide the diagnosis of lung disease in children without the need for highly trained physicians in resource-limited settings.”
Read also: Best Buy to Acquire Current Health to Advance Tech-Focused Patient Care
Following the Telehealth Playbook
As a telehealth expert, Cashman said he believes AI-guided cardiac ultrasounds may mimic the trajectory of telehealth itself. A small number of patients will want it and pay for it, whether insurance reimburses it or not. That will chip away at cost (not using trained sonographers will go a long way there) until insurance plans finally cave and start covering it like any other scan.
“The insurance landscape is complex,” Cashman said. “We’re in some pilots with some value-based care providers where they’re taking on risk,” and it’s so far so good on that front as providers see costs drop from reduced use of ambulances, ERs and pricey interventions.
“I think back a decade ago, and you could go online and pay for a telehealth visit if you wanted to long before your payer or employer or anybody else wanted to,” he said. “We’re certainly going to move in that direction with the technology and be able to offer [it] as a service to the world.”
In other words, better outcomes and reduced costs will ultimately win insurers over to the concept.
Cashman added that “the payment models are actually pretty mature because we’re just doing a test in a new way. It still has the same value, per se, as a full echo done in an echo lab. In general, most people see this as the future of healthcare.”
For that reason, he added: “I don’t like to think of Caption as an AI software company; I like to think of us as an early disease detection company.”
Putting it in perspective, he said that for many at-risk individuals, “it’s one of those things [where] if you haven’t seen your heart, do you want to wait until you have to see your heart?”
See also: Sprinter Health Aims to Solve Healthcare’s Last Mile With on-Demand Lab Tests