By Michael Leedom
The modest stethoscope has joined the Artificial Intelligence (AI) revolution, tapping into the power of machine learning to help health-care providers screen for diseases of the heart and lung.
This year, NCH Healthcare in Naples, Fla., became the first health-care system in the U.S. to incorporate AI into its primary care clinics to screen for heart disease. The health technology company Eko Health supplied primary care physicians with digital stethoscopes linked to a deep-learning algorithm. Following a 90-day pilot program involving more than 1,000 patients with no known heart problems, the physicians found that 136 of them had murmurs suggestive of structural heart disease.
"Leveraging this technology to uncover heart valve disease that might otherwise have gone undetected is exciting," says Bryan Murphey, President of the NCH Medical Group, which signed an annual agreement in January with Eko to use stethoscopes with the AI platform. "The numbers made sense to help our patients in a non-invasive way in the primary care setting," says Murphey.
Eko's AI tool, the SENSORA Cardiac Disease Detection Platform, enables stethoscopes to identify atrial fibrillation and heart murmurs. The platform added another algorithm, cleared by the U.S. Food and Drug Administration (FDA) in April, for the detection of heart failure using the Eko stethoscope's built-in electrocardiogram (ECG) feature.
AI-enhanced stethoscopes showed more than a twofold improvement over humans in identifying audible valvular heart disease, according to a study published in Circulation in November 2023. The AI showed a 94.1 per cent sensitivity for the detection of valve disease, outperforming the primary care physicians' 41.2 per cent. The findings were confirmed with an echocardiogram of each patient.
Stethoscopes join the growing number of AI health-care applications that promise increased efficiency and improved diagnostic performance with machine learning. In recent years, the FDA has cleared hundreds of AI algorithms for use in medical practice. But as the health-care field employs AI for more services, skeptics point to risks posed by over-reliance on this black box, including the potential biases built into AI datasets and the gradual loss of clinician skills.
Since its adoption more than 200 years ago, the stethoscope has served as both a routine exam tool and a visible reminder of the doctor's training. It is recognizable worldwide and, for most clinicians, has remained an analog instrument. The first electronic stethoscopes were created more than 20 years ago and feature enhancements to amplify sound and allow for digital recording.
Analog and digital stethoscopes both rely on the ability of the health-care provider to hear and interpret the sounds, which may be the first indication that a patient has a new disease. However, this is not a skill every health-care practitioner masters. The faint, low-pitched whooshing of an incompetent heart valve or the subtle crackling of interstitial lung disease may go unnoticed even by the ears of experienced physicians.
Enter AI, which can mimic the human brain using neural networks consisting of algorithms that, in the case of stethoscopes, are trained with thousands of heart or lung recordings. Instead of relying on explicit program instructions, an AI system uses machine learning to train itself through advanced pattern recognition.
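As a rough illustration of that training process, the sketch below builds a tiny neural network in Python with PyTorch that learns to label short heart-sound clips as "murmur" or "normal." It is a hypothetical toy, not Eko's model: the clips and labels are random placeholders, and the network shape is invented purely to show how weights are fitted from labeled examples rather than from hand-written rules.

```python
# Minimal sketch (not Eko's actual model): a tiny 1-D convolutional network
# trained to label short heart-sound clips as "murmur" vs "normal".
# All data here is synthetic noise standing in for real recordings.
import torch
import torch.nn as nn

# Pretend each clip is 2 seconds of audio at 2 kHz -> 4,000 samples.
clips = torch.randn(64, 1, 4000)        # batch of fake recordings
labels = torch.randint(0, 2, (64,))     # fake murmur / normal labels

model = nn.Sequential(
    nn.Conv1d(1, 8, kernel_size=64, stride=8),  # learn local acoustic features
    nn.ReLU(),
    nn.AdaptiveAvgPool1d(16),                   # summarize features over time
    nn.Flatten(),
    nn.Linear(8 * 16, 2),                       # murmur vs normal logits
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# The "trains itself" step: nudge the weights to match the labeled examples,
# rather than following explicit rules for what a murmur sounds like.
for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(clips), labels)
    loss.backward()
    optimizer.step()
```

A real system would be trained on many thousands of clinician-labeled recordings and validated against echocardiography before ever reaching an exam room.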
The effectiveness of artificial neural networks to diagnose cardiovascular disease has been demonstrated in controlled clinical trials.
AI improved the diagnosis of heart failure by analyzing ECGs performed on more than 20,000 adult patients in a randomized controlled trial published in Nature Medicine. The intervention group was more likely to be sent for a confirmatory echocardiogram, resulting in 148 new diagnoses of left ventricular systolic dysfunction.
A neural network algorithm correctly identified 355 more patients who went on to develop cardiovascular disease than traditional clinical prediction based on American College of Cardiology guidelines, according to a cohort study of nearly 25,000 incidents of cardiovascular disease.
"These machines are very good at finding patterns that are even beyond human perception. But there's both the power as well as the risk," says Paul Yi, Director of the University of Maryland Medical Intelligent Imaging Center.
The risks include limitations in performance if AI models are not properly trained. The accuracy of the AI algorithm depends on the collection of sufficient data that is representative of the population at large.
"The generalizability is a big issue," says Gaurav Choudhary, Director of Cardiovascular Research at Brown University. "These AI models require a large amount of data, and these data are not easy to come by." Choudhary notes that once an algorithm is approved by the FDA, it cannot be simply revised as new recordings become available. Changes to a particular AI algorithm require a new submission to the FDA before use.
In January 2024, the World Health Organization published new guidelines for health-care policies and practices for AI applications. Its authors warned of several risks inherent in the use of AI tools, including bias in datasets, the lack of transparency in the algorithms employed and the erosion of medical provider skills.
AI algorithms that interpret heart and lung recordings may not have been trained on the full spectrum of possible sounds if the data does not include a wide range of patients and ambient noises.
"This technology has to be validated across a variety of murmurs in a variety of clinical environments and situations," says Andrew Choi, Professor of Medicine and Radiology at George Washington University. "Many of our patients are not the ideal patients," he adds, noting that initial validation typically involves patients with clear heart sounds. In real-world practice, there will be older patients, obese patients and noisy emergency departments that may compromise the precision of the AI model.
Another complication is the inscrutable nature of the algorithm. Without a clear understanding of how these systems make decisions, it may be difficult for health-care providers to discuss a management plan with patients, particularly if the AI output appears incompatible with other clinical information during the examination.
"Explainability is sort of a holy grail," says Paul Friedman, Chair of the Department of Cardiovascular Medicine at Mayo Clinic and one of the developers of the AI tech that Eko Health uses. Over time, he says, more studies may elucidate how these systems process information. AI uncertainty is similar to our incomplete understanding of how certain medications actually work, he suggests. Both are used because they are consistently effective.
"I'm not dismissive of the importance of trying to crack the black box, but I think that's a subject for research," he says.
The introduction of AI in the exam room could both enhance diagnostic performance and disrupt the relationship between health-care provider and patient. The provider may become complacent and gradually dependent on AI for answers to clinical questions, while the patient may feel that the care is becoming depersonalized and lose confidence in the doctor.
The subconscious transfer of decision-making to an automated system is called automation bias, one of many cognitive biases the health-care provider must confront. There are many reasons providers may forgo medical training and uncritically accept the heuristics of AI, including inexperience, complex workloads and time constraints, according to a systematic review of the phenomenon.
It is still unclear how AI will ultimately influence the physician-patient interaction, says Yi. "I think that's kind of the last mile of AI in medicine. It's this human-computer interaction piece where we know that this AI works well in the lab, but how does it work when it interacts with humans? Does it make them second guess what they're doing? Or does it give them false confidence?"
The number of AI-enhanced devices submitted to the FDA has soared since 2015, with almost 700 AI medical algorithms cleared for market. Most applications are for radiology. AI is already being integrated into academic medical centres across North America for a variety of tasks, including diagnosing disease, projecting length of hospitalization, monitoring wearable devices and performing robotic surgery.
At Unity Health in Toronto, more than 50 AI-based innovations have been developed to improve patient care since 2017. One of these is a tool used at St. Michael's Hospital since 2020 called CHARTWatch, which sifts electronic health records, including recent test results and vital signs, to predict which patients are at risk of clinical deterioration. The algorithm proved to be lifesaving during the COVID pandemic, leading to a 26 per cent drop in unanticipated mortality.
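For readers curious what such an early-warning tool looks like in code, here is a deliberately simplified sketch in Python with scikit-learn. It is not CHARTWatch: the features (heart rate, respiratory rate, blood pressure, lactate), the synthetic data and the 50 per cent flagging threshold are all invented for illustration.

```python
# Illustrative only: a toy early-warning model in the spirit of tools like
# CHARTWatch, NOT the actual St. Michael's algorithm. Feature choices,
# synthetic data and the alert threshold are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Fake "recent vitals and labs" for 500 past admissions:
# columns = [heart rate, respiratory rate, systolic BP, lactate]
X = rng.normal(loc=[85, 18, 120, 1.5], scale=[15, 4, 20, 0.8], size=(500, 4))
# Fake outcome: 1 = patient deteriorated, 0 = did not
y = (X[:, 0] > 100).astype(int) | (X[:, 3] > 2.5).astype(int)

# Fit a simple risk model on the historical records.
model = LogisticRegression().fit(X, y)

# Score a new patient and flag them for review if predicted risk is high.
new_patient = np.array([[112, 24, 95, 3.1]])
risk = model.predict_proba(new_patient)[0, 1]
if risk > 0.5:
    print(f"Flag for review: predicted deterioration risk {risk:.0%}")
```

Production systems draw on far more of the patient record and are validated prospectively before clinicians act on their alerts.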
"I think AI is really going to transform health care," says Omer Awan, Professor of Radiology at the University of Maryland School of Medicine. He is not concerned that AI will take over physician jobs, instead predicting that AI will continue to improve efficiency and help reduce physician burnout.
Research continues on how best to incorporate AI into the primary care setting, including ethical issues such as data privacy, legal liability and informed consent. The adoption of AI may infringe on patient autonomy if medical decisions are made using algorithms without regard for patient preferences, according to a literature review.
Murphey says he is eager to see Eko Health's AI-paired stethoscopes improve the screening for early heart disease but remains wary of leaning too heavily on the technology.
"I want to stay connected to the patient. I take pride in my patient examinations," he says. "I think that's one of the important things we provide to patients in the primary care setting, and I'm not looking to sever that part of the relationship."
This post was previously published on HEALTHYDEBATE.CA and is republished under a Creative Commons license.