Revolutionizing Electrophysiology: The Role of AI in Empowering Allied Professionals
Video Transcription
Hello, everybody. My name is Evan Robertson, and I am co-chair along with Jerry Schott for this session. It's my pleasure to welcome you to San Diego and HRS 2025, the 46th annual meeting of HRS. If you haven't done so already, please download the HRS 2025 app to participate in the Q&A session. Please scan the QR code on the screen. I think it was up beforehand. You can use that mobile app to enter in questions, or feel free to use the microphone in the aisle. Please note that visual reproduction of Heart Rhythm 2025, either by video or photography, is strictly prohibited. Today, we're going to be talking about revolutionizing electrophysiology, the role of AI in empowering allied professionals. Our first speaker is Amy Tucker from Sanger Heart Institute. She's a nurse manager, and she's going to be talking to us about AI for the allied professional in electrophysiology, its role, and its importance. Good afternoon, everybody. Thank you for being here instead of happy hour, because that's right down the hall. I just want to thank our session chairs and the HRS program committee for inviting me to speak to you today about artificial intelligence and its role in EP, and how that affects allied professionals. I'm going to present an introduction to AI and talk about some key concepts and definitions that will hopefully help us as we move forward with integrating AI into our daily practice. I'm going to talk about machine learning, and how machine learning has impacted EP and remote monitoring of our CIEDs. I'm going to talk about AI in ECGs and how that has helped us with arrhythmia detection and streamlining workflows. A little bit about how AI can be utilized to predict ICD therapy, atrial fibrillation, and response to CRT. And then I'll end with some benefits and challenges of AI and how those present to us in EP. So over on the left, you can have data without information, but you cannot have information without data. That will make more sense as we talk about the data intelligence continuum. The data intelligence continuum describes the progression of how data is transformed into intelligence. Data is the foundational level, but data without any kind of context is meaningless. Information is data in a more structured and organized format. And as information becomes more contextual, it progresses to knowledge. Knowledge is understanding patterns, and intelligence is the ability to acquire and apply knowledge to achieve goals. Wisdom is informed decision-making using ethics, self-reflection, and values, and that makes wisdom much more difficult for AI to attain. Artificial intelligence is the capacity of a machine to use vast amounts of data to mimic human cognitive abilities and perform tasks that typically require human intelligence, such as learning, problem-solving, reasoning, and decision-making. And the whole goal is to do this possibly more efficiently than humans. So I'll talk about some subgroups of AI: natural language processing, which underlies large language models; machine learning; and deep learning. Natural language processing is focused on the ability of a machine to communicate with humans using language. This technology can process and analyze text and speech, and can convert spoken words to text and vice versa. This type of AI is used in virtual assistants, dictation software, and voice-enabled devices. And we see this all the time.
We see it every day with auto-correction on our iPhones, and ChatGPT, and chatbots, and you know, I really want to yell, hey Siri, to see if any of your phones light up. And you know, we see that every day with Alexa, and it's just a normal part of our daily life. This technology can be used in EP by extracting data from physician notes in the EMR and by flagging clinical conditions from unstructured data. It can summarize patient histories and automate discharge summaries, and it can streamline rounding or procedures with voice-to-text dictation. Machine learning is a type of AI in which the machine learns from the data and creates algorithms to problem-solve by recognizing patterns and generating recommendations. In EP, machine learning can enhance clinical decision-making and help us provide better care for our patients by predicting outcomes. And the goal of machine learning is to outperform humans with its ability to predict. There are two types of machine learning, supervised and unsupervised. In supervised learning, the machine is provided a set of human-labeled data or images. For example, ECGs that have been labeled as sinus rhythm, and ECGs that have been labeled as atrial fibrillation. With supervised learning, the labeled data help guide the machine to the right prediction or outcome. In supervised learning, the data must be accurately labeled, or there could be an error in the feature extraction with that algorithm. Unsupervised learning is learning that's trained on raw, unstructured data. It is a method that builds the feature set itself. It surfaces whatever it thinks is important and significant. There are no answers provided for guidance, and so it is more challenging than supervised learning. For example, the machine can organize itself to find factors in a patient's clinical history that make a patient more prone to sudden cardiac death. These algorithms can be further developed by reinforcement learning and dimensionality reduction, where the machine interacts with the environment and attempts to find an optimal way to achieve a goal. The iterations continue until there is an acceptable level of accuracy with the algorithm. Deep learning, just briefly, is a form of machine learning that uses complex algorithms and convolutional neural networks with multiple layers to train an algorithm. Deep learning models automatically extract features without requiring manual input or any kind of predefined rules. Deep learning allows machines to think in a layered and hierarchical way, processing data into abstract and complex representations, very similar to how we process information as humans by problem solving and recognizing patterns. So AI is very important in ECG interpretation because it makes it faster, more accurate, and more predictive. Deep learning can identify arrhythmias and detect abnormalities in waveforms down to the millisecond, with great precision. AI serves two main purposes with ECG interpretation. Number one, it improves our workflow efficiency by performing tasks that are typically performed by a human. And secondly, it enhances the value of the ECG by detecting patterns that extend beyond the human eye and human capability. Things like predicting disease, predicting the risk of disease, and even predicting non-cardiac disease. So AI can be beneficial in the early detection of cardiac disease.
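A minimal sketch of the supervised-learning idea described above: a model learns from human-labeled examples (ECGs labeled sinus rhythm versus atrial fibrillation) to make the same call on new examples. The synthetic features, labels, and model choice below are illustrative assumptions only, not any clinical or vendor algorithm.

```python
# Supervised learning sketch: train a classifier on human-labeled "ECG-derived"
# features (here synthetic: mean RR interval and RR variability) so it can
# separate sinus rhythm from AF on data it has not seen before.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)
n = 500

# Synthetic feature vectors: [mean RR interval (s), RR standard deviation (s)]
sinus = np.column_stack([rng.normal(0.85, 0.08, n), rng.normal(0.04, 0.01, n)])
afib = np.column_stack([rng.normal(0.70, 0.12, n), rng.normal(0.18, 0.05, n)])

X = np.vstack([sinus, afib])
y = np.array([0] * n + [1] * n)  # 0 = human-labeled sinus, 1 = human-labeled AF

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# The model "learns from the labeled data" and is then checked on held-out examples.
model = LogisticRegression().fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test), target_names=["sinus", "afib"]))
```

In unsupervised learning, by contrast, the labels `y` would not be provided at all; the algorithm would have to group the recordings on its own.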
This kind of algorithm can recognize subtle patterns in ECGs that might indicate an underlying cardiac disease or predict the future development of atrial fibrillation from a sinus rhythm ECG. AI can also detect structural heart disease. And it can help us predict the risk of sudden cardiac death or life-threatening arrhythmias in our patients. AI, as we all know if you're in the device world, plays a critical role in remote monitoring. Its primary function is to enhance efficiency, accuracy, and clinical decision-making in the management of the large volumes of data that we get from these implanted devices. One of AI's most valuable roles is filtering and prioritizing alerts. AI can differentiate between clinically actionable events and non-actionable events, which helps us in the device world focus on the things that need to be addressed. AI can also help us predict the risk of lead failure, battery depletion, and device malfunction, and there's also optimization of programming based on data that's extracted from the device. For CRT, AI helps us identify patients who are most likely to benefit from CRT by analyzing ECG patterns. We're looking at QRS duration and morphology. Then there are the echocardiographic features, which show us mechanical dyssynchrony. And then there's clinical data, which is valuable: age, comorbidities, and the medication history that our patients have. AI models can forecast the risk of heart failure hospitalization, mortality, and CRT non-response, and can help clinicians intervene earlier by identifying patients who need device reprogramming, lead revisions, or even medication adjustments. We see this with HeartLogic, the new heart failure technology from Boston Scientific. We see this with Medtronic's loop recorder technology, and we see this with TriageHF. And the goal of that is to get to these patients before they have symptoms, to prevent hospitalizations. There are some challenges in EP that we need to think about. There are legal and ethical considerations, like who is responsible if AI fails, misdiagnoses, or misses a potentially life-threatening arrhythmia. Is it the human who is supposed to be offering the human expertise? Is it the machine? Is it the algorithm? Is AI a legal entity? And then we always think about safety considerations for our patients. Maintaining patient privacy is one of our main concerns, and we have to make sure that we respect that and think about that. And then again, the potential for AI error is definitely a safety consideration, whether that means missing a potentially fatal arrhythmia or misdiagnosing and implanting a device that's not necessary, something to that effect. One thing for us to remember, because AI is terrifying and exciting and it's the future and it's still terrifying, is that there have been no large studies that have shown the superiority of artificial intelligence over human intelligence. And until we have clear evidence of that, we must be vigilant as practitioners, be cautious, and make sure that we're taking care of our patients so that we don't have to worry so much about the safety considerations with AI. The benefits of AI are obvious. We can make predictions faster and work with more comprehensive data. We can personalize treatment options and treatment plans. And again, it can free up clinician time to focus on direct patient care for the patients who need it. So in summary, AI is here to stay, but the development is ongoing in the field of EP.
The role of AI is not to replace human expertise, but to enhance it. AI can streamline remote monitoring workflows; we already see this. It can predict ICD therapy and response to CRT. It can provide information beyond human capabilities, beyond what the human eye can see, for risk stratification and for disease recognition. And again, we must use this tool responsibly, always in conjunction with human oversight. I leave you with this quote: it's not man versus machine, it's man with machine versus man without. To prepare for this presentation, I found excellent resources on AI and the emerging role of AI in EP. I have listed these resources here, if you're interested in taking a look at them. I have another page here. And I just want to thank you all for your attention so much. Thank you. Thank you so much, Amy. That was excellent. We're going to take questions at the end. So we'll get on to our second speaker, who is Madeline Oster. She's a doctor of nursing practice that I happen to work with at Stanford as a colleague in outpatient EP. She also does inpatient EP. And she's going to talk to us about AI-driven innovations in wearable cardiac technology, transforming heart health monitoring and management. Yeah, good to hear. Good evening, I was going to say. Good afternoon. Thank you for the invitation and for being able to speak here about wearable devices. I have nothing to disclose. I would like to start with some typical clinical scenarios we are basically experiencing every day. We have patients referred to us who complain about heart fluttering, skipping beats, thumping, palpitations, or occasional dizziness that might be related to some kind of abnormal heart rhythm. But we don't know exactly what's going on. How can wearable devices help us figure out what the problem is for those patients? Wearable devices bring continuous, real-time data from patients in their natural environment to us. AI is the component that is able to analyze those data and make sense of the findings. Together, they allow for earlier arrhythmia detection, better triage, more proactive interventions, and less reliance on traditional in-clinic diagnostics. Applications for wearable devices at this time are usually arrhythmia detection itself and the evaluation of patient-driven alerts. That means these patients come to us and tell us, okay, you know, I do have an arrhythmia, my device detected something going on; is this serious or not? We have wearable devices for post-ablation rhythm tracking as well as for ambulatory bradycardia and syncope workup. Why do we need those wearable devices, or why do we actually have extended usage of them? At the ACC in March 2025, iRhythm presented two large real-world retrospective analyses of long-term continuous monitoring devices. Those data included 1.1 million patients, and they showed that the common 24- to 48-hour Holter monitoring actually failed to detect arrhythmias: 64% of arrhythmias were undetected in the first 48 hours. Patients who have less than one episode per day have a higher arrhythmia yield than patients who have episodes every day. And the mean time to the first detected episode was more than 48 hours. In addition, patient-triggered events correlated with arrhythmia episodes less than 20% of the time, and over half of atrial fibrillation events were asymptomatic.
And cases of arrhythmias like VTs, AV blocks, or pauses frequently did not correlate with any kind of patient symptoms. In addition, those findings basically showed that the 24- to 48-hour monitor failed to detect a considerable number of actionable arrhythmias and led to a delay in treatment, increased risk of stroke and death, and actually an increase in healthcare cost. Another implication is that we are managing patients with atrial fibrillation, and there's always the question about when we start and when we stop anticoagulation. We know that with an increasing CHA2DS2-VASc risk and sustained episodes of atrial fibrillation, the stroke risk increases. We also know that, based on how long the first detected episode was, we can predict some kind of progression to maybe longer episodes and increased stroke risk. Current guidelines don't recommend any anticoagulation for patients with a CHA2DS2-VASc score of zero, even with sustained episodes of atrial fibrillation; however, for patients with a high CHA2DS2-VASc score and episodes longer than 5.5 hours, anticoagulation is still recommended. We have this gray zone in between, where it's sometimes up to the provider, in a discussion with the patient, to start anticoagulation or not. We also face the problem that after a patient undergoes an intervention like an atrial fibrillation ablation, when can we actually stop anticoagulation? We have recurrence rates, and there's always the fear that if the patient has a recurrence of atrial fibrillation and is not anticoagulated, they might have a stroke. We hope a little bit that with the REACT-AF trial, we will get a little bit more clarification. This is a trial which enrolls patients with paroxysmal atrial fibrillation and a CHA2DS2-VASc score of two to four, and compares patients that are on a usual regimen of a DOAC to patients who will only take anticoagulation when they actually have an episode of atrial fibrillation. Those patients are required to have wearable devices and to check daily for any kind of abnormal heart rhythm, and if they detect atrial fibrillation, they would start a DOAC and take it, I believe, for 30 days after termination back to normal rhythm. So how do those devices work? They have what we call PPG, photoplethysmography. I won't try to pronounce it again because it's just impossible, it's a very difficult word, so I will just call it PPG. And we have single-lead EKG analysis. The PPG signal usually has two main components. There's one quasi-static direct current (DC) component, which reflects light from the static arterial and venous blood, skin, and tissue. And there's the pulsatile alternating current (AC) component, which absorbs light through changes in the arterial blood volume. Two different kinds of modes are available. One is the transmission mode, which we know from the pulse oximeter; the photodetector is opposite the light source. For the reflective mode, we have the photodetector and light source on the same side, which is what is usually present in smartwatches. PPG is the leading technology in smartwatches and fitness trackers. It's a very cost-effective technology. It's measured at a peripheral site and analyzes the pulse pressure waveform that originates from the contraction of the heart and travels through the vascular tree. And those signals will create some kind of a waveform that will be analyzed. The analysis itself can happen in three different ways. The first is the statistical approach.
It's basically a huge database of ECGs that are collected and analyzed as atrial fibrillation or non-atrial fibrillation. The application of logistic regression as well as probability analysis creates some kind of threshold which will distinguish between atrial fibrillation and non-atrial fibrillation, and those thresholds are then correlated with the PPG signal itself. Another approach is machine learning, which we already heard about from Amy. Features of the PPG signal such as shape, timing, and pattern are used to distinguish atrial fibrillation from normal rhythm. It's kind of a comparison of signals that are very similar to each other or completely different. Then they go down a decision tree, and at the end it decides if it is atrial fibrillation or non-atrial fibrillation. Deep learning automatically learns useful features from raw data using neural networks, reducing the need for manual feature design; it does need a large amount of labeled data. Transfer learning refers to the idea that if we have an ECG-based model, we can adapt it to PPG signals. Usually those are very short episodes. There have been a few major trials in the past. I would like to point out the Apple Heart Study from 2019. It was a collaboration between Stanford Medicine and Apple, a big landmark virtual clinical trial that enrolled almost 420,000 participants. Notably, it used the Apple Watch Series 1, 2, and 3; these were not the watches that already had EKG capabilities. The irregular pulse notification happens through PPG. 0.52% of those patients received an irregular pulse notification. 84% of the time, the participants who had a positive irregular pulse notification did indeed have atrial fibrillation. And among those who received the notification and went on to wear a subsequent ECG patch, 34% of patients were indeed found to have atrial fibrillation. The study demonstrated that wearable technology like the Apple Watch can safely and effectively identify heart rate irregularities including atrial fibrillation. And because it was a virtual study with an app-based design, it also showed that this kind of trial is actually feasible to do. Single-lead EKGs are used for our patch wearable devices as well as for the smart devices. AI algorithms, such as the DeepRhythmAI model or the one used by the Zio XT, analyze EKG data to identify any kind of subtle changes and to identify an arrhythmia, which could be atrial fibrillation or ventricular tachycardia. Smartwatches usually record a lead I electrocardiogram, and this is recorded through the circuit between the detector that is placed on the back side of the watch and the user's finger on the digital crown. The Kardia comes as two different kinds of devices. One is the single-lead EKG; it's a recording created via the left or right thumb or fingers placed on a little metal plate. The Kardia 6L has a third little metal plate on the bottom of the device and is able to record a six-lead EKG. These are the most common, and this is just three examples; I know that there are more, but I wanted to focus on those. The Zio XT, the Cardio NDE patch — they are very similar in the duration that they can record, usually up to 14 days. They all have high-fidelity rhythm monitoring. They all have AI analysis. They are very, very similar.
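Before the patch comparison continues, here is a minimal sketch of the pulse-irregularity idea behind the statistical and machine-learning approaches described above: pulse-to-pulse intervals derived from the PPG waveform are summarized by simple irregularity metrics and compared against a threshold. The metrics and the threshold below are assumptions chosen purely for illustration, not any vendor's validated algorithm.

```python
# Sketch of PPG-based AF screening via pulse-interval irregularity.
import numpy as np

def irregularity_metrics(peak_times_s: np.ndarray) -> dict:
    """Simple irregularity statistics from detected pulse peak times (seconds)."""
    ibi = np.diff(peak_times_s)                    # inter-beat intervals
    rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2))    # beat-to-beat variability
    cv = np.std(ibi) / np.mean(ibi)                # coefficient of variation
    return {"rmssd_s": rmssd, "cv": cv}

def flag_possible_afib(peak_times_s: np.ndarray, cv_threshold: float = 0.12) -> bool:
    """Flag a recording as irregular-rhythm / possible AF (threshold is assumed)."""
    return irregularity_metrics(peak_times_s)["cv"] > cv_threshold

# A regular pulse at ~75 bpm versus an irregularly irregular pulse (synthetic examples).
regular = np.cumsum(np.full(60, 0.80))
irregular = np.cumsum(np.random.default_rng(1).uniform(0.45, 1.15, 60))
print(flag_possible_afib(regular), flag_possible_afib(irregular))  # False, True
```

A production algorithm would tune such thresholds on large adjudicated ECG/PPG datasets and add signal-quality and motion-artifact checks before issuing a notification.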
The advantage of those patches is basically that they have a high diagnostic accuracy. They usually include a physician-interpreted report. They are reimbursed with CPT codes. Sensitivity is over 90%, and specificity is over 95%. Limitations of those devices are primarily that they are more expensive and mostly require a prescription and provider workflow integration. The consumer-grade devices like the Apple Watch, the Samsung watch, the Google watch, the Withings ScanWatch, and the KardiaMobile are very similar, too, in what they can provide, at least for the reason we are using them, which is usually the ability to detect atrial fibrillation and to obtain a single-lead EKG. The advantage of those devices is that they are widely accessible and user-friendly, and they really promote health engagement. That means patients are really engaged in their health and trying to figure out the different kinds of features they can use to improve their health. Limitations are that they have limited diagnostic validity and limited reimbursement. Notably, there are some minor differences between the watches; for example, the Galaxy also has obstructive sleep apnea detection, which is very interesting for our atrial fibrillation population, and some of the watches have an SpO2 measurement, which is also FDA-approved. Sensitivity and specificity are not as good as the patches themselves. For the Apple Watch and the Galaxy, it's 85% sensitivity and 75% specificity, while the Fitbits and the Withings ScanWatch are less. Another device, which is not necessarily a wearable device, but I would like to mention it anyway, is FibriCheck. FibriCheck is a CE-marked and FDA-cleared mobile health app designed to detect atrial fibrillation. It basically works by having the flashlight and the camera of the phone detect the blood volume changes and record them as a PPG signal. It usually requires 60 seconds of recording. The raw PPG is filtered, and then artifacts and noise are removed. Validated algorithms are trained to analyze those data sets and then basically make the decision: is this an irregular rhythm, an abnormal heart rhythm, possibly atrial fibrillation, or maybe some other kind of arrhythmia or artifact? Sensitivity is 96% and specificity is 97%, and those data were validated against single-lead EKG and 12-lead EKG standards. The advantage of FibriCheck is clearly that there's no additional hardware required. It's easy, on-demand use. It enables frequent checks for atrial fibrillation and can be integrated into a remote monitoring program or clinical pathways for temporary rhythm monitoring post-ablation. And the limitations are, as we already mentioned, that PPGs are less accurate than EKGs, they are susceptible to motion artifacts and poor lighting, and they require active participation, unlike the smartwatches, which passively do continuous rhythm monitoring. Back to our clinical scenarios. The first one was the curious case of Benjamin's palpitations. Benjamin always has heart fluttering and skipped beats when he argues with his HOA. Fortunately, he did have an Apple Watch Series 6. He noticed that one of the episodes he recorded was found to be atrial fibrillation, and he was started on a DOAC. He was counseled on lifestyle modification, with possibly an ablation or antiarrhythmic drugs down the road, and he was advised to avoid any kind of HOA meetings, as much as he can, to reduce the stress.
Our second case was our 29-year-old software engineer who reports thumps and palpitations while coding at 2 a.m. He had a KardiaMobile 6L, the six-lead device, and documented PACs and PVCs during episodes of stress. He underwent a negative basic workup with an echo and a stress test and got the reassurance that his palpitations are not life-threatening and that he might need some lifestyle adjustments, in the sense of drinking a little bit less of his energy drinks and going out more in the daytime, not at nighttime. And then Martha's dance-floor dizziness: her Fitbit Charge 5 recorded low heart rates of 38 and lower. Martha got an ambulatory monitor that confirmed symptomatic sinus pauses, received a dual-chamber pacemaker, and was able to get back to dancing without any episodes of dizziness anymore. In summary, patch-based monitors offer extended, high-fidelity EKG recordings with high sensitivity and specificity. Smartwatches provide on-demand, passive or active rhythm monitoring using PPG or single-lead EKGs. They are usually less sensitive than ECG patches, but their accessibility and patient engagement potential make them valuable for opportunistic screening. All of those devices have AI-driven algorithms which work in different ways. We have had large-scale studies that have shown that using those wearable devices is feasible for remote atrial fibrillation detection. Wearables are no longer just accessories. They are emerging as a clinically meaningful tool in the early detection, ongoing management, and potential prediction of atrial fibrillation, with ongoing integration into patient-centered models of care. Thank you so much. Thank you. Thank you, Madeline, that was great. We're gonna move on to our last speaker, who is Sally Gustafson. She is from Emory and is a senior device clinic manager there, and she's going to shift us over to cardiac monitoring, talking about insertable cardiac monitors and redefining precision in arrhythmia detection and management. Alright. Okay, I'm a great one to talk about technology, so sorry about that. I do have one new disclosure I want to mention. After 18 years working at Emory in the device clinic, I have just transitioned to Merge to be their subject matter expert, but I did go through the society and the panel, and they very kindly invited me to continue on with the presentation. And now I just hit the down arrow, right? Okay. Do you guys want to do this audience participation thing? Do you have your... Okay, get your phone out if you want to submit a question. And then I also have this little question and answer thing, but I don't know if we can do that. So just think about this. How many people in the room are new to remote monitoring, to loop recorders? Anybody less than one year? Oops. Oh, we do? Are we able to do it? Oh, how do I go back then? You know what? Let's forget this. We're just gonna go through it. So think about it. If you were brand new, you were trying to figure out all these things. You went to your friend, you phoned a friend, you tried to figure out what was going on. Maybe in the first year or two, you looked at 1,000 episodes. How many people in here think they've seen 1,000 to 5,000 loop recorder transmissions? How many think more than 5,000? Okay. All right, so we're gonna skip that. Just keep that in mind. We'll come back to it. All right. So let's look at a little bit of the data. Back in 2021, O'Shea et al. did a very large study. The loop recorders had been out for a while. They were awesome.
They were the answer to finding AFib with cryptogenic stroke. And then they also looked at the alert burden for pacemakers and ICDs, to find out what it is our device clinics are doing. Not surprisingly to all of us, in that big yellow box on the right-hand side, over 41,000 alerts were from the loop recorders. And the loop recorders represented less than 19% of all the devices in the study. So we knew there was an issue. Fortunately, our industry partners realized it also, and they started taking some steps to see what they could do to alleviate some of that burden of things that we didn't need to see. You can see in 2014, Medtronic came out with TruRhythm. Boston Scientific put out their LUX-Dx in 2020. Abbott had SharpSense in 2019. And then there were also some design changes. One example was the Biomonitor III, where they widened the separation of the electrodes to try to get better R-wave discrimination. So here's a study from 2024. This one went back to data from 2013 to 2023, although most of it focused on studies from 2019, 2020, and 2021. They found that still almost 60% of all loop recorder alerts were false positives, and the most common was AFib. So now we have, maybe, our hero, we'll see: artificial intelligence. What is AI? You've heard a lot. Amy did a great job explaining what it is. Basically, really good AI can build associations, in this case probably based on training data, associations that are imperceptible to the human eye. So let's see how that works. This is a case study that we're going to look at with Medtronic, and a big shout out to Kyle Zablocki, who's here with us today, because he really helped me look at some of this information. Medtronic AccuRhythm AI came out, released in 2021. To develop this AI, Medtronic looked at one million of their own LINQ tracings that had been clinically adjudicated as true or false AFib. So that was a big burden right there, just to adjudicate all those episodes and pauses; they wanted to look at both. They labeled the tracings, and then they fed them into the AI program, I call it the machine. They also augmented with a lot of tracings with noise. They were trying to trick it, right? So noise or inverted signals. We know that race or body habitus can change the way a signal looks, so they were trying to eliminate bias and get a really comprehensive set of signals. And when all was said and done, after submitting all this through the AI, they found that they could reduce the false positive AFib alerts by 74%, and the pause alerts by 97%. So that's a huge burden relief for us as clinicians. But Medtronic wasn't really happy with that AFib result. They wanted to get better. So they took an additional 500,000 unlabeled AFib strips and put them through that same program. So again, to Amy's point, you can teach the AI, and then you want it to think on its own with unlabeled strips, right? You want to see what it does. So it came out with a certain percentage that it said, okay, this is 100% AFib, and a certain percentage that were 100% not AFib. And we know that it was good at that from their previous study. So they set those aside, took the indeterminate strips, and had them adjudicated by clinicians. So we're trying to just keep teaching this AI so that it can really learn to discriminate what's going on. Resubmitted them, adjudicated them, combined with the original 1 million.
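A schematic sketch of the kind of iterative label-and-retrain loop described here: train on clinician-adjudicated strips, keep only the model's high-confidence calls on unlabeled strips, send the indeterminate ones back to clinicians, and retrain on the combined set. The function names, thresholds, and data handling below are hypothetical illustrations, not Medtronic's actual pipeline.

```python
# Pseudo-labeling sketch for growing a labeled training set of ECG strips.
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

@dataclass
class Strip:
    features: list                 # whatever representation the model consumes
    label: Optional[int] = None    # 1 = AFib, 0 = not AFib, None = unlabeled

def pseudo_label_round(
    fit_model: Callable[[List[Strip]], Callable[[Strip], float]],
    labeled: List[Strip],
    unlabeled: List[Strip],
    adjudicate: Callable[[Strip], int],
    hi: float = 0.95,
    lo: float = 0.05,
) -> Tuple[Callable[[Strip], float], List[Strip]]:
    """One round: train, score unlabeled strips, keep confident calls as labels,
    send indeterminate strips to clinician adjudication, then retrain."""
    score = fit_model(labeled)                 # train on clinician-labeled strips
    for strip in unlabeled:
        p_afib = score(strip)                  # model's probability that this is AFib
        if p_afib >= hi:
            strip.label = 1                    # confident AFib -> keep as pseudo-label
        elif p_afib <= lo:
            strip.label = 0                    # confident not-AFib -> keep as pseudo-label
        else:
            strip.label = adjudicate(strip)    # indeterminate -> human adjudication
    combined = labeled + unlabeled             # "combined with the original set"
    return fit_model(combined), combined       # retrain on the larger labeled set
```

The design point is the same one made from the podium: the model's confident calls do the cheap work at scale, and human expertise is reserved for the strips the model cannot decide.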
This was a really heavy, heavy project to try and really get this AI to learn correctly and to be thorough and to be trustworthy. So there we go. Rinse and repeat. Rinse and repeat. Do I have that right, Kyle? So they finally did come out with an 88% reduction in false positive alerts for AFib, without missing the true AFib as well. So this was a big benefit to us, and that was released in 2023, so we're still seeing the benefits of that. It hasn't truly been isolated and studied yet, to my knowledge, but how does this work? So we talk about how the AI can look at these patterns Amy talked about. It can look at the whole segment of the EKG, and this, again, is compliments of Kyle and the Medtronic team; they let me use this slide. So we kind of look. We look to see, is there a P wave in there? We look at the R-to-R regularity. But the AI is evaluating things that we can't even see. It's looking for patterns and similarities, and that's how it comes up with the ability to really say, this is AFib. Those of us that have seen more than 10,000 strips, how many times have you studied a strip for five, ten minutes? You're trying to figure it out, and then you just hold it up, and you go, nah, it doesn't look like AFib, right, or VT, or whatever. So we kind of have this built into our neural pathways, but we can't explain it to someone else. AI is probably a little more reliable, and definitely it can do this on a larger scale than we can. This is how Medtronic applies their AI. There are different ways to do it, but their transmissions send all the episodes that are detected by the device up to the cloud, and then any AFib and pause episodes are run through the AI platform. The ones that are adjudicated as true go to the clinic. The false ones are not sent to the clinic. And, not represented here, if there are still inconclusive episodes, and there are, those are sent to the clinic for adjudication. There have been some other manufacturers that have made changes to increase precision in the loop recorders. Biotronik also uses AI in the cloud to evaluate their AFib episodes; this came out in 2023. Just a note, they do send their false AFib episodes to the clinic to be adjudicated, so then the clinic needs to decide, you know, are we going to look at those or not? So it may not have reduced the burden, but it gives the clinic a little bit of opportunity to see if they believe it or not. Abbott came out with their Assert-IQ in 2023. This has a signal processor in the device with a five-step AFib discriminator, and they are planning an AI interface for Merlin to help them discriminate the AFib. And for the Boston Scientific LUX-Dx in 2023, machine learning was used to create their dual-stage algorithms for several different arrhythmia episodes, and they also have hourly morphology template updates, which is a form of machine learning. So where are we? This is a little bit of a newer study; the data ran from January 2022 to March of 2023, so this was not with the new Medtronic AI algorithm or the Boston Scientific one, so there's still some legacy stuff in there. And in this one, I think we're just looking at the positive predictive value. So this is kind of a busy slide, and it does include a lot of old data, but I will say that even with the changes that were made, the positive predictive value for AFib is still only 0.65. Now I think we're going to see some changes with the new technology once we have some more studies, but we still have some opportunity there.
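For readers less familiar with the metric: positive predictive value is simply the fraction of alerts that turn out to be real, PPV = true positives / (true positives + false positives). The counts below are made up purely to show what a PPV of 0.65 means in practice; they are not from the study Sally cites.

```python
# Hypothetical alert counts, chosen only to illustrate a PPV of 0.65.
true_afib_alerts = 650    # alerts adjudicated as real AFib
false_afib_alerts = 350   # alerts adjudicated as false positives

ppv = true_afib_alerts / (true_afib_alerts + false_afib_alerts)
print(f"PPV = {ppv:.2f}")  # 0.65 -> roughly 1 in 3 AFib alerts is still a false positive
```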
AI and these more specific algorithms are new, and they haven't been extensively studied, so as time goes by, I think we will be very interested to see the data with these new, updated AI interfaces. And then there are also other arrhythmias that we haven't really focused on. What about, you know, PACs versus PVCs, patterns of bigeminy, trigeminy? Some devices use AI or machine learning to detect those, and we haven't really studied those. And then all AI, as we've learned through many of the sessions this week, is not created equal, and so it all should be thoroughly validated and vetted. So I think our takeaways: AI has learned from more EKGs than we have ever seen, or will ever see. So even us 10,000ers, the AI really has seen a lot. AI doesn't get interrupted, it doesn't get tired, it doesn't need a vacation day, but AI has to be taught and trained. The data out from AI is only as good as the data in. There will always be inconclusive episodes. And AI does not have the clinical context of the patient, at least not in the context of loop recorders. So when we think about who's smarter, mirror, mirror on the wall: for specific pattern recognition of the EKG, AI probably has the edge. But for full clinical interpretation and decision making, human clinicians are still the MVPs. So I think, as all the panelists have pointed out, we are probably all smarter together. And congratulations, we all still have a job. Thank you. Thank you, Sally. That was excellent. I'd like to open the floor for questions, if there are any questions in the audience. I have a comment. I think that it's amazing, coming from seeing the first ILRs to now the more recent ILRs, the AI has definitely made a big difference, I think, in noticing AFib and declaring what's AFib and what's not AFib. So that's been excellent. So I'm definitely for this AI thing. And I like how our presentations really built on each other this afternoon. I still want to know why I can't get rid of AFib notifications on the Apple Watch that are PACs. So if anybody can answer that question for me, please come to the stand. But thanks for coming. I know it's a late session. If no one has any questions or comments, then we can adjourn.
Video Summary
The session at the HRS 2025 annual meeting in San Diego focused on the integration of AI into electrophysiology (EP), highlighting its impact on allied professionals. Evan Robertson introduced the session, followed by Amy Tucker from Sanger Heart Institute, who discussed AI's critical role in EP by improving efficiency and accuracy in areas such as ECG interpretation, remote monitoring, and disease prediction. Key technologies like machine learning and natural language processing were highlighted for their capabilities in problem-solving and decision-making.

Madeline Oster spoke on AI-driven wearable cardiac technology, which is transforming heart health management by providing continuous, real-time data. She emphasized the importance of these devices in arrhythmia detection and proactive interventions. Despite limitations in accuracy, wearable devices are essential for engaging patients in their health management.

Lastly, Sally Gustafson from Emory delved into AI advancements in cardiac monitoring, particularly with insertable cardiac monitors. She highlighted the significant reduction in false-positive alerts due to AI, improving workload management for device clinics. The session underscored AI's role in enhancing human capabilities rather than replacing them, and emphasized the ongoing need for human oversight in clinical contexts.
Keywords
AI integration
electrophysiology
ECG interpretation
wearable cardiac technology
arrhythmia detection
machine learning
insertable cardiac monitors
human oversight