
Clinician Adoption of an Artificial Intelligence Algorithm to Detect Left Ventricular Systolic Dysfunction in Primary Care.

      Abstract

      Objective

To compare the characteristics of clinician “high adopters” and “low adopters” of an artificial intelligence (AI)–enabled electrocardiogram (ECG) algorithm that alerted for possible low left ventricular ejection fraction (EF), and the subsequent effectiveness of each group in detecting patients with low EF.

      Methods

      Clinicians in 48 practice sites of a US Midwest health system were cluster-randomized by care team to usual care or to receive a notification that suggested ordering an echocardiogram for patients flagged as potentially having low EF by an AI-ECG algorithm. Enrollment was between June 26, 2019, and July 30, 2019; participation concluded on March 31, 2020. This report focuses on the clinicians randomized to receive the AI-ECG notification. At the patient level, data were analyzed for the proportion of patients with positive AI-ECG results. Adoption was defined as a clinician ordering an echocardiogram after being prompted by the alert.

      Results

      A total of 165 clinicians and 11,573 patients were included in this analysis. Among patients with positive AI-ECG results, high adopters (n=41) were twice as likely as low adopters (n=124) to diagnose patients with low EF (33.9% vs 16.9%; odds ratio, 1.62; 95% CI, 1.21 to 2.17). High adopters were more often advanced practice providers (eg, nurse practitioners and physician assistants) than physicians, were more often in family medicine than internal medicine, and tended to have less complex patients.

      Conclusion

      Clinicians who most frequently followed the recommendations of an AI tool were twice as likely to diagnose low EF. Those clinicians with less complex patients were more likely to be high adopters.

      Trial registration

      Clinicaltrials.gov Identifier: NCT04000087.

      Abbreviations and Acronyms:

      ACG (adjusted clinical group), AI (artificial intelligence), APP (advanced practice provider), EAGLE (ECG AI-Guided Screening for Low Ejection Fraction), ECG (electrocardiogram), EF (ejection fraction), EHR (electronic health record), IRB (institutional review board), NP (nurse practitioner), PA (physician assistant)
      Artificial intelligence (AI) has promised to augment decision-making for clinicians in health care for decades. Recent advances, the widespread adoption of electronic health records (EHRs), and the ability to apply machine learning to this enormous data repository have now made it possible to use AI to improve diagnostic accuracy and refine treatment plans (Shortliffe and Sepúlveda, 2018; Komorowski et al, 2018).
      Specialties that rely heavily on waveform and imaging data are gaining a solid foothold in using machine learning to improve diagnostic accuracy, develop effective treatment plans, and enhance efficiencies (Bizzo et al, 2019; Chang, 2012; Dilsizian and Siegel, 2014).
      Primary care has been slower to realize the benefits of AI (Lin et al, 2019; Kueper et al, 2020; Gottliebsen and Petersson, 2020).
      The complexity of undifferentiated patient presentations, the lack of data standardization, and limited clinician trust in machine learning are just a few reasons for this slower adoption (Asan et al, 2020).
      On the other hand, several studies have shown improvements in diagnostic accuracy and patient triage and reductions in clerical burden (Dankwa-Mullan et al, 2019; Halamka and Cerrato, 2019).
      The potential for AI to transform primary care is enormous. An AI tool's effectiveness in improving clinical outcomes depends on its adoption by front-line clinicians, where “the rubber meets the road” (He et al, 2019). To date, few studies have examined the characteristics of clinicians who readily embrace AI tools (high adopters) vs those who are more hesitant (low adopters) and the clinical outcomes associated with these two approaches.
      In the recently published EAGLE (ECG AI-Guided Screening for Low Ejection Fraction) study, Yao et al tested the real-world applicability of an AI algorithm designed to detect low left ventricular ejection fraction (EF) by analyzing routine electrocardiograms (ECGs) (Yao et al, 2021; Yao et al, 2020). In this study, 358 primary care clinicians were randomized to either an intervention group (access to AI screening results) or a control group (usual care) across 48 upper Midwest clinics and hospitals. The primary endpoint was a new diagnosis of EF of 50% or less within 3 months. The intervention increased detection of low EF from 178 patients (1.6%) in the control group to 244 patients (2.1%) in the intervention group (odds ratio, 1.32; 95% CI, 1.01 to 1.61; P=.007).
      Early diagnosis of low EF, particularly in asymptomatic patients, is critical to reducing the lifetime risk of morbidity and mortality (Bowling et al, 2013; Ponikowski et al, 2016; Wang et al, 2003).
      Given this impact, and the results of the EAGLE study, we examined the data in greater depth to better understand the clinician characteristics that are key to driving adoption. Adoption was defined as the clinician ordering an echocardiogram within 3 months after receiving the AI-ECG alert. Echocardiography is considered the test of choice for the definitive diagnosis of left ventricular dysfunction (Ponikowski et al, 2016). To better understand the complexities of AI adoption in the clinical setting, we compared clinician- and patient-level characteristics of those more likely to respond to the prompted recommendation of the AI algorithm with those less likely to respond in the EAGLE study's intervention arm (Yao et al, 2021). By further assessing this group of clinicians, we identified characteristics associated with adoption of the AI tool and compared the effectiveness of each group in detecting low EF.

      Methods

      This is a secondary analysis of data from a pragmatic, cluster randomized controlled trial comparing AI-guided ECG-based screening for low EF with usual care in 48 Mayo Clinic primary care practices, including an academic medical center and community and rural clinics across Minnesota and Wisconsin. Clinicians in the intervention arm received an AI screening result when a clinically indicated ECG was ordered; those in the control arm received a standard ECG report. The original study was approved by the Mayo Clinic Institutional Review Board, in accordance with the Declaration of Helsinki. The trial was registered (ClinicalTrials.gov number NCT04000087), and the study protocol and materials have been previously published (Yao et al, 2021).

      Participant Data

      This analysis includes 165 clinicians from 60 care teams that were randomized to the intervention group (ie, received results of an AI-ECG algorithm that detects possible ventricular dysfunction) (Figure). Primary care clinicians included physicians, nurse practitioners (NPs), and physician assistants (PAs), organized in teams that provide care for empaneled adult patients (≥18 years of age). A panel is defined as the set of patients assigned to an individual clinician. Clinicians in the pediatric care teams, resident care teams, acute and urgent care teams, and nursing home care teams were excluded.
      Figure. Flowchart of participants. ECG = electrocardiogram; EF = ejection fraction; HF = heart failure.
      To measure individual patient complexity, we used the Johns Hopkins Adjusted Clinical Groups (ACG) score. Scores are normalized for our organization, with the average set at 1; scores greater than 1 suggest higher-than-average patient complexity, and scores less than 1 suggest lower complexity.
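      As a minimal illustration of this normalization, the following R sketch (with invented data and hypothetical column names) rescales raw complexity weights so that the organizational mean equals 1 and then averages them across each clinician's panel; it is not the ACG software itself.

```r
# Minimal sketch (hypothetical data): normalize raw complexity weights so the
# organizational average equals 1, then summarize each clinician's panel.
library(dplyr)

patients <- data.frame(
  clinician_id = c("A", "A", "B", "B", "B"),
  raw_acg      = c(0.8, 2.4, 0.5, 1.0, 1.3)   # illustrative raw complexity weights
)

patients <- patients %>%
  mutate(acg_norm = raw_acg / mean(raw_acg))   # organization-wide mean set to 1

panel_complexity <- patients %>%
  group_by(clinician_id) %>%
  summarise(avg_acg = mean(acg_norm))          # >1 = more complex than average panel

panel_complexity
```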

      Setting

      Clinical data were collected from the EHR. A patient's data were included in the analysis if the patient was 18 years of age or older and had received an ECG for any indication between August 5, 2019, and March 31, 2020. Only the first ECG for each patient within the study period was considered for the decision to order an echocardiogram. Patients' data were excluded if they had a known EF of 50% or less or a history of heart failure before the ECG, or if they did not provide authorization for use of their data in research.
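      A minimal R sketch of these inclusion rules, using a hypothetical data frame and column names, is shown below; the actual EHR extraction used in the study would differ.

```r
# Minimal sketch (hypothetical data frame and column names) of the patient-level
# inclusion rules: adults with an ECG between Aug 5, 2019 and Mar 31, 2020, first
# ECG per patient, excluding known EF <=50%, prior heart failure, or patients
# without research authorization.
library(dplyr)

ecgs <- data.frame(
  patient_id    = c(1, 1, 2, 3),
  age           = c(67, 67, 54, 17),
  ecg_date      = as.Date(c("2019-09-01", "2020-01-15", "2019-12-02", "2019-10-10")),
  known_low_ef  = c(FALSE, FALSE, FALSE, FALSE),
  prior_hf      = c(FALSE, FALSE, TRUE, FALSE),
  research_auth = c(TRUE, TRUE, TRUE, TRUE)
)

eligible <- ecgs %>%
  filter(age >= 18,
         ecg_date >= as.Date("2019-08-05"), ecg_date <= as.Date("2020-03-31"),
         research_auth, !known_low_ef, !prior_hf) %>%
  arrange(patient_id, ecg_date) %>%
  group_by(patient_id) %>%
  slice(1) %>%          # only the first qualifying ECG per patient is considered
  ungroup()

eligible   # patient 1's first ECG only; patient 2 excluded (prior HF); patient 3 under 18
```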
      Demographics and subjective data (ie, comfort level with patients with low EF) on clinicians were collected using a survey that was administered at the time the clinician enrolled in the trial.

      Treatment

      The AI algorithm was embedded in the EHR to automatically generate a screening report every time an ECG was performed. Clinicians in the intervention group had access to the AI-ECG screening report, which displayed the AI-ECG screening as positive or negative. When the screening result was negative, the recommendation was “no further testing unless indicated by other symptoms or conditions.” When the screening result was positive, the recommendation was to “consider ordering an echocardiogram.” Electrocardiograms were considered positive if the probability of having low EF was higher than a threshold selected to have similar sensitivity and specificity for a cohort with a prevalence of 9.0% (Yao et al, 2020).
      In addition to the EHR report, the intervention group clinicians also received an email alert when their patients had a positive screening result.
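      The report logic described above amounts to thresholding a model probability and attaching the corresponding recommendation. The following R sketch illustrates that logic only; the function name and the 0.5 cutoff are hypothetical, whereas the trial's actual threshold was chosen to balance sensitivity and specificity at 9.0% prevalence.

```r
# Minimal sketch (hypothetical function name and threshold) of turning an AI-ECG
# probability of low EF into the positive/negative screening report described above.
screening_report <- function(p_low_ef, threshold = 0.5) {
  positive <- p_low_ef > threshold
  recommendation <- ifelse(
    positive,
    "Positive screen: consider ordering an echocardiogram.",
    "Negative screen: no further testing unless indicated by other symptoms or conditions."
  )
  data.frame(probability = p_low_ef, screen_positive = positive,
             recommendation = recommendation)
}

screening_report(c(0.12, 0.81))
```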

      Outcomes

      Adoption of AI was defined at the clinician level as the number of echocardiograms ordered for AI-positive ECGs divided by the total number of AI-positive ECGs. Each clinician was then categorized into either the high-adopter group (top quartile) or the low-adopter group (lower three quartiles). High adopters ordered echocardiograms for at least 64.3% (the third quartile of the distribution) of patients who had an AI-positive ECG. The primary outcomes were the effect of adoption on the detection of low EF and the clinician characteristics associated with adoption.
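      This definition lends itself to a simple per-clinician computation. The R sketch below, on invented data, illustrates the rate calculation and the quartile-based grouping; the 64.3% cutoff reported above is the empirical third quartile in the study data, not the value produced by this toy example.

```r
# Minimal sketch (hypothetical data) of the adoption definition: per clinician,
# echocardiograms ordered after AI-positive ECGs divided by AI-positive ECGs,
# with the top quartile labeled "high" and the lower three quartiles "low".
library(dplyr)

alerts <- data.frame(
  clinician_id = c("A", "A", "A", "B", "B", "C", "C", "C", "C"),
  echo_ordered = c(TRUE, TRUE, FALSE, FALSE, FALSE, TRUE, TRUE, TRUE, FALSE)
)

adoption <- alerts %>%
  group_by(clinician_id) %>%
  summarise(n_positive    = n(),
            adoption_rate = mean(echo_ordered))   # echoes ordered / AI-positive ECGs

q3 <- quantile(adoption$adoption_rate, 0.75)      # third quartile of the distribution
adoption <- adoption %>%
  mutate(adopter_group = if_else(adoption_rate >= q3, "high", "low"))

adoption
```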

      Statistical Analysis

      At the clinician level, baseline characteristics were compared between the high-adopter and low-adopter groups using unadjusted Student t tests for continuous variables and χ2 tests for categorical variables. Multivariable logistic regression was then used to assess the association of a subset of these variables with high-adopter status. We also conducted a sensitivity analysis using negative binomial regression with the same subset of variables and the AI adoption proportion to determine whether any associations emerged that were not apparent in the high- vs low-adopter comparison. Hypertension, diabetes, myocardial infarction, peripheral artery disease, and atrial fibrillation were identified using a validated natural language processing program to abstract clinical notes (Wen et al, 2019). Valvular heart disease and chronic kidney disease were identified using diagnosis codes.
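      The R sketch below, on simulated data with hypothetical variable names, illustrates the general shape of these clinician-level analyses. The negative binomial specification shown (counts of echocardiogram orders with the number of AI-positive ECGs as an exposure offset) is one common way to model an adoption proportion and may differ from the authors' exact model.

```r
# Sketch of the clinician-level analyses on simulated data (hypothetical names).
library(MASS)   # glm.nb() for the negative binomial sensitivity model
set.seed(1)

n <- 165
clinicians <- data.frame(
  position      = factor(sample(c("Physician", "NP/PA"), n, TRUE, prob = c(0.68, 0.32))),
  specialty     = factor(sample(c("Family medicine", "Internal medicine"), n, TRUE, prob = c(0.72, 0.28))),
  panel_acg     = rnorm(n, mean = 1.3, sd = 0.7),
  n_ai_positive = rpois(n, 5) + 1
)
clinicians$n_echo_ordered <- pmin(rnbinom(n, mu = 3, size = 1.5), clinicians$n_ai_positive)
rate <- clinicians$n_echo_ordered / clinicians$n_ai_positive
clinicians$high_adopter <- as.integer(rate >= quantile(rate, 0.75))

# Unadjusted comparisons: t test (continuous) and chi-square (categorical)
t.test(panel_acg ~ high_adopter, data = clinicians)
chisq.test(table(clinicians$position, clinicians$high_adopter))

# Multivariable logistic regression for high-adopter status
fit_logit <- glm(high_adopter ~ position + specialty + panel_acg,
                 family = binomial, data = clinicians)
summary(fit_logit)

# Sensitivity analysis: negative binomial model of echocardiogram orders,
# with AI-positive ECGs as the exposure (offset)
fit_nb <- glm.nb(n_echo_ordered ~ position + specialty + panel_acg +
                   offset(log(n_ai_positive)), data = clinicians)
summary(fit_nb)
```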
      At the patient level, the proportion of AI-positive patients with low EF was compared between patients of clinicians in the high-adopter group and patients of clinicians in the low-adopter group. Mixed-effects logistic regression, with the care team as a random effect, was used to compare the incidence of low EF between the high-adopter and low-adopter groups. Data management and analyses were performed using SAS 9.4 (SAS Institute Inc) and R 4.0.3 (R Foundation for Statistical Computing).
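      A minimal sketch of the patient-level model on simulated data is shown below, using lme4::glmer as one R implementation of a mixed-effects logistic regression with care team as a random intercept; the data and variable names are hypothetical stand-ins, not the study data.

```r
# Sketch: mixed-effects logistic regression of low-EF detection on adopter group,
# with care team as a random intercept (simulated data, hypothetical names).
library(lme4)
set.seed(2)

n <- 600
patients <- data.frame(
  care_team    = factor(sample(1:60, n, replace = TRUE)),
  high_adopter = rbinom(n, 1, 0.25)
)
patients$low_ef <- rbinom(n, 1, plogis(-1.6 + 0.9 * patients$high_adopter))

fit <- glmer(low_ef ~ high_adopter + (1 | care_team),
             family = binomial, data = patients)
summary(fit)
exp(fixef(fit)["high_adopter"])   # odds ratio for high vs low adopters
```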

      Results

      Clinician and Patient Characteristics

      Among the 165 clinicians randomized to the intervention arm, 112 (67.9%) were physicians and 53 (32.1%) were NP/PAs. Specialty representation was 118 (72.4%) family medicine and 45 (27.6%) internal medicine. Clinicians' mean (SD) age was 45.1 (10.3) years, 87 (55.8%) were female, and 135 (92.5%) were White. The mean (SD) length of time in practice was 13.3 (10.3) years (Table 1).
      Table 1. Baseline Characteristics of 165 Midwest Clinicians Notified of AI Electrocardiogram Results

      | Characteristic | Low adopter (n=124) | High adopter (in Q3) (n=41) | Total (N=165) | Unadjusted P value |
      | --- | --- | --- | --- | --- |
      | Mean age (SD), y | 45.1 (10.3) | 45.3 (10.2) | 45.1 (10.3) | .83 |
      |  Missing | 11 | 6 | 17 | |
      | Sex, n (%) | | | | .13 |
      |  Male | 57 (47.5) | 12 (33.3) | 69 (44.2) | |
      |  Female | 63 (52.5) | 24 (66.7) | 87 (55.8) | |
      |  Missing | 4 | 5 | 9 | |
      | Race, n (%) | | | | .68 |
      |  White | 103 (92.0) | 32 (94.1) | 135 (92.5) | |
      |  Non-White | 9 (8.0) | 2 (5.9) | 11 (7.5) | |
      |  Missing | 12 | 7 | 19 | |
      | Position, n (%) | | | | .008 |
      |  Physician | 91 (73.4) | 21 (51.2) | 112 (67.9) | |
      |  NP/PA | 33 (26.6) | 20 (48.8) | 53 (32.1) | |
      | Specialty, n (%) | | | | .10 |
      |  Family medicine | 85 (69.1) | 33 (82.5) | 118 (72.4) | |
      |  Internal medicine | 38 (30.9) | 7 (17.5) | 45 (27.6) | |
      |  Missing | 1 | 1 | 2 | |
      | Mean years in practice (SD) | 13.3 (10.6) | 13.3 (9.7) | 13.3 (10.3) | .87 |
      |  Missing | 16 | 8 | 24 | |
      | Mean years in current care team (SD) | 8.6 (8.7) | 9.1 (8.6) | 8.7 (8.7) | .79 |
      |  Missing | 22 | 8 | 30 | |
      | In the past 12 months, what percent of your time do you work in direct patient care? n (%) | | | | .59 |
      |  0-75 | 22 (18.2) | 8 (22.2) | 30 (19.1) | |
      |  76-100 | 99 (81.8) | 28 (77.8) | 127 (80.9) | |
      |  Missing | 3 | 5 | 8 | |
      | How comfortable are you managing patients with LV dysfunction? n (%) | | | | .06 |
      |  Somewhat comfortable or less | 44 (36.7) | 19 (54.3) | 63 (40.6) | |
      |  More than somewhat comfortable | 76 (63.3) | 16 (45.7) | 92 (59.4) | |
      |  Missing | 4 | 6 | 10 | |
      | How often do you consult cardiology for the management of LV dysfunction? n (%) | | | | .18 |
      |  Somewhat often or less | 91 (75.2) | 23 (63.9) | 114 (72.6) | |
      |  More than somewhat often | 30 (24.8) | 13 (36.1) | 43 (27.4) | |
      |  Missing | 3 | 5 | 8 | |
      | Mean number of patients in panel (SD) | 1009 (543) | 880 (467) | 976 (527) | .21 |
      |  Missing | 5 | 1 | 6 | |
      | Average ACG score of paneled patients (SD) | 1.4 (0.75) | 1.1 (0.54) | 1.3 (0.72) | .002 |
      |  Missing | 6 | 1 | 7 | |

      AI = artificial intelligence; NP = nurse practitioner; PA = physician assistant.
      A total of 11,573 adult patients had ECGs ordered by clinicians in the intervention arm. Patients had a mean (SD) age of 60.5 (17.6) years; 53.9% were women; 94.3% were White; 50.1% lived in rural areas; and 16.8% had a prior echocardiogram. Multiple chronic conditions were documented, with hypertension (56.1%) and diabetes (20.6%) being the most prevalent (Table 2).
      Table 2. Patient Characteristics

      | Characteristic | Treatment (N=11,573) |
      | --- | --- |
      | Mean age (SD), y | 60.5 (17.5) |
      |  18-64 | 6256 (54.1) |
      |  65-74 | 2764 (23.9) |
      |  ≥75 | 2553 (22.1) |
      | Female | 6080 (52.5) |
      | Race | |
      |  White | 10,926 (94.4) |
      |  Black or African American | 201 (1.7) |
      |  Asian | 145 (1.3) |
      |  Other | 254 (2.2) |
      |  Unknown/declined/missing | 47 (0.4) |
      | Rural | 6323 (54.6) |
      | Medical history | |
      |  Hypertension | 6491 (56.1) |
      |  Diabetes | 2347 (20.3) |
      |  Myocardial infarction | 770 (6.7) |
      |  Peripheral artery disease | 411 (3.6) |
      |  Stroke or transient ischemic attack | 409 (3.5) |
      |  Prior atrial fibrillation | 991 (8.6) |
      |  New atrial fibrillation on index ECG | 246 (2.1) |
      |  Valvular heart disease | 129 (1.1) |
      |  Chronic kidney disease | 1373 (11.9) |
      |  Prior echocardiogram | 1903 (16.4) |
      | Location of ECG ordered | |
      |  Outpatient clinic | 6043 (52.2) |
      |  Emergency room | 4411 (38.1) |
      |  Hospital | 1119 (9.7) |

      ECG = electrocardiogram. Values shown are n (%) unless otherwise stated.

      High Adopter vs Low Adopter Characteristics

      A greater proportion of NP/PAs (20 of 53 [37.7%]) than physicians (21 of 112 [18.8%]) were high adopters (P=.008). Clinicians in family medicine (33 of 118 [28.0%]) were more likely than those in internal medicine (7 of 45 [15.6%]) to be high adopters (P=.010). High adopters had a lower average patient ACG complexity score (P=.002). Although the difference was not statistically significant, high adopters reported feeling less comfortable caring for patients with low EF (P=.06). Variables that were significant in the univariate model (NP/PA vs physician, family medicine vs internal medicine, and patient panel complexity) were no longer significantly associated with adoption in a multivariable logistic regression analysis.

      Effectiveness of Detecting Low EF

      High adopters were twice as likely as low adopters to identify low EF in patients with an AI-positive ECG (Table 3). Clinicians in the highest quartile of AI adoption ordered echocardiograms after a positive AI result at a rate of 78.7% (n=100) compared with 43.0% (n=243) in the lower three quartiles. Of the 100 echocardiograms ordered by high adopters in response to a positive AI result, 43 (43.0%) showed an EF less than 50%. Low adopters ordered a total of 243 echocardiograms, of which 92 (37.9%) showed low EF. This translates to a diagnostic yield (the proportion of AI-positive ECGs with confirmed low EF) of 33.9% for high adopters and 16.3% for low adopters (P<.001). When analyzing AI-negative ECGs, we found no difference between high and low adopters in either the number of echocardiograms ordered or the number of patients identified with low EF (P=.46). To assess whether our findings were related to differences in patient complexity, we conducted a sensitivity analysis of the effect of patient complexity on diagnostic yield using a propensity-matched cohort and found results similar to the primary analysis.
      Table 3. Comparison of Diagnostic Yield Between Low and High Adopters

      | | No. of ECGs | Echocardiograms performed, n (%) | Echocardiograms with low EF, n (%) | Diagnostic yield, % | P |
      | --- | --- | --- | --- | --- | --- |
      | AI-positive ECGs | | | | | <.001 |
      |  Low adopter | 565 | 243 (43.0) | 92 (37.9) | 16.3 | |
      |  High adopter | 127 | 100 (78.7) | 43 (43.0) | 33.9 | |
      | AI-negative ECGs | | | | | .46 |
      |  Low adopter | 8788 | 1502 (17.1) | 85 (5.7) | 0.97 | |
      |  High adopter | 2093 | 377 (18.0) | 24 (6.4) | 1.15 | |

      AI = artificial intelligence; ECG = electrocardiogram; EF = ejection fraction. Diagnostic yield = proportion of ECGs diagnosed with low EF.
      In a multivariable logistic regression model, high adopters maintained a significantly greater likelihood of detecting low EF compared with low adopters (odds ratio, 1.62; 95% CI, 1.21 to 2.17).
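      The unadjusted yields reported above follow directly from the counts in Table 3; the short R sketch below simply reproduces that arithmetic.

```r
# Worked check of the diagnostic yields for AI-positive ECGs in Table 3:
# yield = patients with confirmed low EF / AI-positive ECGs.
low_adopter_yield  <- 92 / 565    # 0.163 -> 16.3%
high_adopter_yield <- 43 / 127    # 0.339 -> 33.9%
round(100 * c(low = low_adopter_yield, high = high_adopter_yield), 1)

# Echocardiogram order rates after a positive AI result
round(100 * c(low = 243 / 565, high = 100 / 127), 1)   # 43.0% vs 78.7%
```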

      Discussion

      Artificial intelligence is poised to transform medicine by enabling the diagnosis of occult and early disease. However, to affect human health, AI tools must be adopted by clinicians. AI tools developed by specialists can offer broad benefit to patients only if adoption by primary care clinicians actualizes their clinical utility. We found wide variation in the rate of adoption of AI recommendations. Importantly, clinicians who responded to the AI prompt and ordered an echocardiogram were significantly more likely to identify left ventricular dysfunction in their patients (33.9% vs 16.3%; P<.001). This is important given the large body of trial evidence and numerous professional society guidelines showing improved outcomes, diminished morbidity, and decreased mortality with early treatment of ventricular dysfunction (Yancy et al, 2017; Yancy et al, 2013).
      System and technology factors influence the adoption of EHRs and computer-based, AI-enabled decision support tools (Kilsdonk et al, 2011; Kortteisto et al, 2012; Makam et al, 2014).
      In general, computer literacy is an important predictor of the successful adoption of clinical decision support tools, and AI-enabled tools may require even greater technical proficiency (Sutton et al, 2020; Laka et al, 2021).
      We hypothesized that early adopters would be younger and have less complex patient panels. We also hypothesized that advanced practice providers (APPs), who generally have less complex panels, would be more likely than physicians to be high adopters. These hypotheses were based on the perception that younger clinicians would be more facile with computer-based technology and would have received greater exposure to it during training. Several studies have suggested younger age as a characteristic of early adoption of digital health tools (Decker et al, 2012; Wylie et al, 2014). One study assessing characteristics of early and late adopters of personal digital assistants in clinical use found early adopters to be younger, to have less clinical experience, and to be more facile with technology (Vishwanath et al, 2009). There is some evidence that primary care physicians are more likely to adopt EHR technology (Decker et al, 2012; DesRoches et al, 2008). We found little evidence suggesting that APPs, or clinicians with less complex patients, show higher adoption rates of decision support tools.
      Our unadjusted analysis showed that high adopters were more likely to be APPs than physicians and had lower average patient panel complexity. We also observed a trend toward family physicians and those with a lower comfort level in managing patients with low EF being more likely to be high adopters. However, in the adjusted analysis, we found no evidence that APPs or clinicians with less complex patient panels had a higher adoption rate of decision support tools. Perhaps with a larger sample size, these characteristics would have maintained their significance in the adjusted analysis. We did not find significant differences in several other important variables, including age, sex, number of years in practice, or the number of patients cared for in a panel.
      It is equally interesting to note the factors that were not significantly associated with AI adoption. Age was not associated with the high-adopter group, suggesting that age may have less of an impact on a clinician's willingness to trust a digital diagnostic decision aid. Perhaps with increased experience with EHRs, age is no longer an important factor in determining acceptance of new technology. If this finding is accurate, it reveals a gap between popular perception and actual practice. These results suggest that tailoring intervention training to clinician age would be an unnecessary use of time and resources.
      In patients with an AI-flagged ECG, high adopters were twice as likely as low adopters to detect low EF (33.9% vs 16.3%). When analyzing outcomes for patients with negative AI-ECGs, we did not find a difference in the echocardiogram order rate between the two groups, suggesting that high adopters were not simply ordering more echocardiograms indiscriminately. In fact, despite doubling the diagnostic yield, the high-adopter group ordered echocardiograms for only about 35 percentage points more of the patients flagged by the AI-ECG (78.7% vs 43.0%). This is remarkable given that high adopters tended to be APPs and to have less complicated patient panels, suggesting that their patients may have been less likely to have undiagnosed low EF.
      Because patients with asymptomatic low EF have a higher incidence of chronic medical problems, such as atherosclerotic heart disease, hypertension, atrial fibrillation, rheumatoid arthritis, and diabetes, we expected the high-adopter group to have fewer patients with low EF (Ponikowski et al, 2016; Kapelios et al, 2021; Cioffi et al, 2017). However, we found that adjusting for patient complexity had no effect on the diagnostic yield of the AI algorithm. This finding would indicate that, when trusted, the algorithm improves accuracy and reduces variability in detecting patients with low EF. If the low-adopter group had used it to a greater degree, more patients with low EF would likely have been identified.
      This study highlights the power of collaboration between a specialty practice (cardiology) and primary care. Given the highly technical nature of AI in health care, such tools are often initiated and developed in academic specialty practices. Applied only within those practices, the tools reach relatively small patient populations with high disease prevalence. In the case of this study, primary care has a far greater opportunity to mitigate the devastating effects of left ventricular dysfunction when the tool is applied to its patient populations. To maximize AI's benefits in health care, more collaboration is needed between specialty practices and primary care.

      Study Limitations

      We included 165 primary care clinicians within an integrated, multispecialty health system across three Midwest states, and the findings might not be generalizable to clinicians in other health systems or specialties. Other health care organizations may differ in their patient populations, distribution of cardiac disease prevalence, race and ethnicity, socioeconomic status, and rural vs urban distribution. Clinician characteristics may also differ at other institutions, including the ratio of APPs to physicians, scope of practice, care team structure, and specialty availability. However, this is one of the largest studies to investigate AI implementation in primary care, and it provides important insights into the emerging field of AI-enabled clinical decision support. In addition, we reported clinician behavior (ie, whether an echocardiogram was ordered and performed) and the outcome (eg, diagnosis of low EF), but there are many scenarios in which not ordering an echocardiogram is appropriate (eg, patient preference, cost, or other clinical justification). We also do not fully understand the rationale behind clinicians' decisions not to order an echocardiogram after a positive AI-ECG; it is certainly plausible that other factors appropriately outweighed the echocardiogram recommendation. Lastly, our definition of adoption was based on echocardiogram order and completion rates. Other researchers have used validated scales to more precisely characterize clinicians who are more or less likely to use digital tools (Hatz et al, 2017).

      Conclusion

      Primary care clinicians who were high adopters of an AI-enabled clinical decision support tool were twice as likely as low adopters to diagnose low EF. Clinicians most likely to follow the recommendations of the AI decision aid tended to be less experienced in managing complex patients. These findings underscore the importance of clinician education and engagement and of AI systems that integrate seamlessly into the workflows of busy caregivers.

      Potential Competing Interests

      Mayo Clinic holds a patent on this technology and may receive financial benefits from it. Drs Friedman, Lopez-Jimenez, and Attia may also receive financial benefits from this agreement. However, at no point will Mayo Clinic benefit financially from its use for the care of patients at Mayo Clinic. The remaining authors report no potential competing interests.

      Acknowledgments

      The authors thank all of the study participants; without their participation, this practice improvement project would not have been possible. In accordance with the Declaration of Helsinki, this study was reviewed and approved (ID 19-003137) by the Mayo Clinic Institutional Review Board (IRB). IRB-approved written informed consent was obtained from all study participants before participation. All authors assert that all procedures contributing to this work comply with the ethical standards of Mayo Clinic.

      References

        Shortliffe EH, Sepúlveda MJ. Clinical decision support in the era of artificial intelligence. JAMA. 2018;320:2199-2200.
        Komorowski M, Celi LA, Badawi O, Gordon AC, Faisal AA. The artificial intelligence clinician learns optimal treatment strategies for sepsis in intensive care. Nat Med. 2018;24:1716-1720.
        Bizzo BC, Almeida RR, Michalski MH, Alkasab TK. Artificial intelligence and clinical decision support for radiologists and referring providers. J Am Coll Radiol. 2019;16:1351-1356.
        Chang AC. Primary prevention of sudden cardiac death of the young athlete: the controversy about the screening electrocardiogram and its innovative artificial intelligence solution. Pediatr Cardiol. 2012;33:428-433.
        Dilsizian SE, Siegel EL. Artificial intelligence in medicine and cardiac imaging: harnessing big data and advanced computing to provide personalized medical diagnosis and treatment. Curr Cardiol Rep. 2014;16:441.
        Lin SY, Mahoney MR, Sinsky CA. Ten ways artificial intelligence will transform primary care. J Gen Intern Med. 2019;34:1626-1630.
        Kueper JK, Terry AL, Zwarenstein M, Lizotte DJ. Artificial intelligence and primary care research: a scoping review. Ann Fam Med. 2020;18:250-258.
        Gottliebsen K, Petersson G. Limited evidence of benefits of patient operated intelligent primary care triage tools: findings of a literature review. BMJ Health Care Inform. 2020;27:e10114.
        Asan O, Bayrak AE, Choudhury A. Artificial intelligence and human trust in healthcare: focus on clinicians. J Med Internet Res. 2020;22:e15154.
        Dankwa-Mullan I, Rivo M, Sepulveda M, Park Y, Snowdon J, Rhee K. Transforming diabetes care through artificial intelligence: the future is here. Popul Health Manag. 2019;22:229-242.
        Halamka J, Cerrato P. An FP's guide to AI-enabled clinical decision support. J Fam Pract. 2019;68:486,488,490,492.
        He J, Baxter SL, Xu J, Xu J, Zhou X, Zhang K. The practical implementation of artificial intelligence technologies in medicine. Nat Med. 2019;25:30-36.
        Yao X, Rushlow DR, Inselman JW, et al. Artificial intelligence-enabled electrocardiograms for identification of patients with low ejection fraction: a pragmatic, randomized clinical trial. Nat Med. 2021;27:815-819.
        Yao X, McCoy RG, Friedman PA, et al. ECG AI-Guided Screening for Low Ejection Fraction (EAGLE): rationale and design of a pragmatic cluster randomized trial. Am Heart J. 2020;219:31-36.
        Bowling CB, Sanders PW, Allman RM, et al. Effects of enalapril in systolic heart failure patients with and without chronic kidney disease: insights from the SOLVD Treatment trial. Int J Cardiol. 2013;167:151-156.
        Ponikowski P, Voors AA, Anker SD, et al. 2016 ESC guidelines for the diagnosis and treatment of acute and chronic heart failure: the task force for the diagnosis and treatment of acute and chronic heart failure of the European Society of Cardiology (ESC). Developed with the special contribution of the Heart Failure Association (HFA) of the ESC. Eur Heart J. 2016;37:2129-2200.
        Wang TJ, Levy D, Benjamin EJ, Vasan RS. The epidemiology of “asymptomatic” left ventricular systolic dysfunction: implications for screening. Ann Intern Med. 2003;138:907-916.
        Wen A, Fu S, Moon S, et al. Desiderata for delivering NLP to accelerate healthcare AI advancement and a Mayo Clinic NLP-as-a-service implementation. NPJ Digit Med. 2019;2:130.
        R Core Team. R: A Language and Environment for Statistical Computing. Vienna, Austria: R Foundation for Statistical Computing; 2020.
        Yancy CW, Jessup M, Bozkurt B, et al. 2017 ACC/AHA/HFSA focused update of the 2013 ACCF/AHA guideline for the management of heart failure: a report of the American College of Cardiology/American Heart Association Task Force on Clinical Practice Guidelines and the Heart Failure Society of America. Circulation. 2017;136:e137-e161.
        Yancy CW, Jessup M, Bozkurt B, et al. 2013 ACCF/AHA guideline for the management of heart failure: a report of the American College of Cardiology Foundation/American Heart Association Task Force on practice guidelines. Circulation. 2013;128:e240-e327.
        Kilsdonk E, Peute LW, Knijnenburg SL, Jaspers MW. Factors known to influence acceptance of clinical decision support systems. Stud Health Technol Inform. 2011;169:150-154.
        Kortteisto T, Komulainen J, Mäkelä M, Kunnamo I, Kaila M. Clinical decision support must be useful, functional is not enough: a qualitative study of computer-based clinical decision support in primary care. BMC Health Serv Res. 2012;12:349.
        Makam AN, Lanham HJ, Batchelor K, et al. The good, the bad and the early adopters: providers' attitudes about a common, commercial EHR. J Eval Clin Pract. 2014;20:36-42.
        Sutton RT, Pincock D, Baumgart DC, Sadowski DC, Fedorak RN, Kroeker KI. An overview of clinical decision support systems: benefits, risks, and strategies for success. NPJ Digit Med. 2020;3:17.
        Laka M, Milazzo A, Merlin T. Factors that impact the adoption of clinical decision support systems (CDSS) for antibiotic management. Int J Environ Res Public Health. 2021;18:1901.
        Decker SL, Jamoom EW, Sisk JE. Physicians in nonprimary care and small practices and those age 55 and older lag in adopting electronic health record systems. Health Aff (Millwood). 2012;31:1108-1114.
        Wylie MC, Baier RR, Gardner RL. Perceptions of electronic health record implementation: a statewide survey of physicians in Rhode Island. Am J Med. 2014;127:1010.e1021-e1017.
        Vishwanath A, Brodsky L, Shaha S, Leonard M, Cimino M. Patterns and changes in prescriber attitudes toward PDA prescription-assistive technology. Int J Med Inform. 2009;78:330-339.
        DesRoches CM, Campbell EG, Rao SR, et al. Electronic health records in ambulatory care — a national survey of physicians. N Engl J Med. 2008;359:50-60.
        Kapelios CJ, Bonou M, Barmpagianni A, et al. Early left ventricular systolic dysfunction in asymptomatic patients with type 1 diabetes: a single-center, pilot study. J Diabetes Complications. 2021;35:107913.
        Cioffi G, Viapiana O, Ognibeni F, et al. Prognostic role of subclinical left ventricular systolic dysfunction evaluated by speckle-tracking echocardiography in rheumatoid arthritis. J Am Soc Echocardiogr. 2017;30:602-611.
        Hatz MH, Sonnenschein T, Blankart CR. The PMA scale: a measure of physicians' motivation to adopt medical devices. Value Health. 2017;20:533-541.