Abstract
Objectives: To report quantitative and qualitative analyses of feature, functionality, organizational, training, clinical specialty, and other factors that impact electronic health record (EHR) experience, based on a survey conducted by two large healthcare systems.
Materials and Methods: A total of 816 clinicians—352 (43 percent) physicians, 96 (12 percent) residents/fellows, 177 (22 percent) nurses, 96 (12 percent) advanced practice providers, and 95 (12 percent) allied health professionals—completed surveys on different EHRs. Responses were analyzed for quantitative and qualitative factors. The measured outcome was calculated as a net EHR experience.
Results: Net EHR experience represents overall satisfaction that clinicians report with the EHR and its usability. EHR experience for Virginia Commonwealth University Medical Center and University of Chicago Medicine was low. There were noticeable differences in physician and nursing experiences with EHRs at both universities. EHR personalization, years of practice, impact on efficiency, quality of care, and satisfaction with EHR training contributed significantly to the net EHR experience. Satisfaction of certain specialty practitioners such as endocrinology, family medicine, infectious disease, nephrology, neurology, and pulmonology was noted to be especially low. Ability to use a split-screen function to view labs, follow-up training from other providers rather than vendors, reduced documentation time burden, fewer click boxes, more customizable order sets, improved messaging, e-prescribing, and improved integration were the most common desired EHR improvements requested on qualitative analysis.
Discussion: EHR experience was low regardless of the system and may be improved by better EHR training, increased utilization of personalization tools, reduced documentation burden, and enhanced EHR design and functionality. There was a difference between provider and nursing experiences with the EHR.
Conclusion: Designing better EHR training, increasing utilization of personalization tools, enhancing functionality, and decreasing documentation burden may lead to a better EHR experience.
Introduction
Despite early optimism regarding the potential benefits of electronic health records (EHRs), reported outcomes with ubiquitous use are mixed, and EHRs are believed to contribute greatly to physician burnout.1 Clinicians have expressed dissatisfaction with EHRs due to poor usability, time-consuming data entry, interference with face-to-face patient time, inefficiency, lack of interoperability, and degradation of documentation.2 Moreover, studies indicate that EHRs create clerical and cognitive burden, as well as interruptions and distractions that negate any benefits.3 EHRs place much greater demands on a clinician’s time compared to paper-based charts.4
Most clinicians see value in EHRs, but they want substantial improvements. Experiences with EHRs can affect career plans as well as satisfaction. A 2017 study by Sinsky et al. reported approximately 20 percent of physicians surveyed planned to reduce clinical hours in the next 12 months, and 26 percent planned to leave practice in the next two years. Dissatisfaction with EHRs was an independent risk factor for those plans.5 In another study by Robertson et al., survey data indicated that 53 percent of primary care clinicians reported they were “dissatisfied or very dissatisfied” with their work-life balance and that EHRs had a negative impact on work-life balance.6
Many healthcare organizations are evaluating current EHR practices to identify factors that impact and improve EHR experience. Although much has been published regarding poor experiences with EHRs and their impact on clinicians, little is known about the functionality, organizational, training, and other factors that shape EHR experience. This is the first article based on the KLAS Arch Collaborative survey to report qualitative and quantitative factors that influence the clinician’s experience with the EHR, as well as to compare experiences of physicians, nursing, and clinical subspecialties from two large academic medical centers.7 These factors identify areas of concern and focus for future EHR usability enhancements and implementation strategies that could lead to a better EHR experience and possibly decreased clinician burnout.
Methods
Settings
Virginia Commonwealth University Health System (VCU) is a large, urban academic medical center located in Richmond, Virginia. VCU staff includes more than 830 physicians, 750 residents and fellows, and 400 advanced practice providers representing 200 specialties. VCU has used the Cerner EHR in outpatient and inpatient settings since 2006.8
University of Chicago Medicine (UCM) clinical facilities include the University of Chicago Medical Center, Ingalls Memorial (a community-based hospital and outpatient facility in Harvey, Illinois), and dozens of outpatient clinics around the Chicago area. Physician staff includes 848 attending physicians and 1,132 residents and fellows. Epic EHR modules have been in use at the University of Chicago since 1995, and ambulatory and inpatient documentation were fully adopted in 2011. Ingalls Health System completed a merger with the University of Chicago Medicine in October 2016 and used Cerner Soarian Inpatient Clinicals. Medical staff members at Ingalls use an array of ambulatory EHR solutions.9
Survey Instrument
The EHR Experience Survey is web-based and offered free for KLAS Arch Collaborative members. A PDF report summary and spreadsheet were sent to the participating organizations shortly after survey completion.
The survey is organized into the following themes: 1) General Background; 2) Training; 3) EHR Personalization; 4) Satisfaction with EHR Features; 5) Satisfaction with Organization; and 6) Free Text Comments. The surveys used by VCU and UCM were not identical and were administered six months apart. At VCU, the survey results were frequently reported for “all clinicians” and “physicians” only; at UCM, reports were divided into “all clinicians,” “physicians and advanced practitioners,” “nurses,” and “allied health professionals.” Details of the survey are displayed in Table 1.
Subject Recruitment
At University of Chicago Medicine and Virginia Commonwealth University, emails linked to the KLAS survey were sent on two occasions two weeks apart to all attending physicians, non-physician care providers, nursing staff members, and other non-provider clinical staff members. The survey was open for VCU from September 2017 to December 2017. Initial and subsequent survey messages were sent to UCM in April and May of 2018.
Quantitative Analysis
Most survey question responses used a Likert scale (“very dissatisfied,” “dissatisfied,” “satisfied,” and “very satisfied”) and were reported as dummy codes 1-4, with 1 being equivalent to “very dissatisfied” and 4 to “very satisfied.” Several questions also included “indifferent” among the answer choices, leading to a 1-5 scale. The outcome of interest was the “net EHR experience” score, calculated by subtracting the percentage of negative responses from the percentage of positive responses, yielding a score ranging from negative 100 to positive 100.
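The net score calculation described above can be sketched as follows. This is an illustrative reconstruction, not the study’s actual analysis code; the mapping of codes 1-2 to “negative” and 3-4 to “positive,” and the exclusion of “indifferent” responses, are assumptions made for the example.

```python
def net_experience_score(responses):
    """Compute a net experience score from Likert codes.

    responses: iterable of codes (1 = very dissatisfied .. 4 = very satisfied).
    Codes 1-2 are treated as negative and 3-4 as positive (an assumption);
    any other values (e.g., an "indifferent" option) are excluded.
    Returns the percentage of positive minus percentage of negative
    responses, ranging from -100 to +100.
    """
    scored = [r for r in responses if r in (1, 2, 3, 4)]
    if not scored:
        return 0.0
    positive = sum(1 for r in scored if r >= 3)
    negative = sum(1 for r in scored if r <= 2)
    return 100.0 * (positive - negative) / len(scored)


# Hypothetical responses: 5 positive, 3 negative out of 8
print(net_experience_score([4, 4, 3, 2, 1, 3, 2, 4]))  # 25.0
```

A uniformly dissatisfied group would score negative 100, a uniformly satisfied group positive 100, and an evenly split group zero.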
Comparisons for the continuous variables among groups were conducted using the Wilcoxon rank-sum test, while the chi-square test was used for comparing proportions. Jamovi (Version 1.6; The jamovi project, 2021) and R 3.6.0 (R Foundation for Statistical Computing, Vienna, Austria) were used to analyze the quantitative results of the survey.
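As an illustration of the chi-square comparison of proportions, the sketch below computes the chi-square statistic for a 2x2 table by hand. The counts are hypothetical (not from the study), and in practice a statistics package such as those named above would be used.

```python
def chi_square_2x2(table):
    """Chi-square statistic (1 df, no continuity correction) for a 2x2 table.

    table = [[a, b], [c, d]], rows = groups (e.g., physicians vs. nurses),
    columns = outcomes (e.g., satisfied vs. not satisfied).
    """
    (a, b), (c, d) = table
    n = a + b + c + d
    row_totals = (a + b, c + d)
    col_totals = (a + c, b + d)
    observed = [[a, b], [c, d]]
    # Expected counts under the independence hypothesis
    expected = [[row_totals[i] * col_totals[j] / n for j in (0, 1)]
                for i in (0, 1)]
    return sum((observed[i][j] - expected[i][j]) ** 2 / expected[i][j]
               for i in (0, 1) for j in (0, 1))


# Hypothetical counts: 83/100 satisfied in one group, 68/100 in the other
stat = chi_square_2x2([[83, 17], [68, 32]])
print(round(stat, 2))  # 6.08 for these made-up counts
```

A statistic this large exceeds the 1-df critical value of 3.84, so a difference of this size in groups of 100 would be significant at p < 0.05.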
Qualitative Analysis
Two comment sections were included in each survey and asked the following questions: “What was the most valuable EHR feature?” and “What are the desired improvements?” These qualitative data were analyzed using a modified template approach.10,11 A 30 percent sample of the comments for each of the two open-ended questions was reviewed to identify common, recurrent themes, which served as an initial codebook. Coding was then conducted by a team of two informatics fellows, a computer science undergraduate student, and an attending physician, who initially coded responses independently. To ensure inter-rater reliability, coders then met in pairs to review their respective codes, discuss to consensus, and modify the initial codebook by adding any newly identified recurrent themes. To serve as an index of relative salience among survey respondents, frequencies of individual coded occurrences within each theme were computed using the COUNTIF function in Microsoft Excel. Themes with relatively high coding frequencies were then independently reviewed for sub-themes by a practicing physician (DL) and a social scientist (MQ), who subsequently met to discuss to consensus.
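The frequency-counting step above (COUNTIF over coded comments) is a simple tally; the minimal sketch below expresses the same operation with Python’s standard library. The theme labels are examples echoing the paper’s codebook, and the coded comments are invented for illustration.

```python
from collections import Counter

# Hypothetical list of theme codes assigned to free-text comments
coded_comments = [
    "efficiency", "communication", "efficiency", "e-prescribing",
    "training and support", "efficiency", "vendor responsiveness",
]

# Tally occurrences per theme, equivalent to one COUNTIF per theme label
theme_counts = Counter(coded_comments)

# Report themes in descending order of salience
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```

Here “efficiency” would surface as the most salient theme (3 occurrences), mirroring how high-frequency themes were flagged for sub-theme review.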
Results
Clinician Characteristics
Table 2 lists clinician characteristics. Continuous data were rounded to the nearest whole number. Overall, the clinician characteristics of both institutions were similar, with the exception that UCM had a higher percentage of medical specialties compared to VCU and a higher percentage of EHR users from the ambulatory and inpatient locations.
VCU
The survey was sent to 2,120 clinicians using a health system listserv. Of the 429 clinicians who began the survey, 361 completed it, for an overall survey return rate of 17 percent. Clinical background included 150 physicians, 59 residents/fellows, 84 nursing staff members, 59 PA/NPs, and nine allied health clinicians (Table 2). Eleven percent of physicians had been practicing between zero and four years, 43 percent between five and 14 years, and 30 percent more than 25 years, representing an experienced group of physicians across 31 specialties.
UCM
The survey was sent to 4,800 clinicians. Of the 525 clinicians who began the survey, 455 (87 percent) completed it, for an overall survey return rate of 9.5 percent. Ten percent of physicians had been practicing between zero and four years, 32 percent between five and 14 years, and 31 percent more than 25 years. Approximately half (48 percent) of physicians reported using the EHR in both inpatient and ambulatory settings; 74 (40 percent) mainly used the EHR in ambulatory settings, and 23 (12 percent) in inpatient settings. A majority (75 percent) of physicians had practiced on an EHR for more than five years. Two hundred three (80 percent) used Epic and 22 (11 percent) used Cerner. One hundred thirty-four (63 percent) cared for adults, 43 (20 percent) for both adults and children, and 36 (17 percent) for pediatric patients.
Physician and Nursing Staff Satisfaction with EHR Features
VCU physicians consistently reported poorer satisfaction with most EHR features compared to their nursing peers (Table 3). The difference in satisfaction at VCU reached statistical significance for ratings on ease of learning, analytics, and impact on efficiency, suggesting the EHR’s functional and design challenges impacted physician workflow.
At UCM, there was not a statistically significant difference between physicians and nurses in net experience score. However, the groups differed significantly in satisfaction with reliability (83 percent for physicians versus 68 percent for nurses), indicating nursing staff perceived the EHR to be less reliable than physicians did. UCM physicians’ satisfaction ratings on analytics and efficiency also differed significantly from those of their nursing peers. Across both universities, only 47 percent of physicians were satisfied with EHR-enabled quality of care, with lower satisfaction still for needed functionality (41 percent), expected external integration (21 percent), efficiency (19 percent), and analytics and reporting (16 percent) capabilities. Satisfaction with vendor EHR design quality (29 percent) and EHR implementation and support (37 percent) was also low.
Thirty-three percent of clinicians at VCU and 47 percent of clinicians at UCM were satisfied with the quality of EHR design. Subgroup analysis of vendor satisfaction showed that only 25 percent of physicians at VCU were satisfied compared to 42 percent at UCM. Thirty-three percent of UCM physicians were satisfied with their EHR implementation and support; at VCU, 39 percent of physicians were satisfied. Satisfaction with the personal endeavor to learn the EHR was consistently higher for nursing staff than for physicians at both institutions, and the difference was statistically significant.
Satisfaction with initial training was significantly lower for physicians compared to the nursing staff at both institutions.
Physicians at VCU reported completing 40 percent of charts during clinic hours, compared to 90 percent for their nursing peers. At UCM, only 40 percent of physicians were satisfied with patient safety and the EHR, and 32 percent were satisfied with the patient-centeredness of the EHR.
Satisfaction of physicians and practitioners at VCU in cardiology, endocrinology, family medicine, obstetrics and gynecology, infectious disease, neurology, pulmonology, and otolaryngology specialty areas was noted to be below 50 percent, whereas for UCM, more than 50 percent of specialty practitioners in emergency medicine, endocrinology, family medicine, infectious disease, neurology, nephrology, and orthopedic surgery reported being dissatisfied (Table 4).
EHR Personalization
The utilization of EHR personalization tools (e.g., order sets) was a significant predictor of EHR experience. At VCU, nine personalization questions were presented to all clinicians irrespective of their clinical background. At UCM, questions were presented only to staff physicians, NP/PAs, and residents/fellows. Personalization responses from non-providers at VCU were removed for this analysis. In addition, based on the number of personalization tools being utilized, responses were further categorized as: 1) very low/no personalization for providers using fewer than two tools; 2) low personalization for providers using two to three tools; 3) moderate personalization for providers using three to four tools; and 4) high personalization for providers using more than four tools.
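The four-tier categorization can be expressed as a simple binning function. Because the stated cutoffs overlap at the boundaries (a count of three could fall in either the low or moderate tier as written), the sketch below encodes one plausible non-overlapping reading, which is an assumption rather than the study’s exact rule.

```python
def personalization_tier(tools_used: int) -> str:
    """Map a count of personalization tools to a tier.

    Assumed non-overlapping reading of the study's cutoffs:
    0-1 tools -> very low/no, 2 -> low, 3-4 -> moderate, 5+ -> high.
    """
    if tools_used < 2:
        return "very low/no personalization"
    if tools_used == 2:
        return "low personalization"
    if tools_used <= 4:
        return "moderate personalization"
    return "high personalization"


for n in range(6):
    print(n, personalization_tier(n))
```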
Forty-two percent of providers at VCU and 30 percent of providers at UCM reported very low/no personalization. The utilization of the data input personalization tool was higher than data review and navigation personalization tools. Clinical templates were the most frequently used data input personalization tool both at VCU and UCM. Fifty-eight percent of providers at VCU who utilized clinical templates found them very useful or useful; only 16 percent of the providers did not use clinical templates. At UCM, 62 percent of providers found templates useful or very useful; 14 percent of providers did not use them.
Interestingly, order sets, a tool with the potential to decrease clinical variation, were not utilized by 47 percent of providers at both institutions. Only 32 percent of providers at VCU and 30 percent at UCM found them useful. More than 50 percent of providers at VCU did not use report views, shortcuts, filters, sorting orders, or layouts, while at UCM, utilization of report views, shortcuts, sort orders, and layouts was in the range of 39-59 percent.
Net EHR Experience
The overall net EHR experience score for VCU was 6.2, or slightly positive, while the net EHR score for UCM was higher at 19.7. The net EHR experience score for VCU physicians was negative 6.2, significantly worse than the VCU nursing staff score of 14.92.
The UCM net EHR experience scores for physicians and nursing, at 10.36 and 10.19, respectively, were lower than the average UCM score.
Qualitative Results
Most Valuable EHR Features
Five common themes characterized respondents’ comments regarding valuable EHR features: 1) communication, 2) e-prescribing, 3) training and support, 4) vendor responsiveness, and 5) efficiency. The communication functions of “electronic consults,” “updates of patient information,” and “discharge instructions” were cited as valuable features. “Ability to e-prescribe Schedule II drugs” and “having access to medication fill history” were reported as valuable e-prescribing features, while having “weekly updates” and “relevant tutorials” were reported as valuable vendor responsiveness functions. Within the efficiency theme, 10 sub-themes were identified (Table 5).
Desired Improvements
Across both UCM and VCU samples, six common themes characterized respondents’ comments regarding desired EHR improvements: 1) ability to view labs during a patient encounter, 2) follow-up training and support, 3) dictation function, 4) alarm fatigue/validity, 5) improved communication function, and 6) improved functionality. Ability to use a split-screen function to view labs during a patient encounter and highlighted abnormal lab findings were reported as desired lab improvements. Follow-up training from other providers, rather than from vendors who do not understand the workflow, was also cited as a needed improvement. The frequency of unnecessary “pop-up” reminders and notes was reported as distracting and inefficient, as was the need to cut and paste from earlier notes.
Within the “improved functionality” theme, a large number of comments were noted in both the UCM and VCU samples (144 and 163, respectively). Subsequent review identified 15 sub-themes. The most frequently cited sub-themes of desired functionality improvements were reduced documentation time burden, fewer click boxes, more customizable order sets, improved messaging and e-prescribing, and improved internal integration (Table 5 and Table 6).
Discussion
EHR experience scores were low at both institutions and worse among physicians than nurses. Physicians at both universities had lower satisfaction scores for most domains of EHR features, with significantly lower scores for ease of learning, analytics and reporting, and efficiency compared to nursing peers, suggesting EHR usability may be disproportionately worse for physicians than for other clinician groups. Moreover, only 47 percent of physicians at both universities were satisfied with EHR-enabled quality of care, needed functionality (41 percent), and expected external integration (21 percent). Satisfaction of certain specialty practitioners, such as those in endocrinology, family medicine, infectious disease, nephrology, neurology, and pulmonology, was low, suggesting the need for better EHR design and training to meet the demands of specialty areas. Previous studies from Emani et al. and Shanafelt et al. have reported similar physician concerns and challenges with provider use of an EHR system.12,13
In our survey, a minority of physicians (25 percent at VCU versus 42 percent at UCM) reported satisfaction with EHR design quality, a finding consistent with a “not acceptable” EHR usability ranking. Usability challenges are related to several factors, including vendor non-adherence to EHR usability standards as well as customization choices made by both EHR vendors and healthcare providers during EHR implementation. According to two recent studies, the average System Usability Scale score for EHR use was 46 among physicians and 58 among nurses, both failing grades.14,15 The difference between physician and nurse usability and satisfaction scores might be partially explained by VCU data, where nurses had a much higher chart completion rate during working hours. In addition, our survey noted low satisfaction with EHR implementation and support at both institutions (34 percent at UCM versus 39 percent at VCU). Low satisfaction with EHR usability, implementation, and support underscores the need for research and investment in the discovery, dissemination, and application of scientifically proven best practices for EHR implementation and governance.16
Utilization of EHR personalization tools is a major predictor of EHR experience. Data input personalization tools like clinical templates, order sets, macros, and order lists improve EHR efficiency by streamlining, organizing, and reducing the effort needed to input data into the EHR. Forty-two percent of providers at VCU and 30 percent of providers at UCM reported very low/no personalization. More than 50 percent of providers at VCU did not use report views, shortcuts, filters, sorting orders, or layouts, while at UCM, utilization of report views, shortcuts, sort orders, and layouts was in the range of 39-59 percent, suggesting the need for a significant investment in training and EHR personalization. Improved personalization could lead to better EHR efficiency, better physician agreement that the EHR enables quality care, more provider trust that the EHR vendor has built a quality tool, and higher overall EHR satisfaction.
A significant predictor of net EHR experience score at VCU was initial training. Deliberate and comprehensive end-user training is essential for implementation, actualization, and end-user satisfaction. Given the variety of roles and specialized workflows performed by medical staff, physicians comprise a unique group of end users for whom distinct recommendations are essential. However, there are few guidelines in the literature addressing the development and implementation of an EHR training program for physicians. At UCM, the net EHR experience score was highly correlated with several areas, including initial training, clinical practice, specialty, follow-up training, and personalization. Because two-thirds of UCM respondents used Epic, it is possible that differences in the UCM and VCU survey responses may relate both to vendor differences and to rollout, design, and training philosophy differences. While vendor differences may affect the extent of personalization possible, the correlation between higher net experience scores and personalization at UCM suggests that enabling personalization is a tactic that may improve user experience. Our findings regarding the importance of training and personalization were mirrored in a 2019 study by several Arch Collaborative members but add to the literature by providing differences between physician and nursing experiences and qualitative data analysis.17
Additional significant predictors of net EHR experience include satisfaction with EHR impact on clinical efficiency, quality of care, ease of learning, available analytics capabilities, and internal and external integration, highlighting areas of focus for future EHR enhancements.
The qualitative analysis revealed several additional themes. In the most valuable improvements comment section, the greatest number of responses reflected efficiency gains. This was true at both VCU and UCM and suggests system changes that result in improved efficiency gains are those deemed most valuable by users. Other categories included improvements in communication, training, prescribing, and vendor responsiveness. In the desired improvements comment section, responses related to functionality greatly exceeded other categories. Further, when sub-themes for functionality were explored, reducing the documentation time burden was the largest category at both VCU and UCM.
Several limitations of this study should be noted. In addition to a low survey response rate, the majority of responses from UCM reflected the use of a single EHR (Epic); however, approximately 30 percent of responses were from users at Ingalls Memorial who used Cerner Soarian and a variety of other ambulatory EHR systems. The inter-institutional comparison reported here did not account for this intra-institutional variation. In addition, the surveys administered at the two academic medical centers were not identical, making exact comparisons difficult. The survey was not validated and did not utilize an instrument to evaluate EHR usability or burnout. Finally, because the workflows of physicians, nursing, and allied health professionals differ considerably, the survey was not specific enough to capture all aspects of usability for every functional group.
Conclusion
Despite near-universal adoption of outpatient and inpatient electronic health records, substantial EHR experience challenges persist. In this paper, we report quantitative and qualitative analyses of the feature, functionality, organizational, training, and other factors that affect EHR experience, while highlighting differences in the experiences of physicians, nursing, and clinical specialties based on a survey by two large healthcare systems. Experience with EHRs and their usability is low regardless of the system. It may be improved by designing better EHR training delivered by providers who understand the EHR system, clinical content, and workflow; decreasing documentation burden; increasing utilization of personalization tools; enhancing external and internal integration and functionality features; offering fewer click boxes and more customizable order sets; and improving messaging, e-prescribing, and other factors that impact efficiency and clinical care. A substantial percentage of clinicians received no follow-up EHR training, an area for future improvement.
In the post-meaningful use era, we anticipate attention to usability by EHR vendors and reduced documentation requirements by government and private insurers. Further research and investment are needed to determine best practices.
Acknowledgements
Taylor Davis, PhD, for guidance with data analysis and article review.
Bhakti Dave, MPH, for her assistance with the manuscript preparation.
Competing Interests: None
Funding: None
Notes
1. Gold M, McLaughlin CG. Assessing HITECH Implementation and Lessons: 5 Years Later. Milbank Quarterly 2016;94(3):654-687
2. Ehrenfeld JM, Wanderer JP. Technology as friend or foe? Do electronic health records increase burnout? Current Opinions in Anesthesiology 2018;31:357-360
3. Shanafelt TD, Dyrbye LN, Sinsky CA, et al. Relationship Between Clerical Burden and Characteristics of the Electronic Environment with Physician Burnout and Professional Satisfaction. Mayo Clinic Proceedings 2016;91(7):836-848
4. Arndt BG, Beasley JW, Watkinson MD et al. Tethered to the EHR: Primary Care Physician Workload Assessment Using EHR Event Log Data and Time Motion Observations. Annals of Family Medicine. 2017;15(5):419-426
5. Sinsky CA, Dyrbye LN, West CP, et al. Professional Satisfaction and the Career Plans of US Physicians. Mayo Clinic Proceedings. 2017;92(11):1625–35
6. Robertson SL, Robinson MD, Reid A. Electronic Health Record Effects on Work-Life Balance and Burnout Within the I 3 Population Collaborative. Journal of Graduate Medical Education 2017;9(4):479–84.
7. KLAS Arch Collaborative. https://klasresearch.com/arch-collaborative [Accessed December 31, 2018]
8. VCU Annual Reports https://annualreports.vcu.edu/vcuhealth/yir.html [Accessed December 30, 2018]
9. University of Chicago Medicine https://www.uchicagomedicine.org/ [Accessed December 30, 2018]
10. Crabtree BF, Miller WL. Using codes and code manuals: A template organizing style of interpretation. In Crabtree BF and Miller WL (Eds.), Doing Qualitative Research, 2nd edition (163-177), Newbury Park, CA: Sage, 1999
11. Saldaña J. The Coding Manual for Qualitative Researchers. 2nd ed. SAGE; 2013.
12. Shanafelt TD, Dyrbye LN, Sinsky CA, et al. Relationship Between Clerical Burden and Characteristics of the Electronic Environment with Physician Burnout and Professional Satisfaction. Mayo Clinic Proceedings 2016;91(7):836-848
13. Emani S, Ting DY, Healey M, et al. Physician beliefs about the meaningful use of the electronic health record: a follow-up study. Applied Clinical Informatics. 2017 Dec;8(4):1044–1053. doi: 10.4338/ACI-2017-05-RA-0079
14. Melnick ER, Dyrbye LN, Sinsky CA, et al. The association between perceived electronic health record usability and professional burnout among US physicians. Mayo Clinic Proceedings 2019: 12. doi: 10.1016/j.mayocp.2019.09.024.
15. Melnick ER, West CP, Nath B et al. The association between perceived electronic health record usability and professional burnout among US nurses. Journal of the American Medical Informatics Association 2021;00(0);1-10
16. Longhurst CA, Davis T, Maneker M et al. Local Investment in Training Drives Electronic Health Record User Satisfaction. Applied Clinical Informatics 2019;10;331-335
17. Ibid.
Author Biographies
Vimal Mishra (vimal.mishra@vcuhealth.org) serves as the director of digital health at the American Medical Association (AMA) and a medical director and associate professor of medicine at Virginia Commonwealth University.
David Liebovitz (davidl@northwestern.edu) is a practicing internal medicine physician and co-director of the Center for Medical Education in Data Science and Digital Health within the Institute for Augmented Intelligence in Medicine for the Feinberg School of Medicine at Northwestern University.
Michael Quinn (mquinn@medicine.bsd.uchicago.edu) is a senior research scientist with the Department of Medicine at the University of Chicago.
Le Kang (le.kang@vcuhealth.org) is an associate professor and program director for clinical research and biostatistics in the Department of Biostatistics at Virginia Commonwealth University.
Thomas Yackel (thomas.yackel@vcuhealth.org) is the president of VCU Health MCV Physicians and senior associate dean for clinical affairs at the Virginia Commonwealth University.
Robert Hoyt (robert.hoyt@vcuhealth.org) is an associate clinical professor in the Department of Medicine at Virginia Commonwealth University School of Medicine.