Electronic Health Record Documentation Times among Emergency Medicine Trainees

by Scott Crawford, MD; Igor Kushner, MD; Radosveta Wells, MD; and Stormy Monks, PhD

Abstract

Physicians spend a large portion of their time documenting patient encounters using electronic health records (EHRs). Meaningful Use guidelines have made EHR systems widespread, but they have not been shown to save time. This study compared the time required to complete an emergency department note in two different EHR systems for three separate video-recorded standardized simulated patient encounters. The total time needed to complete documentation, including the time to write and order the initial history, physical exam, and diagnostic studies and the time to document medical decision making and disposition, was recorded and compared across training levels. The only significant difference in documentation time was by classification, with second- and third-year trainees being significantly faster in documenting on the Cerner system than fourth-year medical students and first-year trainees (F = 8.36, p < .001). Level of training and experience with a system affected documentation time.

Keywords: electronic health record; electronic medical record; simulation; documentation; training; time

Introduction and Background

The electronic health record (EHR), in the era of Meaningful Use guidelines, is supposed to improve quality, safety, and efficiency and to reduce health disparities.1 Goals for EHRs suggested by HealthIT.gov note that EHRs improve patient care, patient outcomes, and care coordination; allow practices to increase their efficiency; and reduce financial expenditure.2 However, research has shown that EHRs have not reduced physician data entry time.3, 4 In addition, emergency medicine physicians report spending between 23 percent and 65 percent of their scheduled patient care time on electronic documentation.5–7 Frustration with the lack of ease of use of EHR systems contributes to physician burnout8–10 and a perceived loss of professional autonomy.11 Moreover, the time devoted to EHR systems is associated with physicians feeling disconnected from their patients and team12–15 and has led to an explosion in the use of scribes to assist with documentation over the past 10 years.16–19 Studies that have examined the effect of EHRs on overall emergency department (ED) efficiency have yielded conflicting results.20–23

The Technology Acceptance Model has been used to evaluate the factors that determine actual utilization of any new technology: perceived usefulness, perceived ease of use, attitude toward using, and intention to use.24 Perceived ease of use is suggested to be one of the largest barriers to use. This factor is thought to be related to lack of time and practice with the system, although increased use has been suggested to lessen the effect of perceived ease of use on actual use.25 Ultimately, however, physician acceptance of EHRs is of little consequence because their use is mandated in order to obtain government reimbursement from the Centers for Medicare and Medicaid Services.26

Recognizing the importance of EHR technology in today’s emergency medicine (EM) practice, the Accreditation Council for Graduate Medical Education and American Board of Emergency Medicine’s Emergency Medicine Milestone Project includes a Systems-Based Practice Technology (SBP3) milestone: residents should be able to critically assess the use of EHRs, and a trainee who has reached Level 5 mastery should be able to recommend ways to redesign systems to improve computerized processes. This milestone underscores the importance of involving physicians in the design and implementation of EHRs, and physicians should seek an active role in improving the quality and efficiency of these systems.

The purpose of this study was to examine the time required to document video-recorded standardized patient encounters in two different EHR systems in order to examine the efficiency and accuracy of a novel EHR system. In addition, this study examined the relationship between training levels and documentation time.

Methods

A prospective randomized trial was conducted at a three-year academic EM residency program. The study had 47 participants obtained by convenience sampling. All residents and rotating fourth-year medical students (MS4) were voluntary participants in this study. All participants provided informed consent. This study was approved by the Institutional Review Board of the sponsoring university.

Two EHR user interfaces were compared:

  1. the Cerner FirstNet system currently used in the hospital’s ED, and
  2. the Sparrow Emergency Department Information System, an EHR system unknown to all study participants.

Cerner is the most widely used enterprise-level EHR system in the world. It is used across inpatient, outpatient, and ED settings for laboratory results, documentation and review of all forms of charting, order entry, and viewing of results.27 The Sparrow system is an iPad application designed as an overlay for an existing EHR: an alternative data entry interface that links with an existing EHR and uploads patient data using the universal Health Level Seven (HL7) messaging standard. The Sparrow system was made available by its designer (Montrue Technologies) for use in this study without any fee or payment from the company. Additionally, the company took no part in the research design, data collection, or analysis.
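
As context for how an overlay interface can exchange data with a host EHR, the sketch below parses a minimal HL7 v2 message using the standard pipe-delimited segment encoding. The message content, system names, and parsing helper are invented for illustration and are not drawn from the Sparrow or Cerner systems.

```python
# Hypothetical sketch of HL7 v2 parsing; the message below is invented.
SAMPLE_HL7 = "\r".join([
    "MSH|^~\\&|OVERLAYAPP|ED|HOSPEHR|HOSP|202406150830||ADT^A04|12345|P|2.5",
    "PID|1||MRN001||DOE^JANE||19900101|F",
    "OBX|1|NM|8867-4^Heart rate^LN||88|/min|||||F",
])

def parse_hl7(message: str) -> dict:
    """Group HL7 segments by their three-letter ID; fields split on '|'."""
    segments: dict = {}
    for raw in filter(None, message.split("\r")):
        fields = raw.split("|")
        segments.setdefault(fields[0], []).append(fields[1:])
    return segments

msg = parse_hl7(SAMPLE_HL7)
print(msg["PID"][0][4])  # patient name field: DOE^JANE
```

In practice a production interface would use a full HL7 engine rather than manual string splitting, but the segment/field structure shown here is what allows a separate data entry application to push documentation back into a host EHR.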

An instructional video was created to explain the use of each interface, including screenshots with highlighted buttons to demonstrate the correct procedure for progressing through a physician note, ordering tests, and entering lab results. The screenshots were arranged in the same order and flow, and a voiceover described the process to complete the note for both systems.

Participants watched both videos before beginning the study. The Cerner FirstNet system was run on a Dell Latitude E5420 laptop using the hospital information technology training environment to ensure that no actual patient data were available. The Sparrow system was operated on an iPad mini (Model A1432) with an auxiliary keyboard (Logitech model Y-R0038) that also provided a stand for the iPad. Because the system was separate from the usual hospital data system, no macros or order sets were available to users for either system.

Each participant was shown video recordings of three standardized patient encounters to ensure consistency in the clinical information provided and to standardize the lab/imaging studies ordered and the final disposition of the patient (see Table 1). All three patient encounters reflected typical presentations. Each video was divided into two parts. The first portion included the initial history, physical exam, and expected orders, followed by a planned intermission. The physical exam findings were provided to ensure consistency of documentation. During the intermission, the study participants were instructed to log in to their assigned electronic documentation system and begin documenting according to their usual practice, entering the provided history and physical examination results, laboratory tests, and imaging studies. The time to complete the history and physical examination with order entry was recorded as time 1. The participants then watched the second portion of the video, which provided the results of the laboratory and imaging studies and the instructions for the final disposition of the patient. The time to document the laboratory results, interpret the workup, and record the patient’s disposition was recorded as time 2. Time 1 and time 2 were added together to give a total time for each case.

The three standardized patients were as follows:

  1. a 34-year-old woman with feelings of depression and suicidal ideation who required a workup for possible ingestion and eventual transfer to an inpatient psychiatric institution,
  2. a 54-year-old man with atypical chest pain who has a negative cardiac workup and is eventually referred for outpatient stress testing, and
  3. a 45-year-old man with severe right upper quadrant abdominal pain with an ultrasound and labs consistent with cholecystitis who required IV antibiotics and admission to surgery.

After using the initially assigned system for two complete video-recorded standardized patient encounters, all participants switched to the other system. Thus, all study participants were assigned to one system for two interactions and the other system for one interaction. We used a randomized block design to assign the order of the three patient encounters to minimize bias from cases that might require less documentation. For the system used twice, the two times were averaged for each participant.
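
A counterbalanced assignment of this kind could be generated as sketched below. The function name, seed, and labels are hypothetical; the sketch only illustrates the design described above: each participant documents two cases on a starting system, switches systems for the third, and case order is drawn from a randomized block of all six case orderings.

```python
import itertools
import random

# Hypothetical sketch of the counterbalanced, randomized-block assignment.
CASES = ["depression", "chest pain", "cholecystitis"]
SYSTEMS = ["Cerner", "Sparrow"]

def assign(participant_count, seed=0):
    rng = random.Random(seed)
    orders = list(itertools.permutations(CASES))  # block of all 6 case orders
    plans = []
    for i in range(participant_count):
        if i % len(orders) == 0:
            rng.shuffle(orders)  # start a fresh randomized block
        case_order = orders[i % len(orders)]
        start, other = SYSTEMS[i % 2], SYSTEMS[(i + 1) % 2]
        # two encounters on the starting system, then switch for the third
        plans.append(list(zip(case_order, [start, start, other])))
    return plans

plans = assign(6)
```

Within each block of six participants, every case ordering appears exactly once, which is what limits bias from cases that require less documentation.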

At the completion of the three patient encounters, participants indicated their age, preference of system, and whether they owned an iPad.

Results

Prior to the analysis, all study variables were examined for accuracy of data input using univariate descriptive statistics. Out-of-range values, implausible responses, and univariate outliers were noted and corrected by reexamining the raw data. Computed variables included total time for each patient encounter as well as a total average Sparrow system time and a total average Cerner system time.

The study sample was 63 percent male with a mean age of 30 years. Overall, the sample classification was 34 percent EM resident year 1 (EM1), 28 percent EM resident year 2 (EM2), 23 percent EM resident year 3 (EM3), and 15 percent medical student year 4 (MS4). The majority of the participants owned an iPad (60 percent), and 55 percent preferred the Sparrow system over the Cerner system, whereas 43 percent preferred the Cerner system (one response was left blank). No association was found between documentation time and age, gender, iPad ownership, or system preference.

The average time to document on the Cerner system was 15.9 minutes for MS4 students, 13.6 minutes for first-year trainees, 11.2 minutes for second-year trainees, and 11.2 minutes for third-year trainees, with an overall average of 12.7 minutes for the Cerner system. The average time to document on the Sparrow system was 16.2 minutes for MS4 students, 14.6 minutes for first-year trainees, 13.2 minutes for second-year trainees, and 14.0 minutes for third-year trainees, with an overall average of 14.3 minutes for the Sparrow system (see Figure 1).

An overall statistically significant difference in documentation time was found among the classification groups (F = 8.36, p < .001). Tukey’s honestly significant difference post hoc test, a statistical test used to determine where the difference occurred between groups, showed that second- and third-year trainees were significantly faster in documenting on the Cerner system than MS4 and first-year trainees were. This finding suggests that the level of training and years of experience with a system have a significant effect on documentation time.
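
The omnibus F statistic underlying a group comparison like this one can be computed directly, as in the pure-Python sketch below. The timing data are invented for illustration; in practice a library routine (e.g., `scipy.stats.f_oneway` followed by a Tukey HSD post hoc test) would be used.

```python
# Sketch of the one-way ANOVA F statistic; data below are hypothetical.
def f_statistic(groups):
    """F = mean square between groups / mean square within groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g)
                    for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented documentation times (minutes) by training level
times = {
    "MS4": [16.1, 15.4, 16.3],
    "EM1": [13.9, 13.2, 13.8],
    "EM2": [11.0, 11.5, 11.1],
    "EM3": [11.4, 10.9, 11.3],
}
F = f_statistic(list(times.values()))  # large F: group means differ
```

A significant omnibus F only says that some group means differ; the Tukey post hoc step is what localizes the difference to specific pairs of groups.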

The time to document in the Sparrow system was greater than the time to document in the Cerner system across all three standardized patient encounters, with statistically significant differences for the depression and cholecystitis cases (t = −2.18, p < .05 and t = −2.76, p < .01, respectively; see Table 2).
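
A paired comparison of this kind tests the per-participant differences between the two systems. The sketch below computes the paired t statistic on invented time pairs; the variable names are hypothetical, and a library routine (e.g., `scipy.stats.ttest_rel`) would normally be used.

```python
import math

# Sketch of the paired t statistic; the time pairs below are invented.
def paired_t(xs, ys):
    """t = mean(d) / (sd(d) / sqrt(n)) for paired differences d = x - y."""
    diffs = [x - y for x, y in zip(xs, ys)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n)

cerner = [12.1, 13.0, 11.5, 12.8, 12.3]    # hypothetical minutes
sparrow = [14.0, 14.6, 13.1, 14.4, 13.9]
t = paired_t(cerner, sparrow)  # negative when Cerner times are shorter
```

Pairing within participants removes between-person variability in documentation speed, which is why the same participants documented on both systems.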

Physicians in this study spent more time recording data into the EHR for a given patient encounter than they spent watching the video-recorded patient encounter (see Table 2). Of the total encounter time from initial evaluation through disposition, including documentation time, participants spent 63 percent, 56 percent, and 62 percent of their time documenting the three patient encounters in the Cerner system, and 67 percent, 58 percent, and 66 percent in the Sparrow system.

Discussion

Our study demonstrated that participants’ level of training and experience with a system affected their documentation time. Documentation on the Cerner system was faster for more experienced residents (EM2s and EM3s) compared with less experienced users (MS4s and EM1s). This experience benefit was not as dramatic for the Sparrow system, which was new to all users; however, EM2s and EM3s were still faster than EM1s and MS4s, suggesting that some efficiency in documentation may be associated with the ability to distinguish clinically relevant information from the entirety of a patient encounter.

Although more providers preferred the Sparrow system, which was designed to have a more user-friendly interface, they did not document faster with it as novice users. Documentation times could reasonably be expected to improve significantly with experience on the Sparrow system.

Even though the use of standardized patient care plans in this study meant that time spent in clinical decision making was removed, more time was spent on documentation than was spent with the patient. The findings of our study are consistent with other studies evaluating physician documentation using EHRs. Studies have shown that emergency physicians spend 43 percent of their time on data entry and 28 percent in direct contact with patients,28, 29 with time for electronic documentation ranging between 29 percent and 65 percent.30–32

Studies looking at the efficiency of providers using scribes or alternative documentation services have shown improvement in work efficiency.33 This improvement was demonstrated through an increase in physician productivity34, 35 as measured by the number of patients seen per hour and the number of Relative Value Units (RVUs) per hour.36 This study’s findings suggest that an increased focus on early training of residents in documentation skills may lead to improvement in efficiency when other documentation services are not available.

EHRs are intended to improve efficiency, safety, and quality of care. While EHRs can be beneficial, their use in the ED differs from that in other segments of the healthcare system because of the unpredictable volume, varying acuity, need for multitasking, and frequent interruptions and distractions present in the ED.37–40 Recommendations for improving the safety of ED information systems have been made.41, 42 A truly functional ED EHR system is difficult to create because of the unique characteristics of the specialty, and its development would require the active participation of emergency physicians. Technical innovations and evaluation of the user experience and patient experience should be incorporated into the training of EM residents.43–47 A combination of functionality and innovation was the original aim of the Sparrow system.

The adoption of a new system can be limited by perceived usefulness, perceived ease of use, and attitudes toward use. The majority of participants preferred the Sparrow system but actually had longer documentation times with it. This finding suggests that novice users perceived the Sparrow user interface as easier to use, which is consistent with the principles of the Technology Acceptance Model.48

Limitations and Future Work

One of the biggest limitations of this study was the physical input devices used with the two systems. The laptop was operated without an external mouse, using only the built-in trackpad. Because of the iPad mini’s small screen size, its paired keyboard had a smaller form factor than the laptop’s keyboard. All residents and most medical students had prior experience documenting in the Cerner system, so we were comparing a known system to an unknown one. Conclusions about documentation times for experienced Sparrow users cannot be drawn from these data.

Future studies could compare keyboard entry with voice dictation for documentation time and could compare documenting during a patient encounter with documenting afterward in another location, to determine which option offers better efficiency overall. By design, the Sparrow system, operating on a mobile device such as an iPad, might be easier to implement at the bedside than the Cerner system on a laptop; however, both designs should be studied.

Conclusion

This study demonstrated that the level of training and experience with a system affected documentation time. Total documentation time for a patient encounter decreased across years of training for the Cerner system, which ran on a laptop and was familiar to all users; the data also showed faster documentation by EM2s and EM3s on the newly introduced Sparrow iPad interface, but that finding was not statistically significant. No association was found between documentation time and age, gender, iPad ownership, or system preference. Although the majority of users preferred the Sparrow system, documentation time was shorter with the Cerner system.


Scott Crawford, MD, is an assistant professor at Texas Tech University Health Sciences Center El Paso in El Paso, TX.

Igor Kushner, MD, is a first-year resident at Phoenix Children’s Hospital in Phoenix, AZ.

Radosveta Wells, MD, is an assistant professor at Texas Tech University Health Sciences Center El Paso in El Paso, TX.

Stormy Monks, PhD, is an assistant professor at Texas Tech University Health Sciences Center El Paso in El Paso, TX.


Notes

  1. HealthIT.gov. “Meaningful Use.” 2017. Available at https://www.healthit.gov/topic/federal-incentive-programs/meaningful-use (accessed May 29, 2018).
  2. HealthIT.gov. “Benefits of EHRs.” 2017. Available at https://www.healthit.gov/topic/health-it-basics/benefits-ehrs (accessed May 29, 2018).
  3. Bukata, R. “When Evaluating EMR Efficacy, Where’s the Beef?” Emergency Physicians Monthly. 2012. Available at http://epmonthly.com/article/when-evaluating-emr-efficacy-wheres-the-beef/.
  4. Häyrinen, K., K. Saranto, and P. Nykänen. “Definition, Structure, Content, Use and Impacts of Electronic Health Records: A Review of the Research Literature.” International Journal of Medical Informatics 77, no. 5 (2008): 291–304.
  5. Dela Cruz, J. E., J. C. Shabosky, M. Albrecht, et al. “Typed versus Voice Recognition for Data Entry in Electronic Health Records: Emergency Physician Time Use and Interruptions.” Western Journal of Emergency Medicine 15, no. 4 (2014): 541–47.
  6. Hill, R. G., L. M. Sears, and S. W. Melanson. “4000 Clicks: A Productivity Analysis of Electronic Medical Records in a Community Hospital ED.” American Journal of Emergency Medicine 31, no. 11 (2013): 1591–94.
  7. Neri, P., L. Redden, S. Poole, et al. “Emergency Medicine Resident Physicians’ Perceptions of Electronic Documentation and Workflow.” Applied Clinical Informatics 6, no. 1 (2015): 27–41.
  8. Benda, N. C., M. L. Meadors, A. Z. Hettinger, and R. M. Ratwani. “Emergency Physician Task Switching Increases with the Introduction of a Commercial Electronic Health Record.” Annals of Emergency Medicine 67, no. 6 (2016): 741–46.
  9. Farley, H. L., K. M. Baumlin, A. G. Hamedani, et al. “Quality and Safety Implications of Emergency Department Information Systems.” Annals of Emergency Medicine 62, no. 4 (2013): 399–407.
  10. Shanafelt, T. D., L. N. Dyrbye, C. Sinsky, et al. “Relationship between Clerical Burden and Characteristics of the Electronic Environment with Physician Burnout and Professional Satisfaction.” Mayo Clinic Proceedings 91, no. 7 (2016): 836–48.
  11. McGinn, C. A., S. Grenier, J. Duplantie, et al. “Comparison of User Groups’ Perspectives of Barriers and Facilitators to Implementing Electronic Health Records: A Systematic Review.” BMC Medicine 9, no. 1 (2011): 46.
  12. Shanafelt, T. D., L. N. Dyrbye, C. Sinsky, et al. “Relationship between Clerical Burden and Characteristics of the Electronic Environment with Physician Burnout and Professional Satisfaction.”
  13. McGinn, C. A., S. Grenier, J. Duplantie, et al. “Comparison of User Groups’ Perspectives of Barriers and Facilitators to Implementing Electronic Health Records: A Systematic Review.”
  14. Nguyen, L., E. Bellucci, and L. T. Nguyen. “Electronic Health Records Implementation: An Evaluation of Information System Impact and Contingency Factors.” International Journal of Medical Informatics 83, no. 11 (2014): 779–96.
  15. Yu, P., S. Qian, H. Yu, and J. Lei. “Measuring the Performance of Electronic Health Records: A Case Study in Residential Aged Care in Australia.” Studies in Health Technology and Informatics 192 (2013): 1035.
  16. Bukata, R. “When Evaluating EMR Efficacy, Where’s the Beef?”
  17. Dela Cruz, J. E., J. C. Shabosky, M. Albrecht, et al. “Typed versus Voice Recognition for Data Entry in Electronic Health Records: Emergency Physician Time Use and Interruptions.”
  18. Arya, R., D. M. Salovich, P. Ohman‐Strickland, and M. A. Merlin. “Impact of Scribes on Performance Indicators in the Emergency Department.” Academic Emergency Medicine 17, no. 5 (2010): 490–94.
  19. Marshall, J. S., C. M. Verdick, M. S. Tanaka, et al. “Implementation of Medical Scribes in an Academic Emergency Department: Effect on Emergency Department Throughput, Clinical Productivity, and Emergency Physician Professional Fees.” Annals of Emergency Medicine 60, no. 4, suppl. (2012): S105.
  20. Bukata, R. “When Evaluating EMR Efficacy, Where’s the Beef?”
  21. Häyrinen, K., K. Saranto, and P. Nykänen. “Definition, Structure, Content, Use and Impacts of Electronic Health Records: A Review of the Research Literature.”
  22. Nguyen, L., E. Bellucci, and L. T. Nguyen. “Electronic Health Records Implementation: An Evaluation of Information System Impact and Contingency Factors.”
  23. Furukawa, M. F. “Electronic Medical Records and the Efficiency of Hospital Emergency Departments.” Medical Care Research and Review 68, no. 1 (2011): 75–95.
  24. Davis, F. D., R. P. Bagozzi, and P. R. Warshaw. “User Acceptance of Computer Technology: A Comparison of Two Theoretical Models.” Management Science 35, no. 8 (1989): 982–1003.
  25. Holden, R. J., and B. T. Karsh. “The Technology Acceptance Model: Its Past and Its Future in Health Care.” Journal of Biomedical Informatics 43, no. 1 (2010): 159–72.
  26. Shanafelt, T. D., L. N. Dyrbye, C. Sinsky, et al. “Relationship between Clerical Burden and Characteristics of the Electronic Environment with Physician Burnout and Professional Satisfaction.”
  27. Monegain, B. “Cerner Has Almost Double EHR Global Market Share of Closest Rival Epic, Kalorama Says.” Healthcare IT News. May 15, 2018. Available at http://www.healthcareitnews.com/news/cerner-has-almost-double-ehr-global-market-share-closest-rival-epic-kalorama-says.
  28. Bukata, R. “When Evaluating EMR Efficacy, Where’s the Beef?”
  29. Hill, R. G., L. M. Sears, and S. W. Melanson. “4000 Clicks: A Productivity Analysis of Electronic Medical Records in a Community Hospital ED.”
  30. Dela Cruz, J. E., J. C. Shabosky, M. Albrecht, et al. “Typed versus Voice Recognition for Data Entry in Electronic Health Records: Emergency Physician Time Use and Interruptions.”
  31. Neri, P., L. Redden, S. Poole, et al. “Emergency Medicine Resident Physicians’ Perceptions of Electronic Documentation and Workflow.”
  32. Benda, N. C., M. L. Meadors, A. Z. Hettinger, and R. M. Ratwani. “Emergency Physician Task Switching Increases with the Introduction of a Commercial Electronic Health Record.”
  33. Dela Cruz, J. E., J. C. Shabosky, M. Albrecht, et al. “Typed versus Voice Recognition for Data Entry in Electronic Health Records: Emergency Physician Time Use and Interruptions.”
  34. Arya, R., D. M. Salovich, P. Ohman‐Strickland, and M. A. Merlin. “Impact of Scribes on Performance Indicators in the Emergency Department.”
  35. Furukawa, M. F. “Electronic Medical Records and the Efficiency of Hospital Emergency Departments.”
  36. Arya, R., D. M. Salovich, P. Ohman‐Strickland, and M. A. Merlin. “Impact of Scribes on Performance Indicators in the Emergency Department.”
  37. McGinn, C. A., S. Grenier, J. Duplantie, et al. “Comparison of User Groups’ Perspectives of Barriers and Facilitators to Implementing Electronic Health Records: A Systematic Review.”
  38. Inokuchi, R., H. Sato, K. Nakamura, et al. “Motivations and Barriers to Implementing Electronic Health Records and ED Information Systems in Japan.” American Journal of Emergency Medicine 32, no. 7 (2014): 725–30.
  39. McGinn, C. A., M. P. Gagnon, N. Shaw, et al. “Users’ Perspectives of Key Factors to Implementing Electronic Health Records in Canada: A Delphi Study.” BMC Medical Informatics and Decision Making 12, no. 1 (2012): 105.
  40. Nemeth, C. P., R. I. Cook, and R. L. Wears. “Studying the Technical Work of Emergency Care.” Annals of Emergency Medicine 50, no. 4 (2007): 384–86.
  41. Farley, H. L., K. M. Baumlin, A. G. Hamedani, et al. “Quality and Safety Implications of Emergency Department Information Systems.”
  42. Handel, D. A., and J. L. Hackman. “Implementing Electronic Health Records in the Emergency Department.” Journal of Emergency Medicine 38, no. 2 (2010): 257–63.
  43. Farley, H. L., K. M. Baumlin, A. G. Hamedani, et al. “Quality and Safety Implications of Emergency Department Information Systems.”
  44. Nguyen, L., E. Bellucci, and L. T. Nguyen. “Electronic Health Records Implementation: An Evaluation of Information System Impact and Contingency Factors.”
  45. McGinn, C. A., M. P. Gagnon, N. Shaw, et al. “Users’ Perspectives of Key Factors to Implementing Electronic Health Records in Canada: A Delphi Study.”
  46. Handel, D. A., and J. L. Hackman. “Implementing Electronic Health Records in the Emergency Department.”
  47. Lobach, D. F., and D. E. Detmer. “Research Challenges for Electronic Health Records.” American Journal of Preventive Medicine 32, no. 5 (2007): S104–S111.
  48. Davis, F. D., R. P. Bagozzi, and P. R. Warshaw. “User Acceptance of Computer Technology: A Comparison of Two Theoretical Models.”
