by Scott Crawford, MD; Igor Kushner, MD; Radosveta Wells, MD; and Stormy Monks, PhD
Physicians spend a large portion of their time documenting patient encounters in electronic health records (EHRs). Meaningful Use guidelines have made EHR systems widespread, but the systems have not been shown to save time. This study compared the time required to complete an emergency department note in two different EHR systems for three separate video-recorded standardized simulated patient encounters. The total documentation time, comprising the time to write and order the initial history, physical exam, and diagnostic studies and the time to document medical decision making and disposition, was recorded and compared across training levels. The only significant difference in documentation time was by training level: second- and third-year trainees documented significantly faster on the Cerner system than fourth-year medical students and first-year trainees did (F = 8.36, p < .001). Level of training and experience with a system affected documentation time.
Keywords: electronic health record; electronic medical record; simulation; documentation; training; time
The electronic health record (EHR), in the era of Meaningful Use guidelines, is supposed to improve quality, safety, and efficiency and reduce health disparities.1 Goals for EHRs suggested by HealthIT.gov note that EHRs improve patient care, patient outcomes, and care coordination; allow practices to increase their efficiency; and reduce financial expenditure.2 However, research has shown that EHRs have not reduced physician data entry time.3, 4 In addition, emergency medicine physicians report spending between 23 percent and 65 percent of their scheduled patient care time on electronic documentation.5–7 Frustration with the lack of ease of use of EHR systems contributes to physician burnout8–10 and a perceived loss of professional autonomy.11 Moreover, the time devoted to EHR systems is associated with physicians feeling disconnected from their patients and team12–15 and has led to an explosion in the use of scribes to assist with documentation over the past 10 years.16–19 Studies that have examined the effect of EHRs on overall emergency department (ED) efficiency have been conflicting.20–23

The Technology Acceptance Model has been used to evaluate the factors that determine the actual utilization of any new technology: perceived usefulness, perceived ease of use, attitude toward using, and intention to use.24 Perceived ease of use is suggested to be one of the largest barriers to adoption; this factor is thought to be related to lack of time and practice with the system used, although increased use has been suggested to lessen its effect on actual use.25 Ultimately, however, physician acceptance of EHRs is of little consequence, because EHR use is mandated in order to obtain government reimbursement from the Centers for Medicare and Medicaid Services.26
Recognizing the importance of EHR technology in today’s emergency medicine (EM) practice, the Accreditation Council for Graduate Medical Education and American Board of Emergency Medicine’s Emergency Medicine Milestone Project suggests that the Systems-Based Practice Technology (SBP3) milestone should identify residents who are able to critically assess the use of EHRs; a trainee who has reached Level 5 mastery should be able to recommend ways to redesign systems to improve computerized processes. This milestone underscores the importance of involving physicians in the design and implementation of EHRs, and physicians should seek an active role in improving the quality and efficiency of these systems.
The purpose of this study was to examine the time required to document video-recorded standardized patient encounters in two different EHR systems in order to examine the efficiency and accuracy of a novel EHR system. In addition, this study examined the relationship between training levels and documentation time.
A prospective randomized trial was conducted at a three-year academic EM residency program. The study had 47 participants obtained by convenience sampling. All residents and rotating fourth-year medical students (MS4) were voluntary participants in this study. All participants provided informed consent. This study was approved by the Institutional Review Board of the sponsoring university.
Two EHR user interfaces were compared:
Cerner is the most widely used enterprise-level EHR system in the world. It is used across inpatient, outpatient, and ED settings for documentation and review of all forms of charting, order entry, and viewing of laboratory and imaging results.27 The Sparrow system is an iPad application designed as an overlay for an existing EHR: an alternative data entry interface that links with the existing EHR and uploads patient data using the Health Level Seven (HL7) messaging standard. The Sparrow system was made available by its designer (Montrue Technologies) for use in this study without any fee or payment from the company. Additionally, the company was not part of the research design, data collection, or analysis.
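To illustrate the kind of interchange involved, the sketch below assembles a toy pipe-delimited HL7 v2 message in Python. Every application name, identifier, and field value is invented for illustration; it does not reflect actual Sparrow or Cerner traffic.

```python
# Minimal sketch of a pipe-delimited HL7 v2 message. All sending/receiving
# application names, patient identifiers, and field values are hypothetical.

def build_hl7_message(patient_id: str, last: str, first: str) -> str:
    """Assemble a toy ADT-style message from MSH and PID segments."""
    msh = "|".join([
        "MSH", "^~\\&",              # segment ID and encoding characters
        "SPARROW", "ED",             # sending application/facility (invented)
        "CERNER", "HOSP",            # receiving application/facility (invented)
        "20240101120000", "",        # message timestamp, security (empty)
        "ADT^A04", "MSG0001", "P", "2.5",
    ])
    pid = "|".join(["PID", "1", "", patient_id, "", f"{last}^{first}"])
    return "\r".join([msh, pid])     # HL7 v2 separates segments with carriage returns

message = build_hl7_message("123456", "Doe", "Jane")
segments = message.split("\r")
```

A real interface engine would add many more segments (OBR/OBX for results, for example) and validate field contents; this sketch only shows the segment-and-field shape of the format.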
An instructional video was created to explain the use of each interface, including screenshots with highlighted buttons to demonstrate the correct procedure for progressing through a physician note, ordering tests, and entering lab results. The screenshots were arranged in the same order and flow, and a voiceover described the process to complete the note for both systems.
Participants watched both videos before beginning the study. The Cerner FirstNet system was run on a Dell Latitude E5420 laptop using the hospital information technology training environment to ensure that no actual patient data were available. The Sparrow system was operated on an iPad mini (Model A1432) with an auxiliary keyboard (Logitech model Y-R0038) that also provided a stand for the iPad. Because the system was separate from the usual hospital data system, no macros or order sets were available to users for either system.
Each participant was shown video recordings of three standardized patient encounters to ensure consistency in the clinical information provided and to standardize the laboratory and imaging studies ordered and the final disposition of the patient (see Table 1). All three patient encounters reflected typical presentations. Each video was divided into two parts. The first portion included the initial history, physical exam, and expected orders, followed by a planned intermission. The physical exam findings were provided to ensure consistency of documentation. During the intermission, the study participants were instructed to log in to their assigned electronic documentation system and begin documenting according to their usual practice, entering the provided history and physical examination results and ordering laboratory tests and imaging studies. The time to complete the history and physical examination with order entry was recorded as time 1. The participants then watched the second portion of the video, which provided the results of the laboratory and imaging studies and the instructions for the final disposition of the patient. The time to document the laboratory results, interpret the workup, and record the patient’s disposition was recorded as time 2. Time 1 and time 2 were added together to give a total time for each case.
The three standardized patient cases are summarized in Table 1.
After using the initially assigned system for two complete video-recorded standardized patient encounters, all participants switched to the other system. We used a randomized block design to assign the order of the three patient encounters to minimize bias from cases that might require less documentation. Thus, all study participants were assigned to one system for two interactions and the other system for one interaction. An average time for the paired system was calculated for each participant.
At the completion of the three patient encounters, participants indicated their age, preference of system, and whether they owned an iPad.
Prior to the analysis, all study variables were examined for accuracy of data input using univariate descriptive statistics. Out-of-range values, implausible responses, and univariate outliers were noted and corrected by reexamining the raw data. Computed variables included total time for each patient encounter as well as a total average Sparrow system time and a total average Cerner system time.
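The computed variables described above can be sketched as follows. The record layout, participant identifier, and times are hypothetical illustrations, not the study's data.

```python
# Sketch of the computed variables: total time per encounter (time 1 + time 2)
# and a per-participant average time for each system. The participant ID,
# system labels, and minute values below are invented for illustration.
from collections import defaultdict
from statistics import mean

# (participant, system, time1_minutes, time2_minutes)
records = [
    ("P01", "Cerner",  8.0, 4.0),
    ("P01", "Cerner",  7.5, 4.5),   # second encounter on the assigned system
    ("P01", "Sparrow", 9.0, 5.5),   # single encounter after crossover
]

totals = defaultdict(list)  # (participant, system) -> list of total times
for participant, system, t1, t2 in records:
    totals[(participant, system)].append(t1 + t2)

# Average time per participant per system, mirroring the paired-system average
averages = {key: mean(times) for key, times in totals.items()}
```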
The study sample was 63 percent male with a mean age of 30 years. Overall, the sample classification was 34 percent EM resident year 1 (EM1), 28 percent EM resident year 2 (EM2), 23 percent EM resident year 3 (EM3), and 15 percent medical student year 4 (MS4). The majority of the participants owned an iPad (60 percent), and 55 percent preferred the Sparrow system over the Cerner system, whereas 43 percent preferred the Cerner system (one response was left blank). No association was found between documentation time and age, gender, iPad ownership, or system preference.
The average time to document on the Cerner system was 15.9 minutes for MS4 students, 13.6 minutes for first-year trainees, 11.2 minutes for second-year trainees, and 11.2 minutes for third-year trainees, with an overall average of 12.7 minutes for the Cerner system. The average time to document on the Sparrow system was 16.2 minutes for MS4 students, 14.6 minutes for first-year trainees, 13.2 minutes for second-year trainees, and 14.0 minutes for third-year trainees, with an overall average of 14.3 minutes for the Sparrow system (see Figure 1).
A statistically significant overall difference in documentation time was found among the training-level groups (F = 8.36, p < .001). Tukey’s honestly significant difference post hoc test, which identifies where the differences between groups lie, showed that second- and third-year trainees were significantly faster in documenting on the Cerner system than MS4 and first-year trainees were. This finding suggests that the level of training and years of experience with a system have a significant effect on documentation time.
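The omnibus comparison reported here can be sketched with a stdlib-only one-way ANOVA in Python. The group labels and times below are invented, and a full analysis would add the Tukey HSD post hoc step (available, for example, in statsmodels).

```python
# Sketch of a one-way ANOVA F statistic across training-level groups.
# The documentation times (minutes) below are hypothetical, not study data.
from statistics import mean

def one_way_anova_f(groups: list[list[float]]) -> float:
    """Return the F statistic: between-group over within-group mean square."""
    grand = mean([x for g in groups for x in g])
    n = sum(len(g) for g in groups)          # total observations
    k = len(groups)                          # number of groups
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    ms_between = ss_between / (k - 1)        # df between = k - 1
    ms_within = ss_within / (n - k)          # df within = n - k
    return ms_between / ms_within

# Hypothetical per-participant average documentation times by training level
ms4 = [16.0, 15.8, 16.1]
em1 = [13.5, 13.8, 13.4]
em23 = [11.1, 11.3, 11.2]
f_stat = one_way_anova_f([ms4, em1, em23])
```

With tightly clustered invented values like these, the F statistic is large; the reported F of 8.36 would then be compared against the F distribution with the corresponding degrees of freedom before running the post hoc contrasts.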
The time to document in the Sparrow system was greater than the time to document in the Cerner system across all three standardized patient encounters, with statistically significant differences for the depression and cholecystitis cases (t = −2.18, p < .05 and t = −2.76, p < .01, respectively; see Table 2).
Physicians in this study spent more time recording data into the EHR for a given patient encounter than they spent watching the video-recorded patient encounter (see Table 2). Of the entire patient encounter time from initial evaluation to disposition, including documentation time, the percentage of time spent documenting was 63 percent, 56 percent, and 62 percent respectively for each patient encounter using the Cerner system and 67 percent, 58 percent, and 66 percent respectively for each patient encounter using the Sparrow system across all participants.
Our study demonstrated that participants’ level of training and experience with a system affected their documentation time. Documentation on the Cerner system was faster for more experienced residents (EM2s and EM3s) compared with less experienced users (MS4s and EM1s). This experience benefit was not as dramatic for the Sparrow system, which was new to all users; however, EM2s and EM3s were still faster than EM1s and MS4s, suggesting that some efficiency in documentation may be associated with the ability to distinguish clinically relevant information from the entirety of a patient encounter.
Despite the fact that more providers preferred the Sparrow system, which was designed to have a more user-friendly interface, they did not document faster with it as novice users. It is reasonable to expect that documentation times would improve significantly with experience on the Sparrow system.
Even though the use of standardized patient care plans in this study meant that time spent in clinical decision making was removed, more time was spent on documentation than was spent with the patient. The findings of our study are consistent with other studies evaluating physician documentation using EHRs. Studies have shown that emergency physicians spend 43 percent of their time on data entry and 28 percent in direct contact with patients,28, 29 with time for electronic documentation ranging between 29 percent and 65 percent.30–32
Studies looking at the efficiency of providers using scribes or alternative documentation services have shown improvement in work efficiency.33 This improvement was demonstrated through an increase in physician productivity34, 35 as measured by the number of patients seen per hour and the number of Relative Value Units (RVUs) per hour.36 This study’s findings suggest that an increased focus on early training of residents in documentation skills may lead to improvement in efficiency when other documentation services are not available.
EHRs are intended to improve efficiency, safety, and quality of care. While EHRs can be beneficial, their use in the ED differs from that in other segments of the healthcare system because of the unpredictable volume, varying acuity, need for multitasking, and frequent interruptions and distractions present in the ED.37–40 Recommendations for improving the safety of ED information systems have been made.41, 42 A truly functional EHR system for this specialty is difficult to create because of its unique characteristics, and creating one would require the active participation of emergency physicians. Technical innovations and evaluation of the user experience and patient experience should be incorporated into the training of EM residents.43–47 A combination of functionality and innovation was the original aim of the Sparrow system.
The adoption of a new system can be limited by the perceived usefulness, perceived ease of use, and attitudes toward use. The majority of participants preferred the Sparrow system but actually had longer documentation times with it. This finding suggests that novice learners felt that the Sparrow user interface offered better ease of use, which is consistent with the principles of the Technology Acceptance Model.48
One of the biggest limitations of this study involved the physical input devices for the two systems. The laptop was operated without an external mouse, using only the built-in trackpad. Because of the iPad mini’s small screen size, its paired keyboard had a smaller form factor than the laptop’s. In addition, all residents and most medical students had prior experience documenting in the Cerner system, so we were comparing a known system with an unknown one. Conclusions about documentation times for experienced Sparrow users cannot be drawn from these data.
Future studies could compare keyboard versus voice dictation for documentation time and could offer a comparison of documenting during a patient encounter and documenting after the encounter in another location, to determine which option offers better efficiency overall. By design, a Sparrow system operating on a mobile device like an iPad might be easier to implement at the bedside, compared with using the Cerner system on a laptop; however, both designs should be studied.
This study demonstrated that the level of training and experience with a system affected documentation time. Total documentation time per patient encounter decreased across years of training for the Cerner system, which all users had operated on a familiar laptop, compared with the newly introduced iPad interface. The data also showed faster documentation by EM2s and EM3s on the Sparrow system, but that difference was not statistically significant. No association was found between documentation time and age, gender, iPad ownership, or system preference. Although the majority of users preferred the Sparrow system, documentation time was shorter with the Cerner system.
Scott Crawford, MD, is an assistant professor at Texas Tech University Health Sciences Center El Paso in El Paso, TX.
Igor Kushner, MD, is a first-year resident at Phoenix Children’s Hospital in Phoenix, AZ.
Radosveta Wells, MD, is an assistant professor at Texas Tech University Health Sciences Center El Paso in El Paso, TX.
Stormy Monks, PhD, is an assistant professor at Texas Tech University Health Sciences Center El Paso in El Paso, TX.