As a knowledge-based field of medicine, critical care medicine has benefited from the use of electronic health records (EHRs) in daily practice, as intensive care unit (ICU) patients generate thousands of pieces of clinical data each day.1 ICU teams must review, interpret, and take action on these data points when managing multiple patients in a time-constrained environment. The growing volume of data that ICU clinicians must process for decision-making surpasses human cognitive capacity, and ICU physicians describe the current display and representation of patient data in the EHR as suboptimal. Performance dashboards are information delivery systems that display the most important information about performance objectives to ICU directors, allowing them to monitor and manage ICU performance more effectively. Visualization dashboards that monitor ICU performance will still need to adhere to usability principles such as Jakob Nielsen’s heuristics. Improving EHR interfaces will directly enhance provider well-being, patient outcomes, and quality of care.
More than a decade has passed since the passage of the Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009, which incentivized health systems to adopt electronic health records (EHRs).2,3 By accruing digitized clinical data, EHRs now impact nearly every decision in patient care. As a knowledge-based field of medicine, critical care medicine has benefited from the use of EHRs in daily practice, as ICU patients generate thousands of pieces of clinical data each day.4,5 ICU teams must review, interpret, and take action on these data points when managing multiple patients in a time-constrained environment. Since its implementation, the EHR has demonstrated the capability to improve the quality and efficiency of ICU care processes,6 improve communication, and serve as a platform for clinical decision support.
However, studies also suggest that unintended consequences of EHR use in the ICU have simultaneously emerged. These include decreased job satisfaction, fatigue, and burnout among clinicians.7-9 The current state of EHR usability—or the extent to which this technology can be used efficiently, effectively, satisfactorily, and safely—has been a source of increased clinician cognitive workload10 and has been associated with patient safety risks.11 The Institute of Medicine recognizes that these inefficiencies associated with EHR use threaten the quality and safety of healthcare delivery, and that improving the quality and safety (QS) performance of hospitals has become increasingly important in recent years.12,13 In the upcoming decade, a focus on visual analytics of EHR data—utilizing tools from improvement science, implementation science, and safety science—may help ICUs improve care delivery at the level of the patient and of the processes of ICU care. In this paper, we discuss the current challenges with representing ICU patient data in EHRs and the potential of data visualization dashboards, and we provide insights into possible technology and policy solutions.
ICU Data Representation: Challenges at the Patient Level
The increasing volume of data to be processed by ICU clinicians for decision-making surpasses human cognitive capacity.14 Clinicians are presented with more than 1,300 data points during the evaluation and planning of a single patient visit.15 ICU physicians monitor approximately 2.5 million data points during a given month, and they respond to an average of 187 EHR alerts per patient per day.16-18 Optimal human cognitive capacity averages five sets of facts per decision, yet current EHR screens present far more data than cognition requires, which can have unintended consequences for clinicians, patients, and the overall quality of care.
ICU physicians described the current display and representation of patient data in the EHR as suboptimal.19 Physicians report that finding information in the EHR is a major challenge, which they attribute to data-busy screens and layers of menus. Data redundancy and inconsistent data among EHR screens are major barriers to timely and accurate decisions. For instance, the flowsheet screen presents vital signs in real time, whereas the vital signs screen presents older values because it is not refreshed as frequently as the flowsheet. Such data discrepancies create frustration and confusion for providers attempting to make time-sensitive decisions for critically ill patients. As a result, providers report avoiding screens that are known to have a data lag, which raises the question of why the same patient data should be presented in two separate EHR screens at all. An effective and usable system should seek to reduce data redundancy and discrepancies, ensure minimalist menu design, and present patient data in a consistent and user-friendly fashion.
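To illustrate the data-lag problem concretely, a simple freshness check could flag which duplicated display is stale rather than leaving clinicians to reconcile conflicting values. The sketch below is illustrative only; the screen names, five-minute tolerance, and data shapes are all assumptions, not features of any particular EHR.

```python
from datetime import datetime, timedelta

# Assumed tolerance for this illustration, not an EHR standard.
STALE_AFTER = timedelta(minutes=5)

def flag_stale(readings, now):
    """Return the screens whose last refresh exceeds the tolerance.

    readings: dict mapping screen name -> (value, last_refresh_timestamp)
    """
    return {screen for screen, (_, ts) in readings.items()
            if now - ts > STALE_AFTER}

# The same heart rate surfaced on two hypothetical screens that
# refresh on different schedules.
now = datetime(2024, 1, 1, 12, 0)
readings = {
    "flowsheet":   (88, datetime(2024, 1, 1, 11, 59)),  # refreshed 1 min ago
    "vital_signs": (92, datetime(2024, 1, 1, 11, 30)),  # refreshed 30 min ago
}
print(flag_stale(readings, now))  # → {'vital_signs'}
```

A check of this kind does not remove the redundancy, but it makes the discrepancy visible at the point of care instead of leaving providers to discover it by comparison.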
Data display is another area of frustration among ICU providers. Many EHR systems did not adopt a user-centered design (UCD) approach when designing their interfaces.20 UCD approaches ensure that a system is designed to meet the expectations of the user and increase the likelihood of delivering a product that is satisfactory to the user.21 The lack of a UCD approach has led to data displays in the EHR that do not align with the way physicians were trained to read and interpret data. For example, while EHR systems may use color-coding to label a lab value, when graphing values over time, providers may have to read the results in a different way than they were trained (left to right, every eight hours, etc.). Clinicians eventually adapt to the way data are displayed, but it remains a pain point, especially among physicians with more years of practice in the pre-EHR era. Data representation needs to meet user requirements rather than just user needs: user requirements specify the functionality required for the user to achieve specific tasks in the system, while user needs describe the end result of a given task.
Currently, IT applications provide little support for the cognitive tasks of clinicians. However, a nascent area of research investigates the use of artificial intelligence to prioritize relevant patient information, minimizing the time and effort physicians spend identifying it. Machine learning-based systems, trained on analyst interactions, have been demonstrated to learn the relationship between data and visualizations directly, and they show promise in reducing the cognitive load of seeking ICU data.22 However, the effect of visualization dashboards on clinicians’ decision-making abilities and fatigue levels is unknown and is an area for further investigation using subjective and objective measurements.
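As a minimal sketch of this idea, the toy example below learns a mapping from column characteristics to a chart type using a nearest-centroid classifier, in the spirit of systems such as VizML. The features, training examples, and chart labels are invented for illustration; real systems train far richer models on large corpora of analyst choices.

```python
import math

def features(column):
    """Describe a column: (fraction of distinct values, is_numeric)."""
    distinct = len(set(column)) / len(column)
    numeric = float(all(isinstance(v, (int, float)) for v in column))
    return (distinct, numeric)

# Invented training set: feature vector -> chart type analysts chose.
train = [
    ((1.00, 1.0), "line"),  # continuous, mostly distinct series
    ((0.90, 1.0), "line"),
    ((0.10, 0.0), "bar"),   # few repeated categories
    ((0.20, 0.0), "bar"),
]

def centroids(examples):
    """Average the feature vectors per label."""
    sums = {}
    for (a, b), label in examples:
        acc = sums.setdefault(label, [0.0, 0.0, 0])
        acc[0] += a; acc[1] += b; acc[2] += 1
    return {lbl: (a / n, b / n) for lbl, (a, b, n) in sums.items()}

def recommend(column, cents):
    """Recommend the chart type whose centroid is nearest."""
    x = features(column)
    return min(cents, key=lambda lbl: math.dist(x, cents[lbl]))

cents = centroids(train)
print(recommend([98.6, 99.1, 100.4, 101.2], cents))            # → line
print(recommend(["vent", "vent", "room air", "vent"], cents))  # → bar
```

The point of the sketch is the structure of the approach: featurize the data, learn from past human choices, and recommend a visualization, rather than hard-coding display rules per screen.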
ICU Data Representation: Challenges at the ICU System Level
The data currently captured in the EHR reflect ICU processes of care but are underutilized in helping ICU directors assess whether their ICUs are performing effectively. The measurement of these ICU processes of care, through process measures, is meaningful to ICU directors because process measures evaluate the actions and behaviors of ICU teams and may identify modifiable targets to improve patient care. Tracking process measures allows ICU directors to ask questions such as “Is my ICU meeting benchmark quality and safety metrics?” or “Is care in this ICU adhering to approved institutional guidelines?” With the voluminous amounts of tracked data, there is an increasing focus on developing ICU performance dashboards that continuously assess ICU care processes in order to support ICU teams in delivering a consistently high level of clinical performance.
Performance dashboards are information delivery systems that display the most important information about performance objectives to ICU directors, allowing them to monitor and manage ICU performance more effectively. Performance dashboards support data-driven situational awareness and show promise in reducing variations in healthcare quality by stimulating quality improvement (QI).23-27 They may assist in identifying deviations from clinical best practices and in monitoring compliance with professional practice standards, and they can improve care quality through fewer errors, improved efficiency, and enhanced situational awareness.27-29 Challenges arise, however, when developing performance dashboards; these include initially identifying process measures, whether through discussion with stakeholders or from validated measures in the literature. Early involvement of health IT may also help in defining a real-time process measure and its appropriate data element in the EHR, which may in turn require changes in clinical processes so that the correct data are recorded. Additionally, a longstanding challenge to the use of performance dashboards has been their inability to reflect ICU performance in real time, in part because key data elements may be inconsistently reported, absent entirely, or stored in different databases.
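As a concrete sketch of a single dashboard tile, the example below computes compliance with one hypothetical process measure (head-of-bed elevation for ventilated patients) while reporting missing documentation separately rather than silently counting it as non-compliance. The record structure and field names are assumptions for illustration, not any EHR's schema.

```python
# Hypothetical patient records; None means the element was never documented,
# reflecting the inconsistently reported or absent data described above.
records = [
    {"patient": "A", "hob_elevated": True},
    {"patient": "B", "hob_elevated": False},
    {"patient": "C", "hob_elevated": None},   # not documented
    {"patient": "D", "hob_elevated": True},
]

def measure(records, field):
    """Compliance among documented cases, plus the missing-data rate."""
    documented = [r for r in records if r[field] is not None]
    compliant = sum(1 for r in documented if r[field])
    return {
        "compliance_pct": round(100 * compliant / len(documented), 1),
        "missing_pct": round(100 * (len(records) - len(documented)) / len(records), 1),
    }

print(measure(records, "hob_elevated"))
# → {'compliance_pct': 66.7, 'missing_pct': 25.0}
```

Surfacing the missing-data rate alongside the compliance rate keeps the tile honest: a dashboard that reported only 66.7% compliance would hide the fact that a quarter of the cases were never documented at all.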
Finally, the development of visualization dashboards that monitor ICU performance will still need to adhere to usability principles such as Jakob Nielsen’s heuristics.30 More specific information visualization guidelines highlight the importance of giving users the flexibility to control display configurations such as time periods and baseline measurements.31 Data set reduction, the elimination of rarely used data elements, is a challenge in the current EHR and remains an open problem in the development of data visualizations for performance dashboards. Additionally, the type of information and its representation vary among clinicians based on their role and among patients based on their condition. Creating “personas” based on professional role has been one way to mitigate these differences in user needs.32 However, creating comprehensive and sustainable personas remains a challenge given changes in staffing and continuous EHR upgrades. A nascent area of research involves developing machine learning algorithms that create personas based on which EHR screens and data clinicians use most, derived from EHR log data.
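A minimal sketch of this log-based approach is to cluster clinicians by their screen-usage frequency vectors; the clusters that emerge would correspond to candidate personas. The screen names, usage counts, and choice of k below are invented for illustration, and real log data would be far higher-dimensional.

```python
import math, random

# Invented log summary: clinician -> visits per shift to each screen,
# here [flowsheet, orders, notes].
usage = {
    "c1": [40, 2, 1],
    "c2": [38, 3, 2],
    "c3": [5, 30, 25],
    "c4": [4, 28, 30],
}

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means: assign points to nearest center, recompute means."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda i: math.dist(p, centers[i]))
            groups[i].append(p)
        centers = [
            [sum(c) / len(g) for c in zip(*g)] if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers, groups

centers, groups = kmeans(list(usage.values()), k=2)
print(sorted(len(g) for g in groups))  # → [2, 2]: two two-clinician personas
```

In this toy data set, the flowsheet-heavy clinicians and the orders/notes-heavy clinicians separate cleanly into two groups; a real system would then face the harder questions the text raises, such as keeping personas current as staffing and EHR versions change.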
A Call to Action
Evidence from other industries suggests that the value derived from information technology comes from the ability to analyze and share real-time data.33-35 In the next decade, prioritizing the meaningful use of data in the EHR will advance its use as an effective tool for high-quality, efficient, and safe healthcare. As described by the Healthcare Information and Management Systems Society (HIMSS), the presentation of clinical data must strive to adhere to the five “rights”: deliver the right information, to the right person, in the right intervention format, through the right channel, and at the right time in the workflow. Current real-time data presentation for clinical assessment poses a risk to patient safety, with limitations that include alerts that do not deliver information in a timely manner. A focus on developing meaningful data visualization and EHR user interface design that better supports clinical care and cognitive tasks may improve front-line users’ efficiency and effectiveness. Additionally, an emphasis on developing ICU performance dashboards with effective real-time data visualization will advance digital quality improvement, helping ICU teams be more effective through ongoing outcomes feedback. Finally, adding EHR user design standards and usability criteria, especially pertaining to information layout, to current EHR certification standards through the Office of the National Coordinator for Health Information Technology (ONC) Health IT Certification Program may help to prioritize improving EHR usability moving forward.36
ICU teams continue to express frustration with the current representation of patient data in the EHR. The widespread adoption of EHRs and advances in information technology and data analytics offer an opportunity to improve the quality of care through real-time clinical feedback that drives quality improvement of in-hospital processes. Visualization dashboards present a unique opportunity to represent patient data visually in meaningful ways that support providers’ decision-making processes. ICU performance dashboards show promise in assisting health systems in tracking their performance against benchmarks, revealing processes that could become more efficient and more effective through behavioral change in order to achieve quality targets. Dashboards will need to follow UCD principles to ensure minimalist design and eliminate data redundancies. A properly functioning and sustainable real-time dashboard that enables automated monitoring of ICU performance, quality, and safety will require a robust healthcare data analytics and information technology platform. More research is needed to investigate the impact of visualization dashboards on providers’ cognitive load, fatigue, and performance levels. Improving EHR interfaces will directly enhance provider well-being, patient outcomes, and quality of care.
1. Drew, B. J., P. Harris, J. K. Zègre-Hemsey, T. Mammone, D. Schindler, R. Salas-Boni, Y. Bai, A. Tinoco, Q. Ding, and X. Hu. 2014. “Insights into the problem of alarm fatigue with physiologic monitor devices: a comprehensive observational study of consecutive intensive care unit patients.” PLoS One 9 (10): e110274. https://doi.org/10.1371/journal.pone.0110274.
2. Adler-Milstein, J., A. J. Holmgren, P. Kralovec, C. Worzala, T. Searcy, and V. Patel. 2017. "Electronic health record adoption in US hospitals: the emergence of a digital 'advanced use' divide." J Am Med Inform Assoc 24 (6): 1142-1148. https://doi.org/10.1093/jamia/ocx080.
3. Adler-Milstein, J., and A. K. Jha. 2017. “HITECH Act Drove Large Gains In Hospital Electronic Health Record Adoption.” Health Aff (Millwood) 36 (8): 1416-1422. https://doi.org/10.1377/hlthaff.2016.1651.
4. Sanchez-Pinto, L. Nelson, Yuan Luo, and Matthew M. Churpek. 2018. “Big Data and Data Science in Critical Care.” Chest 154 (5): 1239-1248. https://doi.org/10.1016/j.chest.2018.04.037. https://pubmed.ncbi.nlm.nih.gov/29752973
5. Drew et al. 2014.
6. Khairat, S. S., A. Dukkipati, H. A. Lauria, T. Bice, D. Travers, and S. S. Carson. 2018. “The Impact of Visualization Dashboards on Quality of Care and Clinician Satisfaction: Integrative Literature Review.” JMIR Hum Factors 5 (2): e22. https://doi.org/10.2196/humanfactors.9328
7. Sinsky, C., L. Colligan, L. Li, M. Prgomet, S. Reynolds, L. Goeders, J. Westbrook, M. Tutty, and G. Blike. 2016. “Allocation of Physician Time in Ambulatory Practice: A Time and Motion Study in 4 Specialties.” Ann Intern Med 165 (11): 753-760. https://doi.org/10.7326/m16-0961.
8. Khairat, Saif, Cameron Coleman, Paige Ottmar, Dipika Jayachander, Thomas Bice, and Shannon Carson. 2020. "Association of Electronic Health Records Use with Physician Fatigue and Efficiency." JAMA Network Open.
9. Shanafelt, T. D., L. N. Dyrbye, and C. P. West. 2017. "Addressing Physician Burnout: The Way Forward." JAMA 317 (9): 901-902. https://doi.org/10.1001/jama.2017.0076.
10. Mazur, Lukasz M., Prithima R. Mosaly, Carlton Moore, and Lawrence Marks. 2019. “Association of the Usability of Electronic Health Records With Cognitive Workload and Performance Levels Among Physicians.” JAMA Network Open 2 (4): e191709-e191709. https://doi.org/10.1001/jamanetworkopen.2019.1709.
11. Artis, K. A., J. Bordley, V. Mohan, and J. A. Gold. 2019. “Data Omission by Physician Trainees on ICU Rounds.” Crit Care Med 47 (3): 403-409. https://doi.org/10.1097/ccm.0000000000003557.
12. Jha, A., and A. Epstein. 2010. “Hospital governance and the quality of care.” Health Aff (Millwood) 29 (1): 182-7. https://doi.org/10.1377/hlthaff.2009.0297.
13. Institute of Medicine, Committee on the Learning Health Care System in America. 2013. Best Care at Lower Cost: The Path to Continuously Learning Health Care in America. Edited by M. Smith, R. Saunders, L. Stuckhardt, and J. M. McGinnis. Washington, DC: National Academies Press.
14. Stead, W. W., J. R. Searle, H. E. Fessler, J. W. Smith, and E. H. Shortliffe. 2011. “Biomedical informatics: changing what physicians need to know and how they learn.” Acad Med 86 (4): 429-34. https://doi.org/10.1097/ACM.0b013e3181f41e8c.
15. Morris, Alan. 1992. "Computer Applications." In Principles of Critical Care.
16. Kizzier-Carnahan, Vanessa, Kathryn A. Artis, Vishnu Mohan, and Jeffrey A. Gold. 2019. “Frequency of Passive EHR Alerts in the ICU: Another Form of Alert Fatigue?” Journal of Patient Safety 15 (3): 246-250. https://doi.org/10.1097/pts.0000000000000270. https://journals.lww.com/journalpatientsafety/Fulltext/2019/09000/Frequency_of_Passive_EHR_Alerts_in_the_ICU_.12.aspx/.
17. Drew et al. 2014.
18. Stead et al. 2011.
19. Khairat, S., C. Coleman, R. Teal, S. Rezk, V. Rand, T. Bice, and S. S. Carson. 2021. “Physician experiences of screen-level features in a prominent electronic health record: Design recommendations from a qualitative study.” Health Informatics J 27 (1): 1460458221997914. https://doi.org/10.1177/1460458221997914.
20. Ratwani, Raj M., Rollin J. Fairbanks, A. Zachary Hettinger, and Natalie C. Benda. 2015. “Electronic health record usability: analysis of the user-centered design processes of eleven electronic health record vendors.” Journal of the American Medical Informatics Association 22 (6): 1179-1182. https://doi.org/10.1093/jamia/ocv050. http://dx.doi.org/10.1093/jamia/ocv050.
21. Zhang, Jiajie, and Muhammad F. Walji. 2011. "TURF: Toward a unified framework of EHR usability." Journal of Biomedical Informatics 44 (6): 1056-1067. https://doi.org/10.1016/j.jbi.2011.08.005. http://www.sciencedirect.com/science/article/pii/S1532046411001328.
22. Hu, Kevin, Michiel Bakker, Stephen Li, Tim Kraska, and Cesar Hidalgo. 2018. "VizML: A Machine Learning Approach to Visualization Recommendation."
23. Institute of Medicine 2013.
24. Randell, Rebecca, Natasha Alvarado, Lynn McVey, Joanne Greenhalgh, Robert M. West, Amanda Farrin, Chris Gale, Roger Parslow, Justin Keen, Mai Elshehaly, Roy A. Ruddle, Julia Lake, Mamas Mamas, Richard Feltbower, and Dawn Dowding. 2020. “How, in what contexts, and why do quality dashboards lead to improvements in care quality in acute hospitals? Protocol for a realist feasibility evaluation.” BMJ Open 10 (2): e033208. https://doi.org/10.1136/bmjopen-2019-033208. http://bmjopen.bmj.com/content/10/2/e033208.abstract.
25. Phekoo, K. J., J. Clements, and D. Bell. 2014. "National Clinical Audit Quality Assessment: Overview of the Self-Assessment Survey 'Audit of Audits.'" London: Healthcare Quality Improvement Partnership.
26. Miller, Tracy, and Sheila Leatherman. 1999. “The National Quality Forum: A ‘Me-Too’ Or A Breakthrough In Quality Measurement And Reporting?” Health Affairs 18 (6): 233-237. https://doi.org/10.1377/hlthaff.18.6.233. https://doi.org/10.1377/hlthaff.18.6.233.
27. Khairat et al. 2018.
28. Dziadzko, M. A., V. Herasevich, A. Sen, B. W. Pickering, A. M. Knight, and P. Moreno Franco. 2016. “User perception and experience of the introduction of a novel critical care patient viewer in the ICU setting.” Int J Med Inform 88: 86-91. https://doi.org/10.1016/j.ijmedinf.2016.01.011.
29. Ahmed, A., S. Chandra, V. Herasevich, O. Gajic, and B. W. Pickering. 2011. “The effect of two different electronic health record user interfaces on intensive care provider task load, errors of cognition, and performance.” Crit Care Med 39 (7): 1626-34. https://doi.org/10.1097/CCM.0b013e31821858a0.
30. Nielsen, Jakob. 1994. "Heuristic evaluation." In Usability Inspection Methods, edited by Jakob Nielsen and Robert L. Mack, 25–62. John Wiley & Sons, Inc.
31. Dowding, D., and J. A. Merrill. 2018. “The Development of Heuristics for Evaluation of Dashboard Visualizations.” Appl Clin Inform 9 (3): 511-518. https://doi.org/10.1055/s-0038-1666842.
32. LeRouge, C., J. Ma, S. Sneha, and K. Tolle. 2013. “User profiles and personas in the design and development of consumer health technologies.” Int J Med Inform 82 (11): e251-68. https://doi.org/10.1016/j.ijmedinf.2011.03.006.
33. Wang, Zhengyi, Man Liang, and Daniel Delahaye. 2020. "Automated data-driven prediction on aircraft Estimated Time of Arrival." Journal of Air Transport Management 88: 101840. https://doi.org/10.1016/j.jairtraman.2020.101840. https://www.sciencedirect.com/science/article/pii/S0969699719304429.
34. Kim, Junghyun, Cedric Justin, Dimitri Mavris, and Simon Briceno. “Data-Driven Approach Using Machine Learning for Real-Time Flight Path Optimization.” Journal of Aerospace Information Systems 0 (0): 1-19. https://doi.org/10.2514/1.I010940. https://arc.aiaa.org/doi/abs/10.2514/1.I010940.
35. Amasaka, Kakuro. 2004. “Applying New JIT-A Management Technology Strategy Model at Toyota-Strategic QCD Studies with Affiliated and Non-affiliated Suppliers.”
36. Ratwani et al. 2015.