Research Article

Quality Of Care

Identifying Electronic Health Record Usability And Safety Challenges In Pediatric Settings

Affiliations
  1. Raj M. Ratwani ([email protected]) is director of the National Center for Human Factors in Healthcare, MedStar Health, and an assistant professor of emergency medicine, Department of Emergency Medicine, Georgetown University School of Medicine, both in Washington, D.C.
  2. Erica Savage is a manager in Ambulatory Quality and Safety, MedStar Health.
  3. Amy Will is a research program manager at the National Center for Human Factors in Healthcare, MedStar Health.
  4. Allan Fong is a research scientist at the National Center for Human Factors in Healthcare, MedStar Health.
  5. Dean Karavite is principal human computer interaction specialist, Department of Biomedical and Health Informatics, Children’s Hospital of Philadelphia, in Pennsylvania.
  6. Naveen Muthu is director of the Cognitive Informatics Group, Department of Biomedical and Health Informatics, Children’s Hospital of Philadelphia, and an instructor of pediatrics, University of Pennsylvania Perelman School of Medicine.
  7. A. Joy Rivera is a senior human factors system engineer at the Children’s Hospital of Wisconsin, in Milwaukee.
  8. Cori Gibson is a safety specialist at the Children’s Hospital of Wisconsin.
  9. Don Asmonga is an officer in the Health Information Technology Initiative, Pew Charitable Trusts, in Washington, D.C.
  10. Ben Moscovitch is the project director of the Health Information Technology Initiative, Pew Charitable Trusts.
  11. Robert Grundmeier is director of clinical informatics, Department of Biomedical and Health Informatics, Children’s Hospital of Philadelphia, and an assistant professor of pediatrics, University of Pennsylvania Perelman School of Medicine.
  12. Josh Rising is director of Healthcare Programs, Pew Health Group, Pew Charitable Trusts.
PUBLISHED: https://doi.org/10.1377/hlthaff.2018.0699

Abstract

Pediatric populations are uniquely vulnerable to the usability and safety challenges of electronic health records (EHRs), particularly those related to medication, yet little is known about the specific issues contributing to hazards. To understand specific usability issues and medication errors in the care of children, we analyzed 9,000 patient safety reports likely related to EHR use, made in the period 2012–17 at three different health care institutions. Of the 9,000 reports, 3,243 (36 percent) had a usability issue that contributed to the medication event, and 609 (18.8 percent) of those 3,243 reached the patient and might have resulted in harm. The general pattern of usability challenges and medication errors was the same across the three sites. The most common usability challenges were associated with system feedback and the visual display. The most common medication error was improper dosing.

Electronic health records (EHRs) have become like the central nervous system of the practice of medicine, helping guide nearly every decision in care; enabling clinicians to order medications and diagnostic tests more seamlessly than when using paper orders; and providing physicians, nurses, and patients with more timely information than when using paper records. EHRs are superior to the paper-based processes they have replaced, in that they eliminate the problem of illegible documentation, provide greater awareness of patients’ health status, and deliver automated decision support.1,2 However, EHR use has its own challenges, such as interoperability, data integrity, and notably usability.3,4 Usability is the extent to which the technology can be used efficiently, effectively, and satisfactorily, based on both the system’s design and how it is customized in the clinical environment to the specific workflows that clinicians employ.5 EHR usability challenges affect clinician burden and—along with other factors such as interoperability, which can limit the sharing of patient information—can contribute to patient harm.6,7

Pediatric patients are uniquely vulnerable to EHR usability and safety challenges because of different physical characteristics, developmental issues, and dependence on parents and other care providers to prevent medical errors.8 For example, lower body weight and less developed immune systems make pediatric patients less able to tolerate even small errors in medication dosing or delays in care that could be a result of EHR usability and safety issues.9,10

The Office of the National Coordinator for Health Information Technology (ONC), the federal agency that oversees EHRs, has policies to promote usability—such as requiring system developers to incorporate feedback from clinicians into software design and development and mandating the testing of twelve high-risk EHR functions that are primarily related to medication.11 These policies have not made a distinction between adult and pediatric populations. The 21st Century Cures Act of 2016 requires the ONC to establish new voluntary criteria unique to EHRs used in the care of children, thus recognizing the importance of ensuring that systems are designed for the unique needs of pediatric patients.

Despite interest in policies to promote the effective use of EHRs, three key factors can undermine efforts to enhance usability and safety for patients, including children. First, some EHR developers might not be adhering to these policies, yet their products are being certified by government-authorized certification bodies as if they were.12 Second, developer usability testing often lacks rigor because test-case scenarios, which are used by EHR vendors to evaluate usability and safety, might not represent real-world clinical care scenarios.13 Importantly, until the passage of the 21st Century Cures Act, there were no requirements for specific EHR developer testing that acknowledged differences in clinician and patient populations, such as children. Finally, the EHR product may undergo tremendous change during implementation because of configuration and customization at each provider site, which may render aspects of EHR developer testing irrelevant, since the product being used is significantly different from what was evaluated.14,15

Studies have shown an association between EHR usability and safety in adult populations.6,16,17 Although not as widely studied, EHR usability challenges have also been found to affect the safety of pediatric patients. For example, a pediatric patient received a thirty-eightfold overdose of an antibiotic because of challenges with EHR usability.18 In other cases, patients have missed necessary medication doses because the EHR defaulted to a different administration time than the physician intended without clearly notifying the physician of the change.6 There has also been an increase in legal claims associated with the EHR, a subset of which may be related to usability.19 However, there is a paucity of data examining EHR usability challenges writ large and their association with patient safety in pediatric populations. Better knowledge of these topics can help inform federal policies shaping EHR usability for pediatric populations and help avert patient harm, even death, associated with EHR use.

With a focus on pediatric populations and medication events (both inpatient and outpatient), we sought to identify the specific types of EHR usability issues and associated medication errors and to examine the patterns of these usability issues across institutions.

Study Data And Methods

Patient Safety Event Report Sources

Pediatric patient safety event report data from inpatient and outpatient settings, entered in the period 2012–17, were retrieved from three large academic health care institutions: two stand-alone pediatric institutions and one combined adult and pediatric institution (two used Epic EHRs, and one used Cerner). Patient safety event reports are voluntarily self-reported descriptions of patient safety events, ranging from safety hazards that did not reach the patient to potential harm events that reached the patient. Each report contains categories to describe the type of event (for example, medication error or fall) and a free-text description of the safety event and possible causal factors. All pediatric reports from the study period at all three sites were included for analysis.

Identifying Electronic Health Record And Medication Reports

To identify EHR usability and safety challenges, each site started with its patient safety event reports from the study period (more than 50,000 per site) and filtered the safety event report data to retrieve 3,000 events per site that had a high likelihood of being related to the EHR and medication. The 3,000 events were identified using an algorithm that analyzes reports’ free text for language related to EHR and medication issues20 (see the online appendix for a technical description).21 The 9,000 event reports were then manually coded by the research team (all of whom are authors), as described below.
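The published classification algorithm is described in the appendix and in Fong and colleagues’ work;20 purely as a hedged illustration of the filtering idea, the sketch below flags reports whose narratives mention both EHR-related and medication-related language. The trigger terms, field names, and function names are hypothetical stand-ins, not the study’s actual feature set.

```python
# Illustrative only: the study's algorithm (ref. 20) uses a richer,
# validated feature set than this simple keyword screen.
EHR_TERMS = {"ehr", "epic", "cerner", "order entry", "alert", "flowsheet"}
MEDICATION_TERMS = {"dose", "medication", "mg", "infusion", "pharmacy"}

def mentions_any(text: str, terms: set) -> bool:
    """True if the free-text narrative contains any trigger term."""
    lowered = text.lower()
    return any(term in lowered for term in terms)

def likely_ehr_medication_event(report: dict) -> bool:
    """Flag reports whose narrative suggests both an EHR and a medication issue."""
    narrative = report.get("free_text", "")  # hypothetical field name
    return (mentions_any(narrative, EHR_TERMS)
            and mentions_any(narrative, MEDICATION_TERMS))

def top_candidates(reports: list, n: int = 3000) -> list:
    """Keep up to n flagged reports per site, mirroring the 3,000-report cut."""
    return [r for r in reports if likely_ehr_medication_event(r)][:n]
```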

Safety Event Report Coding

The 9,000 reports—3,000 from each site—were reviewed to verify whether the events were related to the EHR and medication; determine whether EHR usability contributed to the event and, if it did, identify what the specific usability challenge was; identify the type of medication error; and identify whether the event reached the patient.

Reports were determined to be related to the EHR and medication if the EHR was associated with the safety event and if the event was due to a medication issue. EHR usability was considered a contributing factor to the event if the report contained language suggesting that the design of, and user interaction with, the EHR at least partially contributed to the event. To code the usability events, we reviewed existing usability taxonomies, integrated their relevant categories, and derived our own usability taxonomy.22–31 The usability taxonomy we used is in appendix exhibit A1.21 If an event was usability related, it was assigned to one of four general usability categories and to a specific subcategory within the general one. The four general categories were system feedback, defined as the EHR failing to provide appropriate feedback to the user; visual display, defined as a confusing or cluttered EHR information display; data entry, defined as difficult or impossible EHR data entry; and workflow support, defined as a mismatch between EHR workflow and the expectations of the clinician.

If a report contained multiple EHR usability contributing factors, only the primary usability challenge was coded. To code the type of medication error, we used a modified version of the categories from the National Coordinating Council for Medication Error Reporting and Prevention’s National Medication Errors Reporting Program. There were nine medication error categories: improper dose, wrong strength/concentration, wrong drug, wrong dosage form/technique/route, wrong rate, wrong time, wrong patient, monitoring error, and all other medication issues, or “other” (appendix exhibit A2).21
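As an illustration of how the coding scheme fits together, the categories above can be represented as a simple lookup structure. The category names come from the article and its appendix exhibits; the dictionary layout and the validation helper are our sketch, not the study’s coding instrument.

```python
# General usability categories and their subcategories (appendix exhibit A1;
# subcategory names as reported in exhibit 2).
USABILITY_TAXONOMY = {
    "system feedback": [
        "suboptimal clinical decision support or error prevention",
        "no feedback to user about system actions",
        "automation or conversion with no clear feedback",
        "wrong feedback about system actions",
    ],
    "visual display": [
        "suboptimal interface between applications",
        "hard to find or confusing information display",
        "wrong information displayed",
        "alert difficult to interpret",
    ],
    "data entry": [
        "confusing data entry (where to put info)",
        "work-around needed",
        "unable to enter desired information",
    ],
    "workflow support": [
        "mismatch between mental model and actual system",
        "lack of support for communication",
        "suboptimal teamwork support",
    ],
}

# Modified NCC MERP medication error categories (appendix exhibit A2).
MEDICATION_ERRORS = [
    "improper dose", "wrong strength/concentration", "wrong drug",
    "wrong dosage form/technique/route", "wrong rate", "wrong time",
    "wrong patient", "monitoring error", "other",
]

def valid_coding(general: str, subcategory: str, med_error: str) -> bool:
    """Check that a coded event uses a known category/subcategory pair."""
    return (subcategory in USABILITY_TAXONOMY.get(general, [])
            and med_error in MEDICATION_ERRORS)
```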

To determine whether the safety event reached the patient, the free-text description was analyzed and coded. Some reports contained an explicit data field indicating the level of patient harm; if available, it was included in the analysis.

Two researchers from each institution coded their respective report data. For each usability-related report, the specific usability subcategory was assigned by joint discussion between the pair of researchers. The appendix describes the coding process in greater detail.21

Limitations

Our study had some limitations. First, the focus of the research was to evaluate how EHRs can contribute to harm in pediatric patients. Therefore, we analyzed events most likely to be associated with EHR use. Our methodology did not provide insights into the overall likelihood of an EHR-related safety event during patient care.

Second, the participating institutions each examined 3,000 reports out of more than 50,000, so the number of EHR usability and safety events at each institution is likely much higher than the number we identified.

Third, patient safety events are typically underreported, so it is not possible to know the total number of safety hazards at a given institution. Research has shown that clinicians often do not report safety events because reporting systems take too long to use and reporting clinicians often do not receive feedback from the institution.32

Study Results

Of the 9,000 patient safety event reports that we collected, 56.4 percent (5,079 events in all, 849 at site 1, 2,550 at site 2, and 1,680 at site 3) were confirmed as being related to both the EHR and medication. Of these 5,079, 63.9 percent (3,243) had a usability issue as a contributing factor to the safety event (484 events at site 1, 1,873 at site 2, and 886 at site 3), which amounts to 36 percent of the 9,000 reports analyzed. Of the 3,243 reports that had usability as a contributing factor, 18.8 percent (609) reached the patient. Of these, 33 percent (201) did not cause harm and did not require monitoring, 17.9 percent (109) required monitoring or an intervention to prevent harm, 3.3 percent (20) resulted in temporary harm, and the consequence was unknown for 45.8 percent (279) (data not shown).
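Because each percentage in this paragraph uses a different denominator (all 9,000 reports, the 5,079 EHR- and medication-related reports, or the 3,243 usability-related reports), the nesting is easy to misread. The short calculation below, using only the counts reported above, reproduces the figures.

```python
# Counts reported above; note that each percentage has a different denominator.
total_reports = 9_000
ehr_med = 849 + 2_550 + 1_680       # 5,079 EHR- and medication-related
usability = 484 + 1_873 + 886       # 3,243 with a usability contributing factor
reached_patient = 609

print(f"{100 * ehr_med / total_reports:.1f}%")      # 56.4% of 9,000
print(f"{100 * usability / ehr_med:.1f}%")          # 63.9% of 5,079
print(f"{100 * usability / total_reports:.1f}%")    # 36.0% of 9,000
print(f"{100 * reached_patient / usability:.1f}%")  # 18.8% of 3,243
```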

The general pattern of usability challenges was the same at each site (exhibit 1). System feedback was the most common usability challenge at every site (82.4 percent of all usability-related reports), followed by visual display (9.7 percent), data entry (6.2 percent), and workflow support (1.7 percent) (exhibit 2). Within each general usability challenge category there was some variability across the sites in the specific usability subcategories contributing to the safety event reports.

Exhibit 1 Electronic health record (EHR) usability issues, by general category and pediatric academic health care site

SOURCE Authors’ analysis of coded events. NOTE The four general categories of events related to EHR usability are defined in the text.

Exhibit 2 General and specific electronic health record (EHR) usability issue categories, by pediatric academic health care site

| Category | Site 1, % (no.) | Site 2, % (no.) | Site 3, % (no.) | All, % (no.) |
|---|---|---|---|---|
| System feedback | 54.0 (261/484) | 93.5 (1,750/1,873) | 74.5 (660/886) | 82.4 (2,671/3,243) |
| – Suboptimal clinical decision support or error prevention | 70.8 (185/261) | 97.4 (1,709/1,750) | 91.7 (605/660) | 93.6 (2,499/2,671) |
| – No feedback to user about system actions | 7.8 (20) | 1.0 (21) | 5.6 (37) | 2.9 (78) |
| – Automation or conversion with no clear feedback | 18.0 (47) | 1.4 (17) | 0.0 (0) | 2.4 (64) |
| – Wrong feedback about system actions | 3.4 (9) | 0.2 (3) | 2.7 (18) | 1.1 (30) |
| Visual display | 28.9 (140/484) | 2.9 (54/1,873) | 13.8 (122/886) | 9.7 (316/3,243) |
| – Suboptimal interface between applications | 37.1 (52/140) | 7.4 (4/54) | 49.2 (60/122) | 36.7 (116/316) |
| – Hard to find or confusing information display | 34.3 (48) | 31.5 (17) | 32.0 (39) | 33.0 (104) |
| – Wrong information displayed | 26.4 (37) | 59.3 (32) | 13.1 (16) | 26.8 (85) |
| – Alert difficult to interpret | 2.2 (3) | 1.8 (1) | 5.7 (7) | 3.5 (11) |
| Data entry | 13.0 (63/484) | 2.3 (44/1,873) | 10.6 (94/886) | 6.2 (201/3,243) |
| – Confusing data entry (where to put info) | 36.5 (23/63) | 29.6 (13/44) | 59.6 (56/94) | 45.8 (92/201) |
| – Work-around needed | 36.5 (23) | 22.7 (10) | 30.9 (29) | 30.8 (62) |
| – Unable to enter desired information | 27.0 (17) | 47.7 (21) | 9.5 (9) | 23.4 (47) |
| Workflow support | 4.1 (20/484) | 1.3 (25/1,873) | 1.1 (10/886) | 1.7 (55/3,243) |
| – Mismatch between mental model and actual system | 15.0 (3/20) | 56.0 (14/25) | 40.0 (4/10) | 38.2 (21/55) |
| – Lack of support for communication | 70.0 (14) | 8.0 (2) | 20.0 (2) | 32.7 (18) |
| – Suboptimal teamwork support | 15.0 (3) | 36.0 (9) | 40.0 (4) | 29.1 (16) |

SOURCE Authors’ analysis of coded events. NOTE The four general categories of events related to EHR usability are defined in the text.

The pattern of medication errors was generally the same across the sites (exhibit 3). The most common medication error for all three sites combined was improper dose (84.5 percent), followed by other (5.9 percent) and wrong time (3.5 percent).

Exhibit 3 Medication errors related to electronic health records (EHRs), by reporting categories and pediatric academic health care sites

| Category | Site 1, % (no.) | Site 2, % (no.) | Site 3, % (no.) | All, % (no.) |
|---|---|---|---|---|
| Improper dose (including dose omission and wrong duration) | 62.2 (301/484) | 94.6 (1,772/1,873) | 75.3 (667/886) | 84.5 (2,740/3,243) |
| Wrong strength or concentration | 1.0 (5) | 0.2 (4) | 1.1 (10) | 0.6 (19) |
| Wrong drug | 4.5 (22) | 0.8 (15) | 3.5 (31) | 2.1 (68) |
| Wrong dosage form/technique/route | 1.7 (8) | 0.6 (12) | 1.6 (14) | 1.0 (34) |
| Wrong rate | 2.7 (13) | 0.7 (13) | 0.2 (2) | 0.8 (28) |
| Wrong time | 9.5 (46) | 1.5 (28) | 4.3 (38) | 3.5 (112) |
| Wrong patient | 0.0 (0) | 0.2 (3) | 2.1 (19) | 0.7 (22) |
| Monitoring error | 1.5 (7) | 0.2 (4) | 2.2 (19) | 0.9 (30) |
| Other | 16.9 (82) | 1.2 (22) | 9.7 (86) | 5.9 (190) |

SOURCE Authors’ analysis of coded events. NOTE The categories are modified from the National Coordinating Council for Medication Error Reporting and Prevention’s National Medication Errors Reporting Program.

When we analyzed the 609 usability events that reached the patient, combining data from all sites, we found that more than half of these events were improper dose errors associated with system feedback and visual display issues (exhibit 4).

Exhibit 4 Reports of events that reached the patient, by medication error and electronic health record (EHR) usability issue category, all pediatric academic health care sites combined

| Category | System feedback, % (no.) | Visual display, % (no.) | Data entry, % (no.) | Workflow support, % (no.) |
|---|---|---|---|---|
| Improper dose (including dose omission and wrong duration) | 36.9 (225) | 18.7 (114) | 6.9 (42) | 2.5 (15) |
| Wrong time | 5.6 (34) | 4.6 (28) | 1.5 (9) | 1.0 (6) |
| Other | 3.0 (18) | 1.8 (11) | 2.0 (12) | 0.7 (4) |
| Wrong drug | 2.3 (14) | 1.8 (11) | 0.5 (3) | 0.3 (2) |
| Monitoring error | 2.4 (15) | 0.2 (1) | 0.0 (0) | 0.0 (0) |
| Wrong rate | 1.3 (8) | 0.8 (5) | 0.2 (1) | 0.0 (0) |
| Wrong strength/concentration | 0.8 (5) | 1.0 (6) | 0.0 (0) | 0.2 (1) |
| Wrong dosage form/technique/route | 0.7 (4) | 0.5 (3) | 0.2 (1) | 0.2 (1) |
| Wrong patient | 0.7 (4) | 0.2 (1) | 0.2 (1) | 0.0 (0) |
| Multiple | 0.5 (3) | 0.2 (1) | 0.0 (0) | 0.0 (0) |

SOURCE Authors’ analysis of coded events. NOTE N=609.

Discussion

In the first evaluation of its kind, we found that nearly two-thirds of safety reports related to the EHR and medication at three pediatric hospitals were associated with usability issues. Furthermore, nearly one-fifth of these events reached the patient, with 129 reported as requiring monitoring or intervention to prevent harm or as causing temporary harm. While EHRs have improved care and safety under certain circumstances, these findings suggest that thousands of patients may be put at risk because of usability challenges that stem from the design, implementation, customization, or use of this technology.

Previous research has shown variability in the usability and safety of health information technology across health care institutions.33–36 For example, when a computerized provider order entry simulation tool developed by the Leapfrog Group was applied at forty-one pediatric hospitals to test the safety of order entry and clinical decision support, rates of potential medication errors in the test scenarios ranged from 23 percent to 91 percent across the institutions.33

Although this study was not designed to assess how frequently usability and safety events occurred across sites, it did identify trends and differences among the sites in the types of usability issues and associated medication errors that arose.

Where our research identified similar patterns of usability and safety challenges across institutions, those similarities may suggest a systemic problem that necessitates a broader, more comprehensive solution. The most frequent hazards across sites were associated with system feedback, most commonly related to suboptimal clinical decision support or error prevention.

One example of an event in this category occurred when a physician ordered five times the recommended dose of a medication and received no alert from the EHR, although the prescribed dose was outside the recommended range. Both vendor design and development and provider implementation and customization may contribute to the challenges associated with system feedback. To address this systemic problem, vendors and providers should consider developing more comprehensive design guidelines and using generalizable tools to assess usability and safety. The Leapfrog tool,33 which assesses clinical decision support functionality, is one example of the types of tools that could improve the safety of implemented EHR products.

Alternatively, where the types of usability issues varied across sites, an outlier organization could indicate site-specific hazards arising from different workflows, implementation and customization decisions, and the availability of institutional knowledge and resources to make EHRs safer for patients. For example, in the visual display and data entry usability categories, site 2 had very few events. This difference may suggest that the site’s implementation had been optimized to support the display of information and data entry, and there may be an opportunity for other institutions to learn from that site. Identifying institutions that have optimized usability and safety, and understanding how their health information technology systems support improved patient safety, is critical to reducing hazards across institutions.

This study also identified improper dose (including dose omission and wrong duration) as the most prominent category of medication error; these errors were most often associated with system feedback and visual display usability challenges. These types of dosing errors can be particularly detrimental to pediatric populations. Health care institutions and vendors should focus on the interpretability of alerts, the way in which dosing information is presented to clinicians, and the accuracy of this information.

Functionality And Usability

Assessing the results of our research in the context of other studies on EHR safety highlights the critical importance of testing functionality and usability throughout the product’s life cycle, including during design and after implementation. The Leapfrog tool33 is a rigorous approach to assessing the functionality of clinical decision support alerts. The tool determines whether alerts are triggered, given particular information present in the EHR. For example, if a test case contained a patient with a documented allergy to a medication and the provider prescribed that medication, the tool would assess whether the clinical decision support functioned appropriately and issued an alert.
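As a toy illustration of this kind of functionality check (not the Leapfrog tool’s actual implementation), the sketch below runs an allergy test case against a stubbed order entry interface and asserts that an alert fires. All class and method names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Alert:
    kind: str
    message: str

class StubEHR:
    """Hypothetical stand-in for the order entry system under test."""

    def place_order(self, patient: dict, drug: str) -> list:
        # A real assessment would drive the production EHR; this stub mimics
        # decision support that screens orders against documented allergies.
        if drug in patient.get("allergies", []):
            return [Alert("allergy", f"Patient is allergic to {drug}")]
        return []

def test_allergy_alert_fires():
    patient = {"id": "TEST-1", "allergies": ["penicillin"]}
    alerts = StubEHR().place_order(patient, "penicillin")
    # Functionality check: an allergy alert must be triggered. Passing this
    # test says nothing about how usable the alert is on screen.
    assert any(a.kind == "allergy" for a in alerts), "allergy alert did not fire"

test_allergy_alert_fires()
```

The closing comment is the crux of the paragraph that follows: a triggered alert can satisfy a functionality test yet still fail the clinician if it is poorly presented.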

Equally important as appropriate EHR functionality is ensuring usability. While the Leapfrog tool assesses functionality, our study focused on usability with real-world safety events. Many alerts may be triggered during real-world use, satisfying the Leapfrog criteria, but these alerts might not be effectively presented to the clinician. These types of usability issues would not be detected by the tool but would have been accounted for in the data used in this study if safety events had been reported. Together, the Leapfrog results and ours demonstrate the need to examine both appropriate functionality and usability to promote safety.

Policy Opportunities

The 21st Century Cures Act, passed with bipartisan support, includes two provisions that enable the ONC to prioritize the usability and safety of EHRs. First, the law requires the ONC to develop voluntary criteria to certify EHRs used in the care of children; current criteria apply to EHRs generally. Given that our study found significant numbers of safety hazards in EHRs used in the care of children and that many of these events represent challenges prevalent in pediatrics (for example, weight-based dosing and care for newborns), the ONC should prioritize safety in developing these regulations. Many influential organizations—including the Children’s Hospital Association—have called on the ONC to address in its regulations these safety-related issues that are prevalent in the pediatric population.37

Second, the law requires the ONC to develop a reporting program that collects improved, real-world data on the functions of different EHR systems—including on their usability. Given the critical intersection of safety and usability, the ONC should ensure that some of the usability-related measures collect information on harm associated with the use of systems. Congress should conduct oversight to ensure that the agency embeds safety in the implementation of both provisions of the law.

EHR developers and hospitals also have an opportunity to improve their current safety processes to better test systems for potential safety issues during development and after implementation.38 Developers use test cases that replicate different clinical scenarios to evaluate product functions, both to help improve systems and to meet federal regulations for certification. However, regulations do not require that these test cases be robust, focus on safety, or evaluate the EHR’s use after implementation across different customizations and workflows. The Pew Charitable Trusts, MedStar Health’s National Center for Human Factors in Healthcare, and the American Medical Association have released robust, safety-focused test cases that serve as examples of how to test systems for safety during development and after implementation.39 Using more robust test cases during development and implementation could reduce the number of safety hazards and the variability across health care institutions.

The Joint Commission should incorporate health information technology safety, particularly usability and pediatric-specific factors, into its accreditation standards. Given the Joint Commission’s critical role in hospital accreditation for Medicare, criteria from the organization on EHR safety would significantly catalyze progress on a national scale. Similarly, the Association for the Advancement of Medical Instrumentation is developing a set of safety standards.40

Conclusion

In our analysis of 9,000 patient safety reports from three institutions using Epic or Cerner EHRs, we found that 36 percent were related to usability issues, the most common of which were system feedback and visual display challenges. Many of these safety-related usability issues contributed to patient harm. To better prevent usability-related medical errors, the ONC could include safety as part of the voluntary certification criteria of EHRs for use with children and implement usability-related measures to assess EHR performance. Vendors and providers should use rigorous test-case scenarios based on realistic clinician tasks. Finally, the Joint Commission should assess EHR safety as part of its hospital accreditation program. The implementation of approaches such as these is needed to reduce patient harm related to EHR use.

ACKNOWLEDGMENTS

The categorization of patient safety event reports was sponsored by the Pew Charitable Trusts. Funding for report analysis was provided by the Pew Charitable Trusts to Raj Ratwani (MedStar Health). Algorithm development was funded by the Agency for Healthcare Research and Quality (AHRQ), Department of Health and Human Services (HHS) (Grant No. R01 HS023701-02 to Ratwani). The opinions expressed in this document are those of the authors and do not reflect the position of AHRQ or HHS. The authors thank Amina Khan, Trenya Garner, James Won, Jeanette Teets, Svetlana Ostapenko, Glenn Bushee, and Allan Coukell for support.

NOTES

  • 1 King J, Patel V, Jamoom EW, Furukawa MF. Clinical benefits of electronic health record use: national findings. Health Serv Res. 2014;49(1 Pt 2):392–404.Crossref, MedlineGoogle Scholar
  • 2 Radley DC, Wasserman MR, Olsho LE, Shoemaker SJ, Spranca MD, Bradshaw B. Reduction in medication errors in hospitals due to adoption of computerized provider order entry systems. J Am Med Inform Assoc. 2013;20(3):470–6.Crossref, MedlineGoogle Scholar
  • 3 Zahabi M, Kaber DB, Swangnetr M. Usability and safety in electronic medical records interface design: a review of recent literature and guideline formulation. Hum Factors. 2015;57(5):805–34.Crossref, MedlineGoogle Scholar
  • 4 Ellsworth MA, Dziadzko M, O’Horo JC, Farrell AM, Zhang J, Herasevich V. An appraisal of published usability evaluations of electronic health records via systematic review. J Am Med Inform Assoc. 2017;24(1):218–26.Crossref, MedlineGoogle Scholar
  • 5 International Organization for Standardization. ISO 9241-210:2010: ergonomics of human-system interaction—part 210: human-centred design for interactive systems. Geneva: ISO; 2010. Google Scholar
  • 6 Howe JL, Adams KT, Hettinger AZ, Ratwani RM. Electronic health record usability issues and potential contribution to patient harm. JAMA. 2018;319(12):1276–8.Crossref, MedlineGoogle Scholar
  • 7 Middleton B, Bloomrosen M, Dente MA, Hashmat B, Koppel R, Overhage JMet al. Enhancing patient safety and quality of care by improving the usability of electronic health record systems: recommendations from AMIA. J Am Med Inform Assoc. 2013;20(e1):e2–8.Crossref, MedlineGoogle Scholar
  • 8 Steering Committee on Quality Improvement and Management and Committee on Hospital Care. Policy statement—principles of pediatric patient safety: reducing harm due to medical care. Pediatrics. 2011;127(6):1199–210.Crossref, MedlineGoogle Scholar
  • 9 Koren G, Barzilay Z, Greenwald M. Tenfold errors in administration of drug doses: a neglected iatrogenic disease in pediatrics. Pediatrics. 1986;77(6):848–9.MedlineGoogle Scholar
  • 10 Kaushal R, Bates DW, Landrigan C, McKenna KJ, Clapp MD, Federico Fet al. Medication errors and adverse drug events in pediatric inpatients. JAMA. 2001;285(16):2114–20.Crossref, MedlineGoogle Scholar
  • 11 Office of the National Coordinator for Health Information Technology (ONC), Department of Health and Human Services (HHS). 2015 edition health information technology (health IT) certification criteria, 2015 edition base electronic health record (EHR) definition, and ONC Health IT Certification Program modifications. Final rule. Fed Regist. 2015;80(200):62601–759.MedlineGoogle Scholar
  • 12 Ratwani RM, Benda NC, Hettinger AZ, Fairbanks RJ. Electronic health record vendor adherence to usability certification requirements and testing standards. JAMA. 2015;314(10):1070–1.Crossref, MedlineGoogle Scholar
  • 13 Ratwani RM, Zachary Hettinger A, Kosydar A, Fairbanks RJ, Hodgkins ML. A framework for evaluating electronic health record vendor user-centered design and usability testing processes. J Am Med Inform Assoc. 2017;24(e1):e35–9.MedlineGoogle Scholar
  • 14 Lorenzi NM, Kouroubali A, Detmer DE, Bloomrosen M. How to successfully select and implement electronic health records (EHR) in small ambulatory practice settings. BMC Med Inform Decis Mak. 2009;9:15.Crossref, MedlineGoogle Scholar
  • 15 Ratwani R, Fairbanks T, Savage E, Adams K, Wittie M, Boone Eet al. Mind the gap. A systematic review to identify usability and safety challenges and practices during electronic health record implementation. Appl Clin Inform. 2016;7(4):1069–87.Crossref, MedlineGoogle Scholar
  • 16 Minshall S. A review of healthcare information system usability and safety. Stud Health Technol Inform. 2013;183:151–6.MedlineGoogle Scholar
  • 17 Zahabi M, Kaber DB, Swangnetr M. Usability and safety in electronic medical records interface design: a review of recent literature and guideline formulation. Hum Factors. 2015;57(5):805–34.Crossref, MedlineGoogle Scholar
  • 18 Wachter R. The digital doctor: hope, hype, and harm at the dawn of medicine’s computer age. New York (NY): McGraw-Hill Education; 2015. Google Scholar
  • 19 Troxel DB. Electronic health record closed claims study: an expert analysis of medical malpractice allegations [Internet]. Napa (CA): Doctors Company; c 2017 [cited 2018 Oct 1]. Available from: https://www.thedoctors.com/siteassets/pdfs/risk-management/closed-claims-studies/11220_ccs_ehr_1017_single-page_fr2.pdf Google Scholar
  • 20 Fong A, Adams KT, Gaunt MJ, Howe JL, Kellogg K, Ratwani RM. Identifying health information technology related safety event reports from patient safety event report databases. J Biomed Inform. 2018 Sep 10. [Epub ahead of print].Crossref, MedlineGoogle Scholar
  • 21

    To access the appendix, click on the Details tab of the article online.

    Google Scholar
  • 22 Nielsen J. 10 usability heuristics for user interface design [Internet]. Fremont (CA): Nielsen Norman Group; 1995 Jan 1 [cited 2018 Oct 1]. Available from: http://www.nngroup.com/articles/ten-usability-heuristics/ Google Scholar
  • 23 Shneiderman B, Plaisant C. Designing the user interface: strategies for effective human-computer interaction. 5th edition. Boston (MA): Pearson Addison-Wesley; 2009. p. 606. Google Scholar
  • 24 Agency for Healthcare Research and Quality. Common Formats for Event Reporting—Hospital Version 2.0a [Internet]. Rockville (MD): AHRQ; 2012 [cited 2018 Oct 16]. Available from: https://www.psoppc.org/psoppc_web/publicpages/commonFormatsHV2.0 Google Scholar
  • 25 Walker JM, Hassol A, Bradshaw B, Rezaee ME. Health IT hazard manager beta-test: final report [Internet]. Rockville (MD): Agency for Healthcare Research and Quality; 2012 May [cited 2018 Oct 1]. (AHRQ Publication No. 12-0058-EF). Available from: https://healthit.ahrq.gov/sites/default/files/docs/citation/HealthITHazardManagerFinalReport.pdf Google Scholar
  • 26 Magrabi F, Ong MS, Runciman W, Coiera E. An analysis of computer-related patient safety incidents to inform the development of a classification. J Am Med Inform Assoc. 2010;17(6):663–70.Crossref, MedlineGoogle Scholar
  • 27 Meeks DW, Takian A, Sittig DF, Singh H, Barber N. Exploring the sociotechnical intersection of patient safety and electronic health record implementation. J Am Med Inform Assoc. 2014;21(e1):e28–34.Crossref, MedlineGoogle Scholar
  • 28 Castro GM, Buczkowski L, Hafner JM. The contribution of sociotechnical factors to health information technology–related sentinel events. Jt Comm J Qual Patient Saf. 2016;42(2):70–6.Crossref, MedlineGoogle Scholar
  • 29 Menon S, Singh H, Giardina TD, Rayburn WL, Davis BP, Russo EMet al. Safety huddles to proactively identify and address electronic health record safety. J Am Med Inform Assoc. 2017;24(2):261–7.MedlineGoogle Scholar
  • 30 Galanter WL, Bryson ML, Falck S, Rosenfield R, Laragh M, Shrestha Net al. Indication alerts intercept drug name confusion errors during computerized entry of medication orders. PLoS One. 2014;9(7):e101977.Crossref, MedlineGoogle Scholar
  • 31 Schiff GD, Hickman TT, Volk LA, Bates DW, Wright A. Computerised prescribing for safer medication ordering: still a work in progress. BMJ Qual Saf. 2016;25(5):315–9.Crossref, MedlineGoogle Scholar
  • 32 Noble DJ, Pronovost PJ. Underreporting of patient safety incidents reduces health care’s ability to quantify and accurately measure harm reduction. J Patient Saf. 2010;6(4):247–50.Crossref, MedlineGoogle Scholar
  • 33 Chaparro JD, Classen DC, Danforth M, Stockwell DC, Longhurst CA. National trends in safety performance of electronic health record systems in children’s hospitals. J Am Med Inform Assoc. 2017;24(2):268–74.MedlineGoogle Scholar
  • 34 Metzger J, Welebob E, Bates DW, Lipsitz S, Classen DC. Mixed results in the safety performance of computerized physician order entry. Health Aff (Millwood). 2010;29(4):655–63.Go to the articleGoogle Scholar
  • 35 Ratwani RM, Savage E, Will A, Arnold R, Khairat S, Miller Ket al. A usability and safety analysis of electronic health records: a multi-center study. J Am Med Inform Assoc. 2018;25(9):1197–201.Crossref, MedlineGoogle Scholar
  • 36 Slight SP, Seger DL, Franz C, Wong A, Bates DW. The national cost of adverse drug events resulting from inappropriate medication-related alert overrides in the United States. J Am Med Inform Assoc. 2018;25(9):1183–8.Crossref, MedlineGoogle Scholar
  • 37 American College of Cardiology, American College of Physicians, American Medical Group Association, American Nurses Association, Children’s Hospital Association, Pew Charitable Trusts. Letter to Don Rucker, national coordinator for health information technology [Internet]. Philadelphia (PA): Pew Charitable Trusts; 2017 Sep 29 [cited 2018 Oct 1]. Available from: https://www.pewtrusts.org/-/media/assets/2017/09/pew-pediatric-letter.pdf Google Scholar
  • 38 Karsh BT. Beyond usability: designing effective technology implementation systems to promote patient safety. Qual Saf Health Care. 2004;13(5):388–94.Crossref, MedlineGoogle Scholar
  • 39 Pew Charitable Trusts, American Medical Association, National Center for Human Factors in Healthcare. Ways to improve electronic health record safety: rigorous testing and establishment of voluntary criteria can protect patients [Internet]. Philadelphia (PA): Pew Charitable Trusts; 2018 Aug [cited 2018 Oct 2]. Available from: https://www.pewtrusts.org/en/research-and-analysis/reports/2018/08/28/ways-to-improve-electronic-health-record-safety Google Scholar
  • 40 Association for the Advancement of Medical Instrumentation. AAMI launches health IT standards initiative [Internet]. Arlington (VA): AAMI; 2015 Aug [cited 2018 Oct 2]. Available from: http://www.aami.org/productspublications/articledetail.aspx?ItemNumber=2663 Google Scholar