Research Article
Quality Of Care
Scaling Safety: The South Carolina Surgical Safety Checklist Experience
- William R. Berry ([email protected]) is an associate director, senior adviser, chief implementation officer, and interim director of the Implementation Platform, all at Ariadne Labs, and a principal research scientist in the Department of Health Policy and Management, Harvard T. H. Chan School of Public Health, all in Boston, Massachusetts.
- Lizabeth Edmondson is a senior program manager at Ariadne Labs.
- Lorri R. Gibbons is vice president for development, South Carolina Hospital Research and Education Foundation, South Carolina Hospital Association, in Columbia.
- Ashley Kay Childers is project lead, Safe Surgery South Carolina, South Carolina Hospital Association, and a senior lecturer in the College of Engineering, Computing, and Applied Sciences, Clemson University.
- Alex B. Haynes is the director of the Safe Surgery Program at Ariadne Labs; an associate professor in the Department of Surgery, Massachusetts General Hospital; and a research associate at the Harvard T. H. Chan School of Public Health, all in Boston.
- Richard Foster is executive director of the Alliance for a Healthier South Carolina, South Carolina Hospital Association.
- Sara J. Singer is an adjunct professor of health care management and policy in the Department of Health Policy and Management, Harvard T. H. Chan School of Public Health; an affiliate member of Ariadne Labs; and a professor of primary care and population health in the School of Medicine and of organizational behavior in the Graduate School of Business at Stanford University, in California.
- Atul A. Gawande is the founding executive director of Ariadne Labs; a general and endocrine surgeon at Brigham and Women’s Hospital; and a professor in the Department of Health Policy and Management, Harvard T. H. Chan School of Public Health. He was also recently named CEO of a new health initiative founded by Amazon, Berkshire Hathaway, and JPMorgan Chase.
Abstract
Proven patient safety solutions such as the World Health Organization’s Surgical Safety Checklist are challenging to implement at scale. A voluntary initiative was launched in South Carolina hospitals in 2010 to encourage use of the checklist in all operating rooms. Hospitals that reported completing implementation of the checklist in their operating rooms by 2017 had significantly higher levels of CEO and physician participation and engaged more in higher-touch activities such as in-person meetings and teamwork skills trainings than comparison hospitals did. Based on our experience and the participation data collected, we suggest three considerations for hospital, hospital association, state, and national policy makers: Successful programs must be designed to engage all stakeholders (CEOs, physicians, nurses, surgical technologists, and others); offering a variety of program activities—both lower-touch and higher-touch—over the duration of the program allows more hospital and individual participation; and change takes time and resources.
The delivery of highly reliable care to patients is a challenge in every health care environment. This demands creative solutions and the necessary investments to make certain those solutions reach patients. In 2010 Ariadne Labs at the Harvard T. H. Chan School of Public Health and Brigham and Women’s Hospital collaborated with the South Carolina Hospital Association to form a team to launch the Safe Surgery South Carolina Program. The authors are or have been members of this program team. The primary goal of the program team was to establish the routine use of a modified version of the Surgical Safety Checklist of the World Health Organization (WHO) (online appendix A) in operating rooms across the state.1 Another goal was to create tools to further US and global efforts to implement the checklist.
The WHO Surgical Safety Checklist is a tool to improve the safety of patients who undergo surgical procedures in all settings. Studies have demonstrated that when the checklist is fully implemented, complication rates, surgical mortality rates, and operating room teamwork and culture improve.2,3 Despite widespread adoption of the checklist globally, its implementation has been variable, and many implementation barriers have been encountered.4–7 In some countries national governments have mandated use of the checklist, but in the United States its implementation in hospitals is voluntary.
In the US a collaboration between Ariadne Labs and the Institute for Healthcare Improvement initiated a large-scale voluntary implementation effort, the Sprint Challenge. At the Institute for Healthcare Improvement National Forum in December 2008, hospitals were asked to test the checklist in their own surgical settings, provide feedback, and begin local implementation. The primary lessons from this work were as follows: While the checklist is elegantly simple in form, it is a complex organizational intervention and can present challenges in implementation. Effective implementation requires changes in both surgical workflow and surgical team behavior, as the checklist calls for both checks of process completion and prompts for team discussions. During local checklist modification, many hospitals added process checks and removed prompts for team discussion, likely reducing the checklist’s potential impact as a communication tool.
The Safe Surgery South Carolina Program was built upon lessons learned from the Sprint Challenge and the hospitals that successfully implemented the Surgical Safety Checklist since its release. The program also incorporated principles from other large-scale US quality improvement efforts.8–10 The lessons from the program are likely generalizable in the US, as South Carolina hospitals largely mirror US hospitals in terms of types of communities served, size, scope of services provided, ownership, existing quality improvement infrastructure, culture, and experience with quality improvement in operating rooms. Elements of the program have evolved since it began, and the effort is ongoing.
At the onset of the program in 2010, every hospital CEO in South Carolina publicly committed their hospitals to participate in the program. Initially, lower-touch, distance-support methods that required minimal resources (webinars and coaching calls) were offered to participants. As the program matured, higher-touch activities (in-person meetings, on-site coaching visits, teamwork skills trainings, and submission of information to the program team) were added to more effectively reach hospitals with low participation and better meet the needs of participants. Full descriptions of program activities and a program timeline are in appendix B.1 Implementation science research was conducted concurrently to enable real-time feedback to participating hospitals and the program team.11,12
A self-certification program was created in 2015 to allow hospitals to audit and evaluate their own performance in using the Surgical Safety Checklist and attest to its regular use by declaring publicly that every surgical patient in their institution benefited from its use. Hospitals also must provide letters of support from clinical champions from all surgery disciplines (anesthesia professionals, nurses, surgical technologists, and surgeons) in the hospital when they apply for self-certification. Hospitals are also required to submit a copy of their checklist and a picture of how it is displayed in the operating room. Applications are reviewed by a multidisciplinary committee convened annually by the South Carolina Hospital Association.
The positive effects that this program had on surgical mortality and operating room culture have been published previously.12–14 A significant reduction of 22 percent in postoperative surgical mortality rates was seen in the fourteen hospitals that completed the major elements of the program by December 2013.13 Program completion was attained when a teamwork and safety culture survey was administered to operating room physicians and staff members after hospitals reported that their implementation was complete and that the checklist was used in a majority of surgical cases in their organization; a similar survey was administered at baseline. These surveys allowed us to relate safety culture to postsurgical mortality and to observe changes in safety and teamwork culture over time.12,14 These surveys revealed a significant relationship between a stronger safety culture and hospital postsurgical mortality. In the fourteen hospitals that completed the program, a significant improvement of 5.4 percent in perceptions of teamwork, coinciding with checklist implementation, was also seen.14
Our data on the implementation of the program help explain how some hospitals achieved checklist use and how all hospitals participated in a long-term voluntary implementation effort. Most previously reported patient safety initiatives have been carried out on a small scale.15 This study adds to the limited existing knowledge about broadly implemented patient safety initiatives. The Safe Surgery South Carolina Program offers unique insights, which we hope can be applied to future large-scale patient safety efforts. These insights can also inform policy makers at the hospital, hospital association, state, and national levels in their efforts to support the adoption of patient safety innovations.
Study Data And Methods
Data Sources
Over the course of the program, we collected data about hospitals and individuals that participated (name, institutional affiliation, discipline, and title) in Safe Surgery South Carolina Program activities. This information was originally collected for programmatic purposes.
This study was approved by the Office of Human Research Administration at the Harvard T. H. Chan School of Public Health.
Cohort Definition
Sixty-four hospitals performing surgery in South Carolina were included in the analyses. One hospital was removed from the data set because it opened in late 2014, after the bulk of program activities took place, and two hospitals were collapsed into one organization for the purpose of this research because they participated in the program as a single institution. The hospitals were divided into two groups for comparison. We defined checklist hospitals as a group comprising the initial fourteen hospitals that had administered the follow-up culture survey and reported completing implementation of the checklist in their operating rooms by 2013,13 and fifteen other hospitals that had reported completing implementation of the checklist in their operating rooms and regular checklist use through the self-certification process by 2017. The remaining thirty-five hospitals constituted the comparison group; as explained below, they participated in the program, but had not reported completing implementation of the checklist in their operating rooms or regular checklist use.
Analyses
Hospital demographic characteristics were analyzed to compare the two groups. We then analyzed the participation data at two levels, hospital and individual. To allow comparisons of participation across activities and hospital groups, a participation scoring scale was created for each program activity and “implementation facilitator” (for example, CEO participation). Hospitals were scored on a scale ranging from 0 points for no participation to 5 points representing the highest level of hospital participation. Hospitals were given the same credit for participation regardless of the number of people involved in each activity. The means for each group were calculated for each program activity and implementation facilitator and plotted on a bar graph to display the differences between the two groups. To understand how hospitals participated in the program by year, we analyzed the number of interactions each hospital had with the program and grouped them into one of four participation categories (none, below average, average, and high).
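The scoring approach described above can be sketched in code. This is a minimal illustration, not the study's actual analysis: the rubric that maps attendance to a 0–5 score, and all attendance counts, are hypothetical, since the article does not publish them.

```python
# Illustrative sketch of per-activity participation scoring (hypothetical rubric):
# each hospital gets 0-5 points for an activity regardless of how many of its
# people attended, and group means are compared across cohorts.

def score_activity(sessions_attended: int) -> int:
    """Map a hospital's count of sessions attended to a 0-5 participation score.

    The capping rule here is an assumption for illustration only.
    """
    return min(sessions_attended, 5)

def group_mean(attendance_counts):
    """Average participation score across the hospitals in one group."""
    scores = [score_activity(c) for c in attendance_counts]
    return sum(scores) / len(scores)

# Hypothetical attendance counts for one activity (e.g., webinars)
checklist_group = [6, 3, 5, 2, 4]
comparison_group = [1, 0, 2, 1, 0]

print(group_mean(checklist_group))   # mean score, checklist hospitals
print(group_mean(comparison_group))  # mean score, comparison hospitals
```

Plotting the two group means per activity side by side, as the study did, then makes the participation gap between cohorts visible at a glance.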
Individual-level analyses enabled us to determine the number of interactions each person had with the program by activity type and discipline, as well as which activity represented their first interaction. Additionally, the average numbers of people and of interactions were calculated for the checklist hospital and comparison groups, and Wilcoxon-Mann-Whitney tests were conducted.
A p value of less than 0.05 was considered significant for all tests. All statistical analyses were conducted using R.16
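The study's analyses were run in R; for readers unfamiliar with the Wilcoxon-Mann-Whitney (rank-sum) test, the statistic it is built on can be sketched in a few lines of pure Python. The per-hospital interaction counts below are hypothetical, chosen only to show the mechanics.

```python
# Minimal sketch of the Mann-Whitney U statistic underlying the
# Wilcoxon-Mann-Whitney test (the study itself used R's implementation,
# which also converts U to a p value).

def mann_whitney_u(x, y):
    """Count pairs (xi, yj) with xi > yj; ties contribute one half.

    U ranges from 0 (every x below every y) to len(x) * len(y)
    (every x above every y).
    """
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

# Hypothetical counts of program interactions per hospital
checklist = [31, 28, 40, 22, 35]
comparison = [16, 12, 20, 9, 14]

print(mann_whitney_u(checklist, comparison))  # 25.0: every checklist count exceeds every comparison count
```

In practice one would use a library routine (for example, `wilcox.test` in R) that handles tie corrections and returns the p value directly; the loop above only exposes what the statistic measures.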
Limitations
Our study had several limitations. First, it was an observational study based on data that were originally collected to guide the conduct of a large-scale quality improvement project, not gathered for research. We could only identify associations, not infer causation.
Second, our outcome measure (reporting completion of implementation or regular checklist use at the time of the follow-up culture survey or when applying for self-certification) is self-reported and could over- or underestimate successful implementation. While the true measure of implementation success—actual checklist use in operating rooms across the state—is theoretically possible to ascertain, the effort would be expensive, labor intensive, and potentially intrusive.
Third, during the time span involved, there were likely many unaccounted-for barriers and facilitators to implementation that could affect some of the conclusions.
Study Results
Hospital Demographic Characteristics
The CEOs of all surgery-performing hospitals in the state publicly committed their institutions to the program, and all hospitals participated in it to some degree. Hospital ownership varied among corporate, government, and charitable forms. Ownership, rural-urban location, and teaching status did not differ between the two groups. Neither did bed size, which varied from 25 beds to more than 800 (exhibit 1).
| Characteristic | Checklist hospitals (n = 29), number | Percent | Comparison hospitals (n = 35), number | Percent | p value |
|---|---|---|---|---|---|
| Location^a | | | | | 0.99 |
| Rural | 12 | 41.4 | 14 | 40.0 | |
| Urban | 17 | 58.6 | 21 | 60.0 | |
| Teaching status^a | | | | | 0.45 |
| No | 24 | 82.8 | 32 | 91.4 | |
| Yes | 5 | 17.2 | 3 | 8.6 | |
| Beds^b | | | | | 0.19 |
| 50 or fewer | 4 | 13.8 | 6 | 17.1 | |
| 51–250 | 15 | 51.7 | 22 | 62.9 | |
| 251–500 | 7 | 24.1 | 6 | 17.1 | |
| More than 500 | 3 | 10.3 | 1 | 2.9 | |
Program Participation Data
The program started with a webinar series that was given three times, and hospitals were eligible to participate in any or all the webinar offerings. Forty-five hospitals participated the first time the webinar series was delivered; five more participated in the second series, and two more in the third. As the program expanded to include higher-touch activities, more hospitals participated: in-person meetings added five hospitals, teamwork skills trainings added four, and coaching visits added three.
Extending the period of time during which program activities were offered permitted more hospitals to report regular checklist use and become checklist hospitals (exhibit 2) and more individuals (exhibit 3) to directly participate. Appendix C shows how each hospital’s participation varied by year over the course of the program.1 Some hospitals participated at high levels throughout the program, while others did not participate at all during a given year.
Compared to hospitals in the comparison group, those in the checklist hospital cohort were significantly more likely to submit information (for example, a picture of their checklist implementation team) to the program team and to participate in teamwork skills trainings and in-person meetings (exhibit 4). Participation in program activities by CEOs and physicians was also associated with successful implementation. The two groups of hospitals did not differ significantly in their degrees of participation in webinars and coaching visits. Compared to comparison hospitals, hospitals in the checklist hospital cohort had more people participate in program activities (10.9 versus 4.7) and more interactions (30.6 versus 15.9) (data not shown).
Of the 478 individual participants, the majority (314) were nurses (exhibit 5). Nurses also accounted for the majority of interactions. Interestingly, surgical technologists were engaged only through higher-touch activities. Twenty-two surgical technologists attended in-person meetings, and fourteen attended teamwork skills training, with no technologists participating in webinars (data not shown).
| Discipline | Interactions | Individuals |
|---|---|---|
| Nurses | 1,070 | 314 |
| Surgeons | 67 | 21 |
| Surgical technologists | 53 | 36 |
| Certified registered nurse anesthetists | 40 | 14 |
| Anesthesiologists | 28 | 11 |
| Other physicians | 20 | 10 |
| Other | 135 | 50 |
| Unknown | 30 | 22 |
| All | 1,443 | 478 |
Discussion
The Safe Surgery South Carolina Program provides an opportunity to take lessons from a real-world, multiyear, voluntary health care implementation project and to use them to guide future large-scale implementation initiatives. Three major lessons emerged from the analyses.
Engage All Stakeholders
One of the key findings from this work is that checklist hospitals were significantly more likely than comparison hospitals to have a CEO participate in a program activity, which indicates active involvement in or support for their hospital's implementation effort. The importance of leaders' involvement in organizational change is widely understood across industries and in health care.17 While all CEOs publicly committed their hospitals to participate in the program, not every hospital completed checklist implementation. We recommend that CEOs be approached throughout a program in a variety of ways: inviting CEOs to program meetings; presenting at existing CEO events and using other opportunities to communicate with them (such as CEO newsletters and one-on-one meetings); giving them specific tools to use in their institution's implementation effort (such as a tool for the CEO to observe checklist use); and providing routine feedback (such as participation reports and benchmarked culture survey results) about that effort, together with advice for improvement. We believe that benchmarking the information given to CEOs can be a useful means to stimulate action. We also encouraged CEOs to speak directly with their physicians and staff members about the importance of the surgical safety work.
While efforts were made to engage all operating room team members directly, nurses were the primary participants in program activities. They were frequently the most direct connection from the program to the hospital and physicians. Preparing nurses to interact with physicians and build physician support made the program possible. They were intentionally equipped with tools to help them engage their colleagues, including physicians, through one-on-one conversations. Other content was customized for the specific audience addressed. Webinars intended for physicians were tailored to show them how to be a checklist leader in the operating room and more broadly in their organizations.
Not surprisingly, when physicians participated directly in the program, their hospitals’ implementation efforts were significantly more successful. Involving physicians in patient safety projects is as essential to success as involving CEOs. However, it is difficult, in part because of the many demands on physicians’ time and attention, but also because physicians may lack skills in or comfort with quality improvement.18 Furthermore, the relationship between physicians and hospitals varies widely across the US, and South Carolina mirrors that variability.19 Physicians may work for themselves, a single hospital, or multiple hospitals, and they may have limited physical presence in a hospital. Front-line surgeons may be particularly difficult to reach. They are represented by multiple organizations, generally grouped by specialty, and these professional societies do not necessarily provide ready access to practitioners. Furthermore, until more surgeons are specifically compensated for their work on quality improvement, engagement may remain difficult. However, even in the absence of direct compensation, programs such as this give physicians an opportunity to improve patient care in ways that they can see and feel. Also, physicians who participate can encourage their peers to follow suit.
Provide Varied Opportunities For Participation
A combination of low- and high-touch activities is necessary for successful implementation. We found that webinar participation by itself was not associated with successful implementation. The initial plan was to rely on this popular, inexpensive, lower-touch, and scalable means of programmatic support. However, feedback received from hospitals early in the program through coaching calls helped us understand that while some hospitals were able to implement the checklist with only webinar guidance, many others needed more support. Even hospitals with high webinar participation usually also engaged in other activities. Frequent participation in higher-touch activities confirmed our belief that hospitals found these types of activities a valuable addition and helpful to their implementation efforts.
Adding higher-touch activities to webinars is needed to support implementation. It also gives additional opportunities for hospitals and individuals to get involved in ways that may appeal to them. As higher-touch activities were added, more hospitals and individuals engaged with the program. As explained above, surgical technologists did not participate in webinars. If higher-touch activities had not been offered, technologists would have been left out of a program in which multidisciplinary participation was critical to success.
Unlike the other higher-touch activities, the number of coaching visits did not differ significantly between the checklist hospital and comparison hospital groups. The program team, not the hospitals, drove coaching visits, and the goal was to reach every hospital at least once, if the hospital allowed. This finding does not mean that coaching visits were unimportant: they provided feedback from the front line, and hospitals generally valued them.
Allow Adequate Time And Resources
The program has now spanned eight years, with additional hospitals and individuals continuing to join the community and demonstrate success (exhibits 2 and 3). This program demonstrates that changing behavior and culture takes time and resources, but change is possible. Successful implementation of the Surgical Safety Checklist, despite its simple appearance, is complex. It requires organizations and teams to make significant changes to workflow, communication, culture, process, and structure. Following the initial implementation, organizational efforts are needed to maintain, sustain, and improve the use of the checklist so that patients can receive the most benefit over the long term.
Long-term patient safety programs require significant resources to facilitate sustained change. Both Ariadne Labs and the South Carolina Hospital Association committed significant staff time to running the program. Within the hospitals, leaders succeeded in convincing nearly 500 clinicians to step away from their clinical responsibilities to participate in program activities. Alongside program activities, these clinicians likely devoted additional hours of effort to the program at their hospitals. We believe that resources in the form of staff and clinician time were keys to the program’s success. Without resources devoted to the program by the hospitals and the program team, widespread change would not have been possible.
In a world of limited resources, careful selection of activities for participating hospitals is critical. Ideally, most activities would be lower-touch (webinars and coaching calls), which would allow resources to stretch further. However, we found that lower-touch activities alone were not enough to achieve success. Strategically adding higher-touch activities engaged new hospitals in the program and helped hospitals that were already participating further their implementation efforts.
Care should be taken to avoid overburdening hospitals by asking too much. The program asked participants to focus on implementing the checklist instead of diverting resources to collect outcomes data—which is commonly done in patient safety projects. All the data collected at the hospital level were used to support each institution’s implementation efforts, and resources were used judiciously.
The Capacity To Improve
The initial goal of the program was to have the checklist implemented throughout the state by 2013. Reality got in the way. Organizational capacity and readiness for change affect the adoption and implementation of all patient safety solutions at scale.17,20 This program provides considerable anecdotal evidence that factors in both internal and external environments can enhance or limit a hospital's ability to participate in a meaningful way. As an example, we were concerned that some rural hospitals might find participation difficult because of their precarious financial condition. Using a financial distress index that evaluates hospitals' finance-related risk of closure,21 we found that the ability to implement the checklist successfully was unrelated to the index (George Pink, senior research fellow, Cecil G. Sheps Center for Health Services Research, University of North Carolina at Chapel Hill, personal communication, May 6, 2018). Rural hospitals, even under duress, can find a way if the work is a priority.
Differences in capacity and capability at the hospital level must be anticipated and planned for, to enable widespread adoption and implementation of patient safety initiatives. To accommodate these differences, it is crucial to meet hospitals “where they are.” Patient safety projects like the Surgical Safety Checklist require incremental improvement over extended periods of time, and hospitals cannot do work that they are not ready for.17,20 Appendix C demonstrates that not every hospital was ready to participate in the program at the same time.1 Participants often reported that competing hospital priorities (for example, implementing an electronic medical record) and other contextual factors delayed implementation of the checklist. Furthermore, hospitals vary greatly in their experience with quality improvement projects.17 For some hospitals, this was the first major improvement initiative in their operating rooms, and they likely learned how to do quality improvement through the work.
This program demonstrates an approach to meeting the needs of multiple hospitals in their unique and changing contexts of cultures, challenges, and strengths. We believe that there is no single combination of activities or exposure that will always lead to implementation success in a hospital. Generally, hospitals that can participate more do better, but hospitals need to do what is right for them.
Recommendations To Policy Makers
Implementation of patient safety initiatives at scale is challenging. Policy makers should understand that changing care on the clinical front lines is a complex undertaking that requires their support. Resources need to be adequate to enable the work. The expectations from policy makers are often for instant results. Giving hospitals adequate time to change is necessary for successful implementation. Finally, incorporating flexibility that allows for local adaptation and accounts for hospitals’ unique differences in capacity to participate in and carry out this work is essential.
Leadership from policy makers at multiple levels is needed to encourage and sustain hospitals’ willingness to undertake such challenging work. Instruments of policy makers, including the power to convene as well as to publicly praise or mandate and hold accountable, can be powerful levers for overcoming resistance to change. Given the variation in hospitals’ readiness for change, mandates may be too strong a stick, at least until hospitals have had ample opportunity to engage in programs such as the Safe Surgery South Carolina Program that are designed to facilitate improvement.4,7 However, there may be a time for mandates when a patient safety solution has reached widespread but not complete adoption. Many patient safety initiatives have followed the path from voluntary uptake to mandated use through policy.8,22 Through crafting supportive policy, policy makers can help foster the change that keeps patients safe.
ACKNOWLEDGMENTS
Funding was provided by Branta Foundation and the Agency for Healthcare Research and Quality (Grant No. R18:HS019631). An earlier version of the manuscript was presented at a working paper review session in Washington, D.C., April 10, 2018, organized by Health Affairs and supported by the Gordon and Betty Moore Foundation.
NOTES
- 1 To access the appendix, click on the Details tab of the article online.
- 2 A surgical safety checklist to reduce morbidity and mortality in a global population. N Engl J Med. 2009;360(5):491–9.
- 3 Surgical checklists: a systematic review of impacts and implementation. BMJ Qual Saf. 2014;23(4):299–318.
- 4 Introduction of surgical safety checklists in Ontario, Canada. N Engl J Med. 2014;370(11):1029–38.
- 5 Implementation of safety checklists in surgery: a realist synthesis of evidence. Implement Sci. 2015;10(1):137.
- 6 Effective surgical safety checklist implementation. J Am Coll Surg. 2011;212(5):873–9.
- 7 A qualitative evaluation of the barriers and facilitators toward implementation of the WHO Surgical Safety Checklist across hospitals in England: lessons from the “Surgical Checklist Implementation Project.” Ann Surg. 2015;261(1):81–91.
- 8 An intervention to decrease catheter-related bloodstream infections in the ICU. N Engl J Med. 2006;355(26):2725–32.
- 9 National efforts to improve door-to-balloon time: results from the Door-to-Balloon Alliance. J Am Coll Cardiol. 2009;54(25):2423–9.
- 10 Explaining Michigan: developing an ex post theory of a quality improvement program. Milbank Q. 2011;89(2):167–205.
- 11 Surgical team member assessment of the safety of surgery practice in 38 South Carolina hospitals. Med Care Res Rev. 2015;72(3):298–323.
- 12 Implementation of the Surgical Safety Checklist in South Carolina hospitals is associated with improvement in perceived perioperative safety. J Am Coll Surg. 2016;222(5):725–736.e5.
- 13 Mortality trends after a voluntary checklist-based surgical safety collaborative. Ann Surg. 2017;266(6):923–9.
- 14 Perception of safety of surgical practice among operating room personnel from survey data is associated with all-cause 30-day postoperative death rate in South Carolina. Ann Surg. 2017;266(4):658–66.
- 15 Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):50.
- 16 R Core Team. R: a language and environment for statistical computing. Vienna: R Foundation for Statistical Computing; 2013.
- 17 The influence of context on quality improvement success in health care: a systematic review of the literature. Milbank Q. 2010;88(4):500–59.
- 18 A framework for engaging physicians in quality and safety. BMJ Qual Saf. 2012;21(9):722–8.
- 19 The employed surgeon: a changing professional paradigm. JAMA Surg. 2013;148(4):323–8.
- 20 Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82(4):581–629.
- 21 Predicting financial distress and closure in rural hospitals. J Rural Health. 2017;33(3):239–49.
- 22 The Surgical Infection Prevention and Surgical Care Improvement Projects: promises and pitfalls. Am Surg. 2006;72(11):1010–6; discussion 1021–30, 1133–48.