
Commentary

Overcoming Challenges In Codifying And Replicating Complex Health Care Interventions

Affiliations
  1. Tim J. Horton ([email protected]) is an associate director (Insight and Analysis) at the Health Foundation, in London, the United Kingdom.
  2. John H. Illingworth is an Improvement Fellow at the Health Foundation and an honorary research associate at the Institute of Global Health Innovation, Imperial College London.
  3. Will H. P. Warburton is director of Improvement at the Health Foundation and an honorary research fellow at the Institute of Global Health Innovation, Imperial College London.
PUBLISHED: Free Access. https://doi.org/10.1377/hlthaff.2017.1161

Abstract

The complex nature of many health care interventions poses challenges for successful replication. This article presents insights on tackling these challenges primarily drawn from recent research and programs in the UK. These insights include the need to codify complex interventions in ways that reflect their social, context-sensitive, and dynamic nature; to capture learning as the intervention is implemented in new contexts; and to design programs in ways that respect adopters’ role in the spread process. We argue that program leaders should have familiarity with theoretical approaches for conceptualizing complex interventions, that a discrete testing-and-revision phase should be recognized as part of the spread process, and that programs should be designed in ways that build and sustain adopter commitment. These perspectives complement the traditional focus on the innovator in models of spread by highlighting the role adopters play in adapting interventions and generating learning, and they have implications for the design of programs to spread innovation.

Although replicating successful health care interventions in new contexts is essential for maximizing their benefits for patients, it is also a well-recognized challenge.1 A commonly seen phenomenon is that when initially successful interventions are spread to new settings, they fail to achieve the same impact—or indeed any impact at all.2

On occasion, there may be straightforward explanations for this, such as a failure by adopters to adhere to intervention protocols. In recent years, however, there has been growing interest in a deeper set of explanations: that we might not be conceptualizing and describing interventions in ways that enable them to be successfully reproduced in new contexts, and we might not be organizing programs to spread interventions in ways that adequately help adopters reproduce them.3

Perhaps it is not surprising that this is such a challenge. For an adopter to be able to replicate an intervention successfully, they need to understand how they can translate the idea into their own setting; to know just what matters for the intervention to work in this new setting; and to have the opportunity, willingness, and capability to implement it. These issues become especially acute with complex health care interventions, particularly innovations in clinical processes and pathways—for example, improvements in hospital discharge.

This article presents insights on tackling these challenges drawn from a range of case studies, including programs and research funded by the Health Foundation (a grant-making charity that supports health care improvement in the UK), which have been selected because they illuminate these challenges of context, adaptation, and program design. The article begins with a brief consideration of why complexity poses challenges for conceptualizing and replicating interventions and then considers the implications for describing interventions and supporting their diffusion.

Complex Health Care Interventions

There is no single definition of a complex health care intervention, though many accounts describe these interventions as having multiple, potentially interacting, components.4 Here we highlight three interrelated properties of complex interventions that help explain why codifying and replicating them is difficult.

First, complex interventions are social: They are underpinned by the behaviors of staff and patients. This implicates the attitudes, relationships, and organizational cultures of those adopting an intervention in its success or failure. Understanding how particular components of the intervention work therefore requires understanding the social mechanisms that facilitate them, and successful replication may require adopters to re-create these social dynamics in their own setting.

Second, complex interventions are context-sensitive: They are built on and interact with the underlying systems and subsystems of the organizational environment in which they are implemented (including the social and relational elements described above). Indeed, some interventions are so embedded in their context that it can be hard to distinguish the intervention from its implementation in a specific context.5 To the extent that these systems and subsystems may differ from one location to the next, successful replication may require adaptation of the intervention. The same intervention may therefore look different in different contexts—when interventions are codesigned with patients, for example, local patient priorities will shape adaptation (and, in the process, challenge standardization).6 This, in turn, means that the intervention description will need to capture which components are core to making the intervention work, as well as their tolerance to alteration.

Third, complex interventions are dynamic: The systems (people, teams, and organizations) that implement them can learn and self-organize, and the contexts in which the interventions are implemented can raise new issues that require those involved to respond. This means that a complex intervention may evolve over time in unpredictable ways. What is more, its fate may rely on the adopter’s ability to navigate these dynamics and adapt. The degree of adaptation and responsiveness required will have consequences for the extent to which the success of the intervention is seen to reside in its specific components versus in the capability and decision making of the adopter.

A range of approaches has evolved for analyzing complexity in both health care and health promotion. For example, realist evaluation explores the mechanisms and contextual factors that underpin an intervention’s outcomes,7 while complexity science considers an intervention in terms of a system of interdependent agents whose interaction gives rise to emergent, systemwide patterns of behavior that cannot be reduced solely to the impact of the component parts.8

There are certainly contrasts in how different schools of thought tackle issues of complexity. Some view complexity primarily as a property of the intervention and seek to include relevant background conditions and enabling factors in the conceptualization of the intervention itself, while others tend to see complexity as a feature of the context in which the intervention is embedded.9 Some seek to capture causality primarily through understanding the role that different components play within the intervention’s theory of change, while others focus instead on the system-level changes triggered by the intervention.10

Nevertheless, we view the various approaches that exist as different routes for capturing the same underlying features of complex interventions highlighted here: an intervention’s social dimension, how it is embedded in its context, and how it may evolve over time as those implementing it learn and adapt.

Some Common Misconceptions

Complexity can give rise to a range of misconceptions in attempts to codify and spread interventions.

One misconception is to believe that the technical components form the hard core of an intervention, while the social components are softer—that is, more discretionary or open to variation. In fact, the social components may be essential. This was seen in a study of a successful program (the Keystone Program described below) to reduce central venous catheter bloodstream infections in intensive care units (ICUs) across Michigan.11 Some contemporary accounts saw the intervention as a simple checklist of five technical components, such as using chlorhexidine for skin preparation. However, the study found that the checklist also had an important social function in promoting adherence to these technical components, because the program did not simply ask ICU staff members to use the checklist but also specified that every catheter insertion should be monitored by a nurse, who would report any breaches in protocol. This requirement for nurses to supervise doctors’ practices implied a restructuring of professional relationships and would work only if nurses were able and willing to intervene. In addition, the study found that unwavering support from senior physicians was crucial in enabling nurses to act as a disciplinary force in this way. Whether one regards such social and relational dynamics as aspects of context or of the intervention itself, exposing and understanding them can therefore be essential for effective replication.

Another misconception is to confuse the direct or instrumental effects of intervention components with their expressive or symbolic effects. For example, the same study found that the requirement for ICUs to create a dedicated cart that contained all of the items required for successful catheter insertion not only had instrumental benefits in averting delays but also had expressive ones: It signaled an organizational commitment to infection control and heightened awareness of the program. Sensitivity to the expressive as well as instrumental functions of an intervention component in a particular context can be important for describing that component in a way that allows replication of the same effects.

A third possible misconception is the failure to recognize capability building as an integral aspect of the intervention. For example, while the introduction of surgical checklists around the world has sometimes been associated with reductions in surgical complications and mortality, when their use was mandated in Ottawa, Ontario, in 2010, an evaluation found no such improvements—even though compliance with the checklist was high.12 The researchers note that, unlike in other instances where positive effects had been observed, the introduction of the checklist in Ottawa had not included team training on how to use it, and they suggest that a greater effect might have occurred had training been provided.

Challenges In Codifying And Replicating Complex Interventions

These social, context-sensitive, and dynamic properties of complex interventions mean that substantial work and creativity may be required by adopters to translate interventions into their own settings. We now explore three specific challenges that this creates for programs to spread innovation.

Codifying Complex Interventions To Support Implementation

Adequate description of an intervention’s technical aspects is of course often critical for effective replication. Recent research testifies to the role that the poor description of technical components plays in preventing successful spread, which has led to the development of approaches to assist innovators and evaluators in producing more robust intervention descriptions. One example is the Template for Intervention Description and Replication (TIDieR),13 a checklist and guide that makes it easier for authors to structure accounts of their interventions and prompts them to provide sufficient detail on a range of key issues to enable effective replication.

Nonetheless, the analysis above suggests that successfully replicating complex interventions requires codifying them in ways that go beyond a description of their technical components and allow adopters to navigate the underlying social, contextual, and dynamic forces.

Various approaches to codification are evident in the evaluation and implementation science literature,14 which we see as characterized by two contrasting impulses: tightening and loosening.

Some approaches seek to tighten the description of the intervention in response to these challenges of codification by attempting more comprehensive and finer-grained specifications. This could include specifying the method for implementing intervention components (for example, Lean principles) or detailing relevant social mechanisms in addition to technical components. An example of the latter can be seen in the case of Practical Obstetric Multi-professional Training (PROMPT), a training course to improve responses to obstetric emergencies that was developed at Southmead Hospital (in Bristol, the UK) and that has been associated with significant improvements in outcomes—including a 50 percent reduction in babies born starved of oxygen. While PROMPT has spread widely in the UK and globally, consistent replication of the improvements seen at Southmead has been harder.15 This suggests that the training package and accompanying technical proficiency alone may not explain Southmead’s success. A forthcoming study led by researchers at the University of Cambridge is aiming to characterize the mechanisms underlying the improvements seen at Southmead and to develop an additional implementation package that incorporates the norms and behaviors that also need to be in place to reproduce these outcomes.

Other approaches seek to loosen the description of the intervention by focusing less on specifying the details of each component and more on the ability of adopters to formulate their own versions of these components in their own setting. This includes approaches that focus on the theory of change underpinning the intervention and that see fidelity as replicating the function that components play rather than their original form, as well as approaches that focus primarily on building the knowledge, skills, and capabilities that adopters require to recreate the intervention’s effects in their own setting. In the language of intervention manuals, tightening approaches seek to lengthen the manual, while loosening approaches support the adopter in writing their own version.

Both tightening and loosening approaches seek to reconcile the need for creativity with the need for constraint, but via complementary routes. Tightening approaches attempt to draw social and contextual factors into the intervention protocols, though in doing so they highlight the capabilities required for successful implementation. Loosening approaches focus on helping adopters adapt the intervention to fit their own context, though in doing so they require adopters to take ownership of the constraints within which they need to operate.

Indeed, these approaches are not mutually exclusive and often appear in combination. For example, a capability-building approach could be combined with a detailed prescription of intervention components or implementation methods, as in the example of the Flow Coaching Academy in the UK, which supports teams in improving flow along condition-based pathways using clinical-microsystem and Lean principles. Instead of prescribing specific intervention components or outcomes, the academy trains people to use a road map to lead flow improvement within their own organizations. However, this capability building is combined with detailed guidance on the methods for implementing change, such as a customized pathway assessment tool.16

All of these approaches to intervention description have merits, and a priority for improvement research is to investigate which may be optimal in which circumstances. What is clear, however, is that it is important for innovators, policy makers, and program leaders, not just evaluators and researchers, to be aware of them if they are to codify and spread interventions more effectively.

Currently, most of the discussion about intervention conceptualization and description resides in academic literature, despite the fact that it contains rich, practical insights for those engaged in spread and adoption. There is a case to be made, therefore, for making these approaches to codification a standard part of the innovator’s tool kit, alongside more familiar skills such as how to make a pitch or design a business case. In making this case, we echo others who have advocated the greater use of program theory by innovators and improvers to enhance intervention description and promote successful replication.17

Using Learning From Implementation In New Contexts To Refine The Intervention

A second, related issue is that it can be hard for innovators to see their own context. When something has been achieved successfully in one location, it might seem straightforward to document the actions involved, but it may in fact be impossible to know which aspects of context were relevant to the initial success without being able to compare this experience with other scenarios. It may be only when an intervention is implemented in new contexts that the comparative information becomes available to enable the innovator to see what actually made their intervention work the first time.

Our interviews with teams that are working to spread complex interventions through the Health Foundation’s Scaling Up program, which supports teams to spread proven health care interventions within the UK, suggest that this is a common experience. When an intervention that has been successfully demonstrated in only one setting is introduced into a more diverse set of sites, the variable fortunes of the intervention in different sites can reveal which components and contextual factors are more or less important. Similarly, as adopters adapt the intervention to fit their own settings, valuable lessons are learned about the tolerance of different components to alteration. All of this gives the innovator new insights into what is significant for making the intervention work and enables them to revise the intervention description accordingly. What is happening during this initial spread phase—a learning process in which the intervention sometimes undergoes substantial reconceptualization and refinement—often looks quite different from attempts to spread the intervention at a later stage of maturity, when it is codified with more confidence and issues of fidelity are more clear-cut.

Within pharmaceutical and medical device innovation, such a stage of comparative testing is usually recognized as a formal part of the innovation cycle (such as the field-testing or beta-testing stages of product development), but we have found this is less consistently the case with quality improvement and process innovation. There may therefore be value in policy makers’ and program leaders’ recognizing this testing-and-revision phase as a necessary part of the cycle, distinct from attempts to spread the intervention at later stages of maturity. This would help set realistic expectations for the outcomes in this initial phase, helping those involved appreciate that a key objective is to learn from variations in performance, instead of assuming that replication will be successful everywhere. It could also influence the selection of initial adopter sites to ensure appropriate diversity.

Recognizing the role of early adopters in generating new learning about an intervention and helping refine it can pose a practical challenge to the innovator, who may view themselves as having exclusive knowledge about the intervention or be emotionally attached to aspects of its original form that prove to be superfluous or suboptimal in new contexts. Similarly, there are implications for the relationship between the innovator and the community of adopters during this testing phase, where they are cast more as peers engaged in reciprocal learning than as participants in the kind of teacher-pupil relationship that traditional dissemination models imply.

An innovator therefore needs to enter this phase prepared to revise their own conception of the intervention and to accept that their initial idea will be shaped by a wider community. In some cases, innovators may need to bring in others with the detachment and mind-set to lead this initial spread phase.

Building And Sustaining Adopters’ Commitment

The significant role that adopters play in adapting complex interventions suggests that attempts to replicate interventions at scale will be more likely to succeed if they recognize and respect the centrality of adopters’ role in the process. Most obviously, it matters that adopters are committed to implementing and sustaining an intervention, which in turn requires consensus about the problem being tackled and the proposed solution,18 as well as motivation to put the solution into practice.19 How best to build and sustain adopters’ commitment therefore becomes a crucial question of program design (a challenge that is crystallized most starkly when programs are mandatory or have a strong element of top-down pressure, but which remains even when participation is voluntary).

Crucially, creating acceptance of the need for change and generating a commitment to change can require far more than the presentation of evidence. Behavioral science suggests that these phenomena have strong psychological and social dynamics, which play out in different approaches to program design. These include peer leadership, peer communities, and adopters’ autonomy and ownership of the intervention.

Peer Leadership:

The source of any change message is an important factor in persuading others. Evidence shows that people are more likely to listen to and be influenced by others like them, particularly when the topic of the message is related to the group identity of the messenger and receiver.20 This phenomenon can be particularly significant in health care, given the demarcation of professional identities and the strong role that professional bodies play in determining values and behavior.21 So peer leadership can be especially important in building a case for change and developing consensus around a solution.

Peer Communities:

Social networks play an important role in generating commitment to change because they function not only to spread information but also to shape norms and values, which can be powerful drivers of the adoption of new ideas (sometimes irrespective of whether the idea in question is desirable or evidence based).22 This suggests that creating a peer community or horizontal network of adopters can play an important role in generating a commitment to change, as well as in supporting peer-to-peer learning. Spread programs could therefore benefit from fostering the kinds of horizontal structures that are commonly used in wider quality improvement initiatives, such as collaboratives and clinical communities.

Autonomy And Ownership:

The degree of autonomy and ownership of the intervention that adopters enjoy may also be important for building and sustaining commitment. This goes beyond the need to adapt the intervention to make it work in new contexts: The very acts of creating and shaping something can be important for generating attachment. This has been named the “IKEA effect” in behavioral science, based on the observation that people value self-made products more highly than identical, externally assembled versions.23

The importance of these dynamics for program design is illustrated by recent research that looked at the differing fortunes of two programs to reduce central venous catheter bloodstream infections: the Keystone Program (2003–06), which successfully reduced infections in ICUs across Michigan; and an initiative of the English National Health Service called Matching Michigan (2009–11), which sought to reproduce the Keystone Program’s success but failed to have an impact above the secular trend. The Keystone Program, which was voluntary and led by a state hospital association, deployed ICU insiders to promote the program, with whom participants could identify. The researchers found that this peer leadership was essential for establishing trust and securing legitimacy.11 By contrast, the English program was mandatory and led by a government agency, and it came after a series of related initiatives that some had perceived as top-down and punitive. This undermined engagement and made it difficult to persuade participants that the program was necessary.24

Horizontal relationships were also important in the Keystone Program’s success: Through bringing participants together in workshops, a peer community was created that helped foster shared norms and generate commitment and ownership among participants. The program’s top-down elements, such as the use of benchmarking data, were therefore balanced by strong horizontal forces that supported change. The Keystone Program also maintained a degree of local autonomy—for example, encouraging ICUs to devise their own checklist format, provided they retained the main principles. The English program, by contrast, did not create the same horizontal links or local ownership, so participants lacked the sense of a collaborative community. The researchers found that this reduced the possibility of influencing professional norms and generating commitment.24

Implications For Policy Makers

This analysis has implications for policy makers and people overseeing spread programs.

First, before initiating large-scale spread initiatives, it is important to establish whether the intervention has undergone a process of comparative testing to identify its core features and their tolerance to variation, and whether it has been codified in a way that will support adopters in understanding the relevant social and contextual factors and adapting the intervention to their own settings.

Second, spread programs should be designed in ways that build and maintain adopters’ commitment. This includes seeking consensus about the problem being tackled and the proposed solution, as well as deploying mechanisms such as peer leadership and social networks to shape attitudes and norms. Also important is balancing the need to ensure fidelity with allowing appropriate adaptation, both to make the intervention work in new contexts and to encourage adopter ownership.

Third, recognizing the critical role played by adopters in the spread process makes apparent the need to consider adopters’ capability and readiness. And recognizing that adoption might itself be a creative process involving significant innovation implies that successful replication may take time and organizational capacity. So in contrast to programs that focus investment on a few pilots and then expect everyone else to follow fast and for free, resources may instead need to be invested in developing adopters’ readiness and giving them the time and space needed to do the hard work of translating the original idea to their own setting.

Conclusion

The traditional pipeline model of innovation, and the corresponding design of spread programs, has tended to privilege the perspective and role of the innovator. Initiatives that seek to pilot and then roll out innovations, for example, may implicitly assume that once an idea has been successfully demonstrated, the hard work has been done. The spread process is then framed as disseminating knowledge, with the innovator and adopter cast in a teacher-pupil relationship. This can be accompanied by an expectation that adoption can happen quickly, even though the innovator may have taken a number of years to develop and implement the intervention in their own organization.

The pipeline model may have endured because of its conceptual simplicity, but it proves inadequate to describing the messier distributed creativity involved in spreading and implementing complex interventions at scale. We join others in challenging this model by highlighting the task adopters face in adapting interventions to new contexts and their contribution in generating new learning.14 We therefore argue for greater emphasis on the role and status of adopters within spread programs, in terms of both how interventions are codified and how programs are designed.

ACKNOWLEDGMENTS

The authors are grateful to Mary Dixon-Woods, Michael Hallsworth, Sarah Henderson, Bryan Jones, and Tracy Webb for their helpful comments and insights in producing this article.

NOTES

  • 1 Keown OP, Parston G, Patel H, Rennie F, Saoud F, Al Kuwari H, et al. Lessons from eight countries on diffusing innovation in health care. Health Aff (Millwood). 2014;33(9):1516–22.
  • 2 Perla R, Reid A, Cohen S, Parry G. Health care reform and the trap of the “iron law.” Health Affairs Blog [blog on the Internet]. 2015 Apr 22 [cited 2017 Dec 12]. Available from: http://healthaffairs.org/blog/2015/04/22/health-care-reform-and-the-trap-of-the-iron-law/
  • 3 Howarth E, Devers K, Moore G, O’Cathain A, Dixon-Woods M. Contextual issues and qualitative research. In: Raine R, Fitzpatrick R, Barratt H, Bevan G, Black N, Boaden R, et al. Challenges, solutions, and future directions in the evaluation of service innovations in health care and public health. Southampton (UK): NIHR Journals Library; 2016 May. (Health Services and Delivery Research, No. 4.16). Essay 7.
  • 4 Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M; Medical Research Council Guidance. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ. 2008;337:a1655.
  • 5 Bauman LJ, Stein RE, Ireys HT. Reinventing fidelity: the transfer of social technology among settings. Am J Community Psychol. 1991;19(4):619–39.
  • 6 Batalden M, Batalden P, Margolis P, Seid M, Armstrong G, Opipari-Arrigan L, et al. Coproduction of healthcare service. BMJ Qual Saf. 2016;25(7):509–17.
  • 7 Pawson R, Tilley N. Realistic evaluation. London: Sage; 1997.
  • 8 Plsek PE, Greenhalgh T. Complexity science: the challenge of complexity in health care. BMJ. 2001;323(7313):625–8.
  • 9 Shiell A, Hawe P, Gold L. Complex interventions or complex systems? Implications for health economic evaluation. BMJ. 2008;336(7656):1281–3.
  • 10 Rogers PJ. Using programme theory to evaluate complicated and complex aspects of interventions. Evaluation. 2008;14(1):29–48.
  • 11 Dixon-Woods M, Bosk CL, Aveling EL, Goeschel CA, Pronovost PJ. Explaining Michigan: developing an ex post theory of a quality improvement program. Milbank Q. 2011;89(2):167–205.
  • 12 Urbach DR, Govindarajan A, Saskin R, Wilton AS, Baxter NN. Introduction of surgical safety checklists in Ontario, Canada. N Engl J Med. 2014;370(11):1029–38.
  • 13 Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, et al. Better reporting of interventions: Template for Intervention Description and Replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687.
  • 14 Hawe P. Lessons from complex interventions to improve health. Annu Rev Public Health. 2015;36:307–23.
  • 15 Shoushtarian M, Barnett M, McMahon F, Ferris J. Impact of introducing Practical Obstetric Multi-professional Training (PROMPT) into maternity units in Victoria, Australia. BJOG. 2014;121(13):1710–8.
  • 16 Microsystem Coaching Academy. Flow Coaching Academy (FCA) [Internet]. Sheffield (UK): MCA; [cited 2017 Dec 12]. Available from: http://www.sheffieldmca.org.uk/flow/
  • 17 Davidoff F, Dixon-Woods M, Leviton L, Michie S. Demystifying theory and its use in improvement. BMJ Qual Saf. 2015;24(3):228–38.
  • 18 Dixon-Woods M, McNicol S, Martin G. Ten challenges in improving quality in healthcare: lessons from the Health Foundation’s programme evaluations and relevant literature. BMJ Qual Saf. 2012;21(10):876–84.
  • 19 Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82(4):581–629.
  • 20 Briñol P, Petty RE. Source factors in persuasion: a self-validation approach. Eur Rev Soc Psychol. 2009;20(1):49–96.
  • 21 Davies HTO, Nutley SM, Mannion R. Organisational culture and quality of health care. Qual Health Care. 2000;9(2):111–9.
  • 22 Rogers EM. Diffusion of innovations. 5th ed. New York (NY): Free Press; 2003.
  • 23 Norton MI, Mochon D, Ariely D. The IKEA effect: when labor leads to love. J Consum Psychol. 2012;22(3):453–60.
  • 24 Dixon-Woods M, Leslie M, Tarrant C, Bion J. Explaining Matching Michigan: an ethnographic study of a patient safety program. Implement Sci. 2013;8:70.