
Evaluating for learning and sustainability (ELS) framework: a realist synthesis

Abstract

Background

Learning Health Systems (LHS), in which continuous and equitable improvements support optimization of healthcare practices, outcomes, experience, and costs, offer enormous potential for health system transformation. Within the LHS model, evaluation of health innovations assists in question identification, data collection, and targeted action, which facilitates continuous improvement. Evaluation that catalyzes learning may contribute to health innovation implementation, refinement, and sustainability; however, there is little consensus as to why certain evaluations support learning while others impede it.

Methods

Grounded in the implementation science literature, we conducted a realist synthesis to understand the evaluative contextual factors and underlying mechanisms that best support health system learning and sustainable implementation of innovations. We sought to understand whether evaluations can ‘work’ to support learning and sustainability, in which contexts, for whom, and why. Working with an Expert Committee composed of leaders in evaluation, innovation, sustainability, and realist methodology, we followed a five-stage process: 1. Scoping the Review, 2. Building Theories, 3. Identifying the Evidence, 4. Evidence Selection and Appraisal, and 5. Data Extraction and Synthesis. Our Review Team and Expert Committee participated in iterative cycles of results interpretation and feedback.

Results

Our synthesis includes 60 articles capturing the mechanisms and contextual factors driving learning and sustainability through evaluation. We found that evaluations that support learning and sustainability build on favourable organizational preconditions and focus on implementing rapid cyclical feedback loops that contribute to a culture of innovation and evaluation sustainability. Our findings are organized into six Context-Mechanism-Outcome Configurations (CMOCs): 1. Embracing Risk & Failure; 2. Increasing Capacity for Evaluation; 3. Co-Producing Evaluation; 4. Implementing Learning Feedback Loops; 5. Creating Sustainability Culture; and 6. Becoming a Learning Organization. We have also translated findings into a series of Action Strategies for evaluation implementation to support health system learning and sustainability.

Conclusions

We identified key contextual factors and underlying mechanisms that make evaluations ‘work’ (or ‘not work’) to support learning and sustainability. Findings support the operationalization of LHS by translating CMOCs into Action Strategies for those tasked with completing evaluations with a view toward health system learning and innovation sustainability.


Introduction

The global COVID-19 pandemic brought to light intractable challenges that health systems have grappled with for years, throwing into sharp relief the scale and severity of hospital overcrowding, surgical backlogs, and workforce burnout, among other challenges [1,2,3,4]. In the coming years, escalating healthcare acuity, growing service needs, and widening inequities will continue to threaten healthcare systems globally [5, 6]. These pressures are prompting healthcare systems to shift attention to the future [7], seeking to understand and articulate how they will transform the care they provide for the populations they serve. As healthcare systems contemplate new approaches to improving health and delivering healthcare that prioritize the needs of patients and populations, it is essential to understand how we can approach complex challenges differently by generating, learning from, and sustaining health innovations in order to realize the promise of healthcare system transformation.

Health innovations are novel, contextually-situated approaches which hold promise to accelerate and sustain positive health impacts by responding to health needs and preferences (see Table 1 for terminology used throughout this manuscript) [8]. Health innovations can and should be driven by the needs of end-users of the healthcare system, including patients, families, and community members [9], as well as clinicians, researchers, and health system leaders [10]. While generating scientific and healthcare delivery innovations is a vital activity for health system improvement [10,11,12], equally important is the ability to sustain health innovations through robust implementation and evaluation processes [13,14,15]. Learning Health Systems (LHS) offer an approach to moving knowledge to action by engaging in continuous healthcare system improvement via innovation: converting evidence to knowledge, applying that knowledge to influence innovation performance, and generating new evidence through observation of innovation performance changes [16, 17]. While both LHS and other healthcare improvement methodologies such as continuous quality improvement utilize cyclical data-driven processes to improve care, LHS focus on creating scalable systems that address population and person-centred health, using rigorous methods to do so [17]. The essential function of evaluation within the LHS cycle is its capacity to catalyze systems learning from improvement efforts, supporting a trajectory of knowledge to action. Thus, embedded evaluation drives learning and innovation sustainability within an LHS by offering a consistent cycle of feedback that ensures continued innovation fit for context [18], revealing areas where adaptation and tailoring could improve that fit.

Table 1 Terminology and definitions

The conceptual link between evaluation of health innovations and the sustainability of those innovations is supported empirically [25] and conceptually [18, 26] and is depicted by a directional arrow in the Knowledge to Action (K2A) Framework [15] (Fig. 1). The K2A framework is one example of an implementation framework that illustrates the link between evaluation and sustainability at the end of the innovation cycle. Unfortunately, this link is often overlooked, impeding the ability of evaluation to enhance sustainability throughout the innovation process. Additionally, many implementation science frameworks indicate a link between evaluation and sustainability, including the K2A [15], the Exploration, Preparation, Implementation, and Sustainment (EPIS) framework [27], and Normalization Process Theory [28, 29]. However, the quality of the relationship between these two stages remains underexplored in terms of how evaluation can contribute to learning and innovation sustainability, and in which circumstances this takes place. As the purpose of implementation science is to move knowledge into practice, an understanding of how learning and sustainability happen with respect to evaluation, and under which conditions these processes take place, can further this aim. The purpose of this realist synthesis is to explore the quality of evaluations of learning health systems to assess which components of evaluations are necessary to generate learning and improve innovation sustainability. Findings will add to the LHS and implementation literature by specifying the necessary approaches for implementing evaluations that support learning and sustainability. In this synthesis, the evaluation of health innovations is the ‘intervention’ under study. Our realist synthesis question is: “What types of evaluations work to promote LHS learning and health innovation sustainability, for whom do they work, under which circumstances, and why?”

Fig. 1 The knowledge to action framework [15]

Methods

We conducted a five-stage realist synthesis in line with RAMESES guidelines (see Additional file 1) [23, 30]. Realist synthesis is a complexity-compatible review method that examines interventions to understand “what works, for whom, under which circumstances, and why?” [31]. In realist terms, variations in outcomes (O) can be attributed to the differing contexts (C) into which interventions are implemented, and the resultant mechanisms (M) that are activated due to contextual factors such as time, place, people, and social and political structures [32, 33]. Our working hypothesis is that certain evaluative approaches, measurement, reporting, and data use techniques, and selections of indicators and outcomes support LHS learning and innovation sustainability, while others do not.
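To make the C-M-O logic concrete, the following sketch shows one way a single configuration could be represented as a data structure during analysis. This is an illustrative aid only, written in Python with hypothetical field names and example content; it is not a schema used in this review.

    from dataclasses import dataclass, field

    @dataclass
    class CMOC:
        """One context-mechanism-outcome configuration (illustrative only)."""
        label: str        # short name for the configuration
        contexts: list    # conditions into which the evaluation is introduced
        mechanisms: list  # responses those conditions trigger in participants
        outcomes: list    # results attributed to the activated mechanisms
        sources: list = field(default_factory=list)  # supporting citations

    # Hypothetical example, loosely paraphrasing CMOC 1 from this synthesis
    cmoc_1 = CMOC(
        label="Embracing Risk & Failure",
        contexts=["leadership accepts unexpected or negative evaluation results"],
        mechanisms=["staff feel psychologically safe to report honest findings"],
        outcomes=["deeper learning from evaluation", "sustained innovation"],
        sources=["Edmondson 1999"],
    )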

Our previously published protocol [32] contains additional details of our review methods. Our team undertook the following stages in this review: 1. Scoping the Review, 2. Building Theories, 3. Identifying the Evidence, 4. Evidence Selection and Appraisal, and 5. Data Extraction and Synthesis.

Scoping the review and building theories

Our group divided into two teams – an Expert Consultation Committee (co-authors CSG, ÉCB, JS, LJ, MB, MM, RJR, WPW) and a Review Team (BP, FB, TA). The Expert Consultation Committee was composed of interdisciplinary subject matter experts in fields related to the scope of this review, such as learning health systems, evaluation, health innovation, and sustainability, as well as realist methodology. Their role was to provide ongoing conceptual and methodological feedback throughout the review process to shape the research questions, literature review scope, and analysis process. The Review Team was composed of health services research doctoral students and a research associate who, alongside the lead author, undertook the majority of the searching, screening, and initial stages of data analysis. Initially, these teams worked together to combine their collective wisdom, using twenty-five foundational articles from the fields of LHS, evaluation, and health innovation sustainability, to formulate an Initial Program Theory of the conceptual link between evaluation, learning, and sustainability (see Fig. 2). We hypothesized that evaluation of an innovation begins with the development of an evaluation framework that includes the components on the left side of Fig. 2. The learning cycle from evaluation is based on the collection, analysis, synthesis, use, and generation of evidence that must be adapted to suit local, contextual factors that promote sustainable uptake of new practices and a new evaluation-learning-sustainability cycle.

Fig. 2 Initial program theory

Identifying the evidence

We conducted initial background searches to get a ‘feel’ for the evidence base [31], before building progressively focused searches aimed at illuminating the contexts and mechanisms associated with the link between evaluation, learning, and LHS sustainability. We began our primary search using the MEDLINE (National Library of Medicine) and Embase (Elsevier) databases with a combination of MeSH terms and keywords conceptually centred on ‘evaluation’, ‘learning’, ‘sustainability’, and ‘healthcare’. We limited our search to 2013–2023 to account for the growth of interest in the sustainability of health innovations over the last decade. A sample MEDLINE search can be found in Additional file 2. Our PRISMA diagram (Fig. 3) details the flow of records through this synthesis.
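To illustrate how the four concept blocks combine in a database search, the sketch below assembles a Boolean query by joining terms within each concept with OR and the concepts themselves with AND. The term sets shown are invented examples for illustration; the review's actual strategy appears in Additional file 2.

    # Illustrative only: term sets are hypothetical, not the review's strategy
    concepts = {
        "evaluation": ["program evaluation", "evaluat*"],
        "learning": ["learning health system*", "organizational learning"],
        "sustainability": ["sustainab*", "sustainment"],
        "healthcare": ["health care", "health system*"],
    }

    # OR within each concept block, AND across blocks
    blocks = ["(" + " OR ".join(terms) + ")" for terms in concepts.values()]
    query = " AND ".join(blocks)
    print(query)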

Fig. 3 PRISMA flow diagram

Evidence selection and appraisal

Studies were screened and selected through a three-stage process. At the title and abstract level, we included all articles published in English, from 2013 or later, that related to healthcare innovations, evaluation, and learning and/or sustainability. We excluded articles not written in English, those published before 2013, those not related to evaluation of healthcare innovations and learning or sustainability, and applied studies that were student-focused or focused on healthcare education (i.e., not in a healthcare delivery setting). The decision to err on the side of inclusion at the title and abstract level stemmed from the realist philosophy that even studies with peripheral relationships to the initial program theory may contain ‘nuggets’ of truth that are useful in developing, iterating, or adjusting working hypotheses [34]. These ‘nuggets’ are often not evident at the title and abstract level and are instead found in the discussion section or lessons learned of published works. At the second stage, we included full text articles that, in addition to meeting the criteria above, described a relationship between evaluation and learning or between evaluation and sustainability. At the third stage, the first 500 full text articles were screened against the above criteria, in addition to our assessment of the quality and rigour of each article and the degree to which it enabled us to further develop our initial program theory. After 500 full text articles had been screened at Stage 3, a joint meeting of the Expert Committee and Review Team determined that we had obtained enough high-quality information to move on to data extraction.
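To summarize the stage 1 rules in one place, the sketch below expresses the title-and-abstract criteria as a simple predicate. The record fields are hypothetical, and screening in this review was performed by human reviewers, not code.

    # Illustrative predicate for stage 1 (title/abstract) screening;
    # field names are assumptions for illustration only.
    def include_at_stage_1(record):
        return (
            record["language"] == "English"
            and record["year"] >= 2013
            and record["about_healthcare_innovation"]
            and record["about_evaluation"]
            and (record["about_learning"] or record["about_sustainability"])
        )

    example = {
        "language": "English", "year": 2018,
        "about_healthcare_innovation": True, "about_evaluation": True,
        "about_learning": True, "about_sustainability": False,
    }
    print(include_at_stage_1(example))  # True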

Data extraction and synthesis

Data from 60 included full text articles were extracted into a shared spreadsheet. Three authors (MB, TA, FB) worked together to extract data such as document characteristics (author, year, etc.), central arguments from each article, and realist-informed constructs (e.g., outcomes of interest, contextual factors, possible mechanisms) (see Additional file 1 for operational definitions of extraction criteria). Excerpts of extracted data were then exported to Dedoose [35] qualitative data software for analysis. Excerpts were initially analyzed inductively by grouping concepts into clusters. Next, we examined each cluster to determine where clusters overlapped with concepts from our initial program theory, and where new clusters had to be added or old ones deleted from the initial program theory for congruence. Our Review Team (MB, TA, FB) examined each cluster for relevant contexts, mechanisms, and outcomes to begin to form the basis of our CMOCs. In analyzing articles, we included a search for confirming and disconfirming data to ensure that we were aware of evidence that was at odds with our hypotheses.
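As a concrete picture of the extraction fields described above, a minimal spreadsheet template might be set up as follows. The column names are assumptions for illustration and do not reproduce the review's actual extraction sheet.

    import csv

    # Hypothetical columns reflecting the extraction fields described above
    columns = [
        "author", "year",        # document characteristics
        "central_argument",      # main claim of the article
        "outcomes_of_interest",  # realist-informed constructs
        "contextual_factors",
        "possible_mechanisms",
    ]

    with open("extraction_template.csv", "w", newline="") as f:
        csv.writer(f).writerow(columns)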

Next, one author (MB) refined the initial CMOCs using the concept clusters. Initially, 14 CMOCs were developed pertaining to six concepts contained within the initial program theory. Excerpts and supporting quotes were filed under each CMOC by one author (MB), while another author (TA) validated CMOCs using an agree/disagree/weak agreement system. CMOCs were refined on the basis of this validation and re-circulated until three authors (MB, TA, FB) achieved consensus as to the comprehensiveness of the CMOCs and the fit of the supporting data within them (Additional file 3). The 14 initial CMOCs were then presented to the Expert Committee for refinement. Through discussion, visual mapping, and asynchronous agreement ratings using bespoke surveys, the initial 14 CMOCs were collapsed into the final six, and the initial program theory was reorganized into the final refined program theory (Fig. 4). The final program theory incorporates the organizational and process-oriented constructs that allow evaluation to promote learning and the sustainability of healthcare innovations.
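A minimal sketch of how the agree/disagree/weak-agreement ratings for one candidate CMOC could be tallied follows. The ratings and the retention rule are hypothetical; the actual validation was deliberative rather than automated.

    from collections import Counter

    # Hypothetical ratings from three reviewers for one candidate CMOC
    ratings = ["agree", "weak agreement", "agree"]
    tally = Counter(ratings)

    # Illustrative retention rule (not the authors' criterion): keep the CMOC
    # when no reviewer disagrees and most reviewers fully agree.
    retained = tally["disagree"] == 0 and tally["agree"] > len(ratings) / 2
    print(dict(tally), "retained:", retained)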

Fig. 4 Refined program theory

Results

Our search identified 8,429 unique results. After title and abstract screening, 547 articles were screened in full text; 487 of these were excluded, retaining 60 articles for inclusion in this synthesis. Data were extracted, analyzed, and synthesized from these articles to refine our initial program theory connecting evaluation to learning and health innovation sustainability. Our refined program theory is detailed below, followed by Table 2, which presents the CMOCs associated with each stage of the refined program theory.

Refined program theory

Our results suggest that evaluation can lead to learning and the sustainability of innovations in LHS when evaluation is undertaken in organizations that meet certain preconditions, and where explicit connections exist between distinct sequences of co-production, feedback, and culture change, leading to outcomes of learning and innovation sustainability. The necessary organizational preconditions include embracing risk and failure and working to continuously increase the capacity for evaluation. Under these conditions, evaluations that are co-produced with those involved in the work of innovation, and that drive cyclical learning feedback loops, can produce ongoing innovation improvements as teams learn from evaluation data and implement those learnings to drive decision-making. Outcomes from Sequence 1 activities influence Sequence 2 activities, namely, the creation of a sustainability culture and an effective learning organization that can sustain innovation and evaluation.

Context-mechanism-outcome configurations

Table 2 Context-mechanism-outcome configurations

Discussion

Each of the CMOCs is associated with specific Action Strategies to inform healthcare organizations’ leadership and/or the evaluative team. These Action Strategies are presented in Table 3. The following text references literature that supports these strategies by emphasizing their influence on the mechanisms of each CMOC.

Table 3 Strategies to action evaluation for learning and sustainability

The first CMOC emphasizes the importance of embracing reasonable risk and failure in evaluations to reduce fear and fuel learning and innovation sustainability. Anticipation and acceptance of unexpected outcomes of evaluation that resemble program failures can enable psychological safety in staff and foster deeper learning about the complexities of innovation [87]. To diminish fear of failure and establish psychological safety in staff, it is important to engage in activities that unite the team, fuel open communication, and emphasize the value of learning from both successes and failures for the purpose of continuous learning and long-term innovation sustainability (Table 3). These actionable steps are supported in the literature, where leadership teams that are united in the goal of learning from evaluation regardless of the outcome create a psychologically safe space for honest feedback [88, 89]. These collaborative and unified evaluation practices most often include open within-team communication about evaluation purposes and goals, which further fosters psychological safety, trust, and openness to feedback [89,90,91]. To support collaborative evaluation and open communication, innovation failures revealed by evaluation should be celebrated as learning opportunities to reduce the fear of punishment [92] and enable innovation and improvement [93,94,95].

The second CMOC outlines the importance of increasing evaluation capacity to fuel learning and innovation sustainability by increasing staff’s confidence in their evaluation abilities. Strategies such as providing clear governance and appropriate data and human resources, establishing routine evaluation practices, and encouraging the development of evaluation skills in teams can build staff confidence in their evaluation abilities and increase evaluation capacity (Table 3). Providing clear structures and processes in organizations can give staff a framework for decision making and action, increase role clarity, and increase psychological safety [96,97,98]. Providing resources such as training and time signals to staff that evaluation is valued and fosters a sense of capability in staff [96, 99,100,101]. Protected time for shared learning, including shared learning of evaluation practices, can build staff’s confidence in their abilities as they learn from their peers and troubleshoot challenges [87, 93]. Lastly, continuous onboarding and reinforcement of the responsibilities of new staff ensures staff are integrated into their roles and fosters confidence in their knowledge and abilities, including their perceptions of evaluation as a central aspect of their role [100, 102, 103].

CMOC 3 discusses the importance of co-production in evaluation to fuel learning and innovation sustainability by establishing feelings of evaluation ownership in staff. Specifically, co-produced evaluation may foster learning and program sustainability, and generate high quality evaluations with relevant and applicable outcomes [33, 104, 105]. Strategies such as collaboration between innovation and evaluation staff, re-visiting evaluation approaches to assess ongoing relevance, and possible establishment of “embedded evaluators” should be considered, as they can increase staff feelings of ownership and value in the evaluations they produce, contributing toward evaluation and innovation sustainability (Table 3) [105,106,107,108]. Of note, co-production often includes the engagement of patient, family, and community partners in the healthcare context, yet patient engagement was a clear gap in the literature that we reviewed. There is evidence to suggest that inclusion of patients, families, and community partners in all areas of LHS can lead to better health outcomes, more meaningful healthcare services for patients, and improved policy making [9]. The absence of patients, families, and community partners from the health innovation evaluation literature highlights the missed opportunities resulting from their exclusion from this important area of work. Future work should include patients, families, and community partners in evaluation of health innovations to realize the full benefits of embedding their expertise into learning health systems.

Implementation of learning feedback loops (CMOC 4) through collaborative data management, actioning evaluation results, and establishment of feedforward workflows [46] may increase the relevance of evaluations for staff. Collaborative evaluation practices such as joint data analysis meetings between evaluators and staff foster a deeper understanding of the evaluation findings, and increase the relevance and meaning of results for decision makers [107,108,109]. Feedforward loops provide future-oriented support to encourage improved organizational performance [110] by removing anticipated future barriers to evaluation and innovations and effectively acting on evaluation data [93, 110,111,112]. These collaborative data management and feedforward practices emphasize the relevance of staff’s evaluation work by using evaluation data to create positive change, thus supporting sustainable evaluation.

The final two CMOCs highlight the importance of establishing a sustainability culture and a learning organization through staff’s observation of the positive changes produced as a result of learning through evaluation. Translating learnings into visible improvements, providing feedback on these improvements, and establishing a cycle of learning and improvement showcases the value of evaluation and provides staff with evidence of positive impact, fostering their motivation to continue evaluation and innovation [92, 107, 113]. Strategies that support the establishment of learning organizations are also supported by the literature, where integration of experiential (tacit) knowledge and explicit (data-driven) knowledge fosters staff participation in collaborative knowledge exchange [114,115,116,117]. Both creating a sustainability culture and establishing a learning organization through action strategies that let staff observe positive change, and that position learning as a driver of that change, encourage continuous learning and innovation sustainability.

Finally, while many of the relationships explored in our CMOCs are derived from extant literature, future investigation should involve exploration of mid-range theories associated with the mechanisms of each CMOC. For instance, the CMOCs identified as Organizational Preconditions might relate to psychological safety, where a culture of shame and blame and an absence of risk tolerance may fuel fear in staff and discourage them from viewing evaluation as a tool of learning and innovation sustainability [118]. CMOCs identified in Sequence 1 may relate to the role of evaluative inquiry, where staff collaboration and cyclical learning from evaluations support the creation of learning organizations that view evaluation as a tool of learning and innovation sustainability [119]. Finally, Sequence 2 may relate to the pillars of the Dynamic Sustainability Framework (DSF) [18], where a sustainability culture and the establishment of a learning organization may reflect the DSF’s emphasis on continuous problem solving for the purpose of improvement.

Strengths and limitations

Due to the explosive growth in interest in health innovation sustainability from 2013 onward, we focused our review on the body of literature linking evaluation to learning and sustainability from the last decade. Additionally, we limited our screening of full text articles to the first 500 articles obtained for review efficiency. While these methodological decisions were validated with our Expert Committee and by checking our refined program theory for comprehensiveness and coherence, it is possible that inclusion of articles pre-dating 2013 or from outside of the first 500 full text articles would have changed the development of our refined program theory or CMOCs.

Strengths of this work include our ability to capture articles from a breadth of fields, adding to the generalizability and completeness of results. Additionally, the involvement of the Expert Committee in methodological and conceptual guidance throughout this work, and the diversity of expertise among its members, added to the rigour of results. Members of this Expert Committee included those with lived experience evaluating, developing, and sustaining healthcare innovations, leading healthcare organizations, and influencing systems. It is through these combined lenses that guidance was sought and given for this work. Admittedly, a different group of experts with different experiences may have ultimately derived different results, but we are confident that the combined expertise of this group and the rigour with which this synthesis was conducted have yielded a valid and important contribution to our understanding of what makes evaluation ‘work’ to promote learning and sustainability of health innovations.

Conclusion and recommendations

Through this realist synthesis, we found that organizational preconditions, as well as evaluation cycles that drive learning from health innovations, may over time shift organizations to establish a culture of innovation and evaluation sustainability. Our desire to translate findings into practical applications spurred the development of evidence-based Action Strategies associated with each phase of our program theory and CMOCs. Our results have implications for the Implementation Science community and those conducting evaluations of health innovations more broadly. Future research could include investigating mid-range theories connected to each identified mechanism and conducting field tests of the program theory and CMOCs by applying the action strategies and studying their effects on learning and health innovation sustainability.

Data availability

Requests for data beyond that publicly available in the manuscript and supporting files may be submitted to the corresponding author and will be available upon reasonable request.

References

  1. McMahon M, Nadigel J, Thompson E, Glazier RH. Informing Canada’s Health System Response to COVID-19: Priorities for Health Services and Policy Research. Healthc Policy. 2020;16(1):112–24.


  2. Brophy JT, Keith MM, Hurley M, McArthur JE. Sacrificed: Ontario Healthcare Workers in the Time of COVID-19. New Solut. 2021;30(4):267–81.


  3. Sen-Crowe B, Sutherland M, McKenney M, Elkbuli A. A Closer Look Into Global Hospital Beds Capacity and Resource Shortages During the COVID-19 Pandemic. J Surg Res. 2021;260:56–63.


  4. Craig-Schapiro R, Salinas T, Lubetzky M, et al. COVID-19 outcomes in patients waitlisted for kidney transplantation and kidney transplant recipients. Am J Transplant. 2021;21(4):1576–85.


  5. Evans RG, McGrail KM, Morgan SG, Barer ML, Hertzman C. APOCALYPSE NO: Population aging and the future of health care systems. Can J Aging. 2001;20:160–91.


  6. Dall TM, Gallo PD, Chakrabarti R, West T, Semilla AP, Storm MV. An aging population and growing disease burden will require a large and specialized health care workforce by 2025. Health Aff (Millwood). 2013;32(11):2013–20.


  7. Haldane V, De Foo C, Abdalla SM, et al. Health systems resilience in managing the COVID-19 pandemic: lessons from 28 countries. Nat Med. 2021;27(6):964–80.


  8. World Health Organization. Health Innovation for Impact. https://www.who.int/teams/digital-health-and-innovation/health-innovation-for-impact. Published 2023. Accessed Mar 2023.

  9. Lee-Foon NK, Smith M, Greene SM, Kuluski K, Reid RJ. Positioning patients to partner: exploring ways to better integrate patient involvement in the learning health systems. Res Involv Engagem. 2023;9(1):51.


  10. Bird M, McGillion M, Chambers EM, et al. A generative co-design framework for healthcare innovation: development and application of an end-user engagement framework. Res Involv Engagem. 2021;7(1):12.


  11. Tschimmel K. Design thinking as an effective toolkit for innovation. Proceedings of the XXIII ISPIM Conference: Action for Innovation: Innovating from Experience. Barcelona: Spain. 2012.

  12. Beckman S, Barry M. Innovation as a learning process: embedding design thinking. Calif Manag Rev. 2007;50(1):25–56.

  13. Braithwaite J, Glasziou P, Westbrook J. The three numbers you need to know about healthcare: the 60-30-10 Challenge. BMC Med. 2020;18(1):102.


  14. Greenhalgh T, Russell J. Why do evaluations of eHealth programs fail? An alternative set of guiding principles. PLoS Med. 2010;7(11):e1000360.


  15. Graham ID, Logan J, Harrison MB, et al. Lost in knowledge translation: time for a map? J Contin Educ Health Prof. 2006;26(1):13–24.


  16. Friedman CP, Rubin JC, Sullivan KJ. Toward an Information Infrastructure for Global Health Improvement. Yearb Med Inform. 2017;26(1):16–23.


  17. Reid R, Wodchis W, Kuluski K, et al. Actioning the Learning Health System: an applied framework for integrating research into health systems. SSM - Health Systems. 2024;2:100010.

  18. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8:117.


  19. Wong G, Westhorp G, Pawson R, Greenhalgh T. Realist Synthesis: RAMESES Training Materials. 2013. https://www.ramesesproject.org/media/Realist_reviews_training_materials.pdf. Accessed 19 Oct 2022.

  20. Menear M, Blanchette MA, Demers-Payette O, Roy D. A framework for value-creating learning health systems. Health Res Policy Syst. 2019;17(1):79.


  21. van Diggele C, Burgess A, Roberts C, Mellis C. Leadership in healthcare education. BMC Med Educ. 2020;20(Suppl 2):456.


  22. Institute of Medicine. The Learning Healthcare System: Workshop Summary. Paper presented at: IOM Roundtable on Evidence-Based Medicine; 2007.


  23. Wong G, Greenhalgh T, Westhorp G, Buckingham J, Pawson R. RAMESES publication standards: realist syntheses. BMC Med. 2013;11:21.


  24. Fleiszer AR, Semenic SE, Ritchie JA, Richer MC, Denis JL. The sustainability of healthcare innovations: a concept analysis. J Adv Nurs. 2015;71(7):1484–98.


  25. Bailie J, Laycock AF, Peiris D, et al. Using developmental evaluation to enhance continuous reflection, learning and adaptation of an innovation platform in Australian Indigenous primary healthcare. Health Res Policy Syst. 2020;18(1):45.


  26. Cote-Boileau E, Denis JL, Callery B, Sabean M. The unpredictable journeys of spreading, sustaining and scaling healthcare innovations: a scoping review. Health Res Policy Syst. 2019;17(1):84.


  27. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38(1):4–23.


  28. May CM, Finch T. Implementing, Embedding, and Integrating Practices: An Outline of Normalization Process Theory. Sociology. 2009;43(3):535–54.


  29. May CR, Mair F, Finch T, et al. Development of a theory of implementation and integration: Normalization Process Theory. Implement Sci. 2009;4:29.


  30. Wong G, Westhorp G, Greenhalgh J, Manzano A, Jagosh J, Greenhalgh T. Quality and reporting standards, resources, training materials and information for realist evaluation: the RAMESES II project. Health Serv Deliv Res. 2017;5(28):1–134.

  31. Pawson R, Greenhalgh T, Harvey G, Walshe K. Realist synthesis: an introduction. ESRC Res Methods Program. 2004;2:55.

  32. Bird M, Cote-Boileau E, Wodchis WP, et al. Exploring the impact of evaluation on learning and health innovation sustainability: protocol for a realist synthesis. Syst Rev. 2023;12(1):188.


  33. Best A, Greenhalgh T, Lewis S, Saul JE, Carroll S, Bitz J. Large-system transformation in health care: a realist review. Milbank Q. 2012;90(3):421–56.


  34. Pawson R. Digging for nuggets: How ‘bad’ research can yield ‘good’ evidence. Int J Soc Res Methodol. 2006;9:127–42.


  35. Dedoose Version 9.0.85. Web application for managing, analyzing, and presenting qualitative and mixed method research data. SocioCultural Research Consultants, LLC. www.dedoose.com. Published 2021.

  36. Armstrong N, Brewster L, Tarrant C, et al. Taking the heat or taking the temperature? A qualitative study of a large-scale exercise in seeking to measure for improvement, not blame. Soc Sci Med. 2018;198:157–64.


  37. He AJ. Scaling-up through piloting: dual-track provider payment reforms in China’s health system. Health Policy Plan. 2023;38(2):218–27.


  38. Abimbola S, Patel B, Peiris D, et al. The NASSS framework for ex post theorisation of technology-supported change in healthcare: worked example of the TORPEDO programme. BMC Med. 2019;17(1):233.


  39. Steele Gray C. Overcoming Political Fragmentation: The Potential of Meso-Level Mechanisms Comment on "Integration or Fragmentation of Health Care? Examining Policies and Politics in a Belgian Case Study". Int J Health Policy Manag. 2022;12:7075.

  40. Agency for Healthcare Research and Quality. Module 8: Organizational Learning and Sustainability. In: Communication and Optimal Resolution (CANDOR) Toolkit. Rockville, MD; 2016. https://www.ahrq.gov/patient-safety/settings/hospital/candor/modules/notes8.html.

  41. Ellis LA, Sarkies M, Churruca K, et al. The science of learning health systems: scoping review of empirical research. JMIR Med Inform. 2022;10(2):e34907.


  42. Labrique A, Vasudevan L, Weiss W, Wilson K. Establishing Standards to Evaluate the Impact of Integrating Digital Health into Health Systems. Glob Health Sci Pract. 2018;6(Suppl 1):S5–17.


  43. Alami H, Fortin JP, Gagnon MP, Pollender H, Tetu B, Tanguay F. The Challenges of a Complex and Innovative Telehealth Project: A Qualitative Evaluation of the Eastern Quebec Telepathology Network. Int J Health Policy Manag. 2018;7(5):421–32.


  44. Reed JE, Howe C, Doyle C, Bell D. Simple rules for evidence translation in complex systems: A qualitative study. BMC Med. 2018;16(1):92.


  45. Foley T, Horwitz L, Zahran R. Realising the Potential of Learning Health Systems. UK: Newcastle University; 2021.


  46. Sheikh K, Abimbola S. Learning health systems: Pathways to progress. World Health Organization; 2021.

  47. Hazel E, Chimbalanga E, Chimuna T, et al. Using Data to Improve Programs: Assessment of a Data Quality and Use Intervention Package for Integrated Community Case Management in Malawi. Glob Health Sci Pract. 2017;5(3):355–66.


  48. Ament SM, Gillissen F, Moser A, et al. Identification of promising strategies to sustain improvements in hospital practice: a qualitative case study. BMC Health Serv Res. 2014;14:641.


  49. Reynolds J, DiLiberto D, Mangham-Jefferies L, et al. The practice of “doing” evaluation: lessons learned from nine complex intervention trials in action. Implement Sci. 2014;9:75.


  50. Fleiszer AR, Semenic SE, Ritchie JA, Richer MC, Denis JL. A unit-level perspective on the long-term sustainability of a nursing best practice guidelines program: An embedded multiple case study. Int J Nurs Stud. 2016;53:204–18.


  51. Greenhalgh T, Wherton J, Papoutsi C, et al. Beyond Adoption: A New Framework for Theorizing and Evaluating Nonadoption, Abandonment, and Challenges to the Scale-Up, Spread, and Sustainability of Health and Care Technologies. J Med Internet Res. 2017;19(11):e367.


  52. Goldman J, Rotteau L, Flintoft V, Jeffs L, Baker GR. Measurement and Monitoring of Safety Framework: a qualitative study of implementation through a Canadian learning collaborative. BMJ Qual Saf. 2022;32(8):470–8. https://doi.org/10.1136/bmjqs-2022-015017.

  53. Swanson NM, Elgersma KM, McKechnie AC, et al. Encourage, Assess, Transition (EAT): a quality improvement project implementing a direct breastfeeding protocol for preterm hospitalized infants. Adv Neonatal Care. 2022;23(2):107–19. https://doi.org/10.1097/ANC.0000000000001037.

  54. Cresswell KM, Bates DW, Sheikh A. Ten key considerations for the successful implementation and adoption of large-scale health information technology. J Am Med Inform Assoc. 2013;20(e1):e9–13.


  55. Brunton L, Sammut-Powell C, Birleson E, et al. Scale-up of ABC care bundle for intracerebral haemorrhage across two hyperacute stroke units in one region in England: a mixed methods evaluation of a quality improvement project. BMJ Open Qual. 2022;11(2):e001601.

  56. Steels S, Ainsworth J, van Staa TP. Implementation of a “real-world” learning health system: Results from the evaluation of the Connected Health Cities programme. Learn Health Syst. 2021;5(2):e10224.


  57. van Gemert-Pijnen JE, Nijland N, van Limburg M, et al. A holistic framework to improve the uptake and impact of eHealth technologies. J Med Internet Res. 2011;13(4):e111.


  58. Ovretveit J, Gustafson D. Evaluation of quality improvement programmes. Qual Saf Health Care. 2002;11(3):270–5.


  59. Rey E, Laprise M, Lufkin S. Sustainability monitoring: Principles, challenges, and approaches. In: Neighbourhoods in Transition: Brownfield Regeneration in European Metropolitan Areas. Springer; 2022. p. 121–42.

  60. Rycroft-Malone J. The PARIHS framework–a framework for guiding the implementation of evidence-based practice. J Nurs Care Qual. 2004;19(4):297–304.


  61. Shelton RC, Cooper BR, Stirman SW. The Sustainability of Evidence-Based Interventions and Practices in Public Health and Health Care. Annu Rev Public Health. 2018;39:55–76.


  62. Proctor E, Luke D, Calhoun A, et al. Sustainability of evidence-based healthcare: research agenda, methodological advances, and infrastructure support. Implement Sci. 2015;10:88.


  63. Bonten TN, Rauwerdink A, Wyatt JC, et al. Online Guide for Electronic Health Evaluation Approaches: Systematic Scoping Review and Concept Mapping Study. J Med Internet Res. 2020;22(8):e17774.


  64. Fletcher A, Jamal F, Moore G, Evans RE, Murphy S, Bonell C. Realist complex intervention science: Applying realist principles across all phases of the Medical Research Council framework for developing and evaluating complex interventions. Evaluation (Lond). 2016;22(3):286–303.


  65. Greenhalgh T, Papoutsi C. Studying complexity in health services research: desperately seeking an overdue paradigm shift. BMC Med. 2018;16(1):95.


  66. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.


  67. Dekker-van Doorn C, Wauben L, van Wijngaarden J, Lange J, Huijsman R. Adaptive design: adaptation and adoption of patient safety practices in daily routines, a multi-site study. BMC Health Serv Res. 2020;20(1):426.


  68. Bhatta S, Rajbhandari S, Kalaris K, Carmone AE. The Logarithmic Spiral of Networks of Care for Expectant Families in Rural Nepal: A Descriptive Case Study. Health Syst Reform. 2020;6(2):e1824520.


  69. Abbott PA, Foster J, Marin Hde F, Dykes PC. Complexity and the science of implementation in health IT–knowledge gaps and future visions. Int J Med Inform. 2014;83(7):e12–22.


  70. Desveaux L, Budhwani S, Stamenova V, Bhattacharyya O, Shaw J, Bhatia RS. Closing the Virtual Gap in Health Care: A Series of Case Studies Illustrating the Impact of Embedding Evaluation Alongside System Initiatives. J Med Internet Res. 2021;23(9):e25797.


  71. Safaeinili N, Brown-Johnson C, Shaw JG, Mahoney M, Winget M. CFIR simplified: Pragmatic application of and adaptations to the Consolidated Framework for Implementation Research (CFIR) for evaluation of a patient-centered care transformation within a learning health system. Learn Health Syst. 2020;4(1):e10201.


  72. Hutchinson E, Nayiga S, Nabirye C, et al. Opening the “black box” of collaborative improvement: a qualitative evaluation of a pilot intervention to improve quality of malaria surveillance data in public health centres in Uganda. Malar J. 2021;20(1):289.


  73. Schlieter H, Marsch LA, Whitehouse D, et al. Scale-up of Digital Innovations in Health Care: Expert Commentary on Enablers and Barriers. J Med Internet Res. 2022;24(3):e24582.


  74. Corrado J, Jackson O, Baxandall D, et al. Get Parkinson’s medications on time: the Leeds QI project. Age Ageing. 2020;49(5):865–72.


  75. Kara N, Firestone R, Kalita T, et al. The BetterBirth Program: Pursuing Effective Adoption and Sustained Use of the WHO Safe Childbirth Checklist Through Coaching-Based Implementation in Uttar Pradesh, India. Glob Health Sci Pract. 2017;5(2):232–43.


  76. Fleiszer AR, Semenic SE, Ritchie JA, Richer MC, Denis JL. An organizational perspective on the long-term sustainability of a nursing best practice guidelines program: a case study. BMC Health Serv Res. 2015;15:535.


  77. Andersson AC. Managers’ views and experiences of a large-scale county council improvement program: limitations and opportunities. Qual Manag Health Care. 2013;22(2):152–60.


  78. Austin EJ, LeRouge C, Lee JR, et al. A learning health systems approach to integrating electronic patient-reported outcomes across the health care organization. Learn Health Syst. 2021;5(4):e10263.


  79. Jones SL, Ashton CM, Kiehne L, et al. Reductions in Sepsis Mortality and Costs After Design and Implementation of a Nurse-Based Early Recognition and Response Program. Jt Comm J Qual Patient Saf. 2015;41(11):483–91.


  80. Godfrey CM, Kircher C, Ashoor HM, et al. Absorptive capacity in the adoption of innovations in health: a scoping review. JBI Evid Synth. 2023;21(1):6–32.


  81. Gotlib Conn L, McKenzie M, Pearsall EA, McLeod RS. Successful implementation of an enhanced recovery after surgery programme for elective colorectal surgery: a process evaluation of champions’ experiences. Implement Sci. 2015;10:99.


  82. Douglas S, Button S, Casey SE. Implementing for Sustainability: Promoting Use of a Measurement Feedback System for Innovation and Quality Improvement. Adm Policy Ment Health. 2016;43(3):286–91.


  83. Chandra-Mouli V, Gibbs S, Badiani R, Quinhas F, Svanemyr J. Programa Geracao Biz, Mozambique: how did this adolescent health initiative grow from a pilot to a national programme, and what did it achieve? Reprod Health. 2015;12:12.


  84. Chandler J, Rycroft-Malone J, Hawkes C, Noyes J. Application of simplified Complexity Theory concepts for healthcare social systems to explain the implementation of evidence into practice. J Adv Nurs. 2016;72(2):461–80.


  85. Gage AD, Gotsadze T, Seid E, Mutasa R, Friedman J. The influence of Continuous Quality Improvement on healthcare quality: A mixed-methods study from Zimbabwe. Soc Sci Med. 2022;298:114831.


  86. Steele Gray C, Baker RG, Breton M, et al. Will the “new” become the “normal”? Exploring sustainability of rapid health system transformations. In: Waring J, Denis J-L, Pedersen AR, Tenbensel T, eds. Organising care in a time of COVID-19: implications for leadership, governance, and policy. Cham: Palgrave Macmillan; 2021.

  87. Edmondson A. Psychological safety and learning behavior in work teams. Adm Sci Q. 1999;44(2):350–83.


  88. Khanna R, Guler I, Nerkar A. Fail often, fail big, and fail fast? Learning from small failures and R&D performance in the pharmaceutical industry. Acad Manag J. 2016;59(2):436–59.


  89. Olson A. Building a collaborative culture in a middle school: A case study. University of South Florida. 2019.

  90. Edmondson AC, Lei Z. Psychological safety: The history, renaissance, and future of an interpersonal construct. Annu Rev Organ Psychol Organ Behav. 2014;1(1):23–43.


  91. Kostova T, Roth K. Adoption of an organizational practice by subsidiaries of multinational corporations: Institutional and relational effects. Acad Manag J. 2002;45(1):215–33.


  92. Deci EL, Ryan RM. The "what" and "why" of goal pursuits: Human needs and the self-determination of behavior. Psychol Inq. 2000;11(4):227–68.


  93. Edmondson AC. Strategies for learning from failure. Harv Bus Rev. 2011;89(4):48–55.


  94. Fisher CD, Ashkanasy NM. The emerging role of emotions in work life: An introduction. J Organ Behav. 2000;21:123–9.


  95. Pisano GP. You need an innovation strategy. Harv Bus Rev. 2015;93(6):44–54.


  96. Kozlowski SW, Ilgen DR. Enhancing the effectiveness of work groups and teams. Psychological science in the public interest. 2006;7(3):77–124.


  97. Kahn WA. Psychological conditions of personal engagement and disengagement at work. Acad Manag J. 1990;33(4):692–724.


  98. Salas E, Sims DE, Burke CS. Is there a “big five” in teamwork? Small group research. 2005;36(5):555–99.


  99. Salas E, Tannenbaum SI, Kraiger K, Smith-Jentsch KA. The science of training and development in organizations: What matters in practice. Psychological science in the public interest. 2012;13(2):74–101.


  100. Joo B-K, Shim JH. Psychological empowerment and organizational commitment: the moderating effect of organizational learning culture. Hum Resour Dev Int. 2010;13(4):425–41.


  101. Noe RA, Tews MJ, McConnell DA. Learner engagement: A new perspective for enhancing our understanding of learner motivation and workplace learning. Acad Manag Ann. 2010;4(1):279–315.


  102. Cable DM, Parsons CK. Socialization tactics and person-organization fit. Pers Psychol. 2001;54(1):1–23.


  103. Saks AM, Ashforth BE. The role of dispositions, entry stressors, and behavioral plasticity theory in predicting newcomers’ adjustment to work. J Organ Behav. 2000;21(1):43–62.


  104. Stufflebeam DL, Zhang G. The CIPP evaluation model: How to evaluate for improvement and accountability. New York: Guilford Publications; 2017.

  105. Cousins JB, Whitmore E. Framing participatory evaluation. N Dir Eval. 1998;1998(80):5–23.


  106. Bryson JM. What to do when stakeholders matter: stakeholder identification and analysis techniques. Public Manag Rev. 2004;6(1):21–53.


  107. Fetterman D, Wandersman A. Empowerment evaluation: Yesterday, today, and tomorrow. Am J Eval. 2007;28(2):179–98.


  108. Patton MQ. Developmental evaluation: applying complexity concepts to enhance innovation and use. New York: Guilford Press; 2010.

  109. Patton MQ. Essentials of utilization-focused evaluation. Los Angeles: Sage; 2011.

  110. Goldsmith M, Morgan H. Leadership is a contact sport: the "follow-up factor" in management development. Strategy+Business. 2004;36:71–79.

  111. Ulrich D, Smallwood WN, Sweetman K. The leadership code: five rules to lead by. Brighton: Harvard Business Press; 2008.

  112. London M, Smither JW. Feedback orientation, feedback culture, and the longitudinal performance management process. Hum Resour Manag Rev. 2002;12(1):81–100.


  113. Kluger AN, DeNisi A. The effects of feedback interventions on performance: a historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychol Bull. 1996;119(2):254.


  114. Wenger E. Communities of practice: learning, meaning, and identity. Cambridge: Cambridge University Press; 1999.

  115. Garvin DA, Edmondson AC, Gino F. Is yours a learning organization? Harv Bus Rev. 2008;86(3):109.


  116. Edmondson AC. The fearless organization: creating psychological safety in the workplace for learning, innovation, and growth. New York: Wiley; 2018.

  117. Crossan MM, Lane HW, White RE. An organizational learning framework: From intuition to institution. Acad Manag Rev. 1999;24(3):522–37.


  118. Frazier ML, Fainshmidt S, Klinger RL, Pezeshkan A, Vracheva V. Psychological safety: A meta-analytic review and extension. Pers Psychol. 2017;70(1):113–65.


  119. Torres RT, Preskill H. Evaluation and organizational learning: Past, present, and future. The American Journal of Evaluation. 2001;22(3):387–95.



Acknowledgements

Not applicable.

Funding

This work was partially supported with funding from the Canada Research Chair Program via Dr. Carolyn Steele Gray. Dr. Marissa Bird receives salary support from the Canadian Institutes of Health Research Health System Impact Embedded Scientist Award. The funding bodies had no role in study design, data collection, or analysis and interpretation.

Author information


Contributions

MB conceived of the study and contributed to data collection, synthesis, and drafting the initial manuscript. MM, JS, WPW, LJ, ÉCB, RR, and CSG provided expert guidance on realist methodology and iterative conceptual development of the study topic. TA and FB provided support for data collection and synthesis. BP provided support for manuscript preparation. All authors contributed to critical revision of the manuscript, including provision of intellectual content, and have read and approved the final manuscript.

Corresponding author

Correspondence to Marissa Bird.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.



Cite this article

Bird, M., MacPhee, M., Shaw, J. et al. Evaluating for learning and sustainability (ELS) framework: a realist synthesis. BMC Health Serv Res 25, 683 (2025). https://doi.org/10.1186/s12913-025-12743-4
