The integration of quality improvement and implementation science methods and frameworks in healthcare: a systematic review
BMC Health Services Research volume 25, Article number: 558 (2025)
Abstract
Objectives
Quality Improvement (QI) and Implementation Science (IS) are both frequently utilised in health research. Little is known about how they are integrated within studies, and whether their combination adds value. This systematic review sought to investigate how QI and IS theories and strategies are integrated within healthcare-based studies.
Methods
A systematic search was conducted across five databases. Duplicates, studies published prior to 2014, systematic and scoping reviews, and study protocols were removed. The retrieved titles and abstracts were screened, and the full texts of eligible studies were reviewed in pairs using Covidence software. Data were extracted from the included studies using a predefined template, and studies were critically appraised using the QI Minimum Quality Criteria Set. A frequency analysis of the use of QI and IS tools was conducted, as well as a narrative analysis of the integration of QI and IS in each study.
Results
The database search returned 3,407 titles and abstracts, of which 1,618 were screened. Assessment for eligibility identified 149 studies whose full texts were reviewed, of which 12 were included in the final analysis. These 12 studies integrated QI and IS methods to implement an intervention in tertiary healthcare. The Plan-Do-Study-Act (PDSA) cycle was the most frequently used QI tool, and the Theoretical Domains Framework, the Behaviour Change Wheel (including Capability, Opportunity and Motivation) and the Consolidated Framework for Implementation Research were the most frequently used IS frameworks.
Conclusion
The study highlights a lack of consistent terminology across the QI and IS fields, as well as opportunities for greater integration of the two fields to enhance study design, implementation and sustainability, and to improve healthcare performance.
Introduction
Quality Improvement (QI) and Implementation Science (IS) share a common goal of improving quality in healthcare. While there are similarities across both disciplines, their histories and modus operandi vary. There are many definitions of QI; however, the most commonly quoted is that of the Academy of Medical Royal Colleges, which suggests moving away from a single method or set of tools and instead thinking of QI as a systematic, continuous approach to problem solving in healthcare, with the aim of improving service provision, providing better quality of care and, ultimately, better outcomes for patients [1]. QI has a long track record grounded in healthcare, and QI studies commonly focus on identifying specific, local and context-specific challenges in a health system at the provider, clinic or patient level [2]. Adopting a wide range of assessment and measurement methods, many of which have been adapted from business, such as Lean and Six Sigma [3], QI identifies the locus of a health system challenge in order to design and test setting-specific interventions [1, 4].
Implementation Science (IS), “the scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice, and, hence, to improve the quality and effectiveness of health services” ([5], p1), has a more recent history, originating in rural sociology [6]. IS draws on theories, models and frameworks from behaviour change and social psychology to design and test implementation strategies that support the uptake or adoption of evidence-based interventions. IS explicitly considers the creation of generalisable evidence that can be used in other settings beyond the immediate context. QI and IS share a common ambition and an attention to process and outcomes, as well as some common methods. A recent review compared and contrasted studies using QI or IS methods and approaches to achieve practice change in cancer care, highlighting potential synergies to reduce duplication and enhance care outcomes [7].
Despite the complementarity of these two approaches to improving quality in healthcare, endeavours to bring the two disciplines together have been limited, and the use of terminology in both improvement and implementation research has been unclear. While much of the terminology of QI and IS appears at face value to be straightforward, there is concern in the field that the underuse and misuse of theories, models and frameworks presents a challenge to growing the evidence base in improvement and implementation research [8].
The aim of this review was to understand the way in which QI and IS theories and strategies are integrated within healthcare-based studies. To the best of our knowledge, this synthesis has not previously been undertaken across healthcare services.
Methods
The protocol for this systematic review was registered on PROSPERO (2024) (registration no. CRD42024553059). The review follows the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines [9] (Supplementary file 1. PRISMA checklist). This systematic review aimed to answer the research question: “How do hospital-based studies integrate QI and IS methods, theories, tools and strategies?”.
Search strategy
Title and abstract searches were conducted across five databases (Ovid Embase, Ovid MEDLINE, Ovid Emcare, CINAHL and Web of Science) in June 2024. Librarian advice and support were sought to refine the search strategy (Supplementary file 2. Medline search strategy). The search included studies from 2014 to June 2024 using the Embase search string:
(exp Implementation Science/or exp "diffusion of innovation"/or ("The Consolidated Framework for Implementation Research" or "Theoretical domains framework" or "Reach effectiveness adoption implementation Maintenance" or "RE AIM" or "The Knowledge-to-Action Framework" or "Diffusion of Innovation* Theory" or "Implementation climate scale" or "Com-b" or "reach, effectiveness, adoption, implementation, and maintenance framework").ti,ab,kf.) AND (exp Quality Improvement/or total quality management/or exp "Root Cause Analysis"/or ("Quality Improvement" or "total quality management" or "Continuous Improvement" or "Improvement science" or "lean methodology" or "Lean management" or "Plan-Do-Study-Act cycle" or "PDSA" or RCA or "Root cause analys*" or Kaizen or "Six sigma" or "six sigma methodology" or "Institute for Healthcare Improvement Model for Improvement" or "Theory of constraint*").ti,ab,kf.).
Inclusion and exclusion criteria
Inclusion criteria
Studies were included if they were based in a hospital setting; were about a healthcare condition and/or healthcare professionals; and integrated QI methods, theories and frameworks with IS theories, models or frameworks within the implementation of an intervention. Studies must have stated that they used QI methods and have provided evidence of using QI methods/models/theories/frameworks. Studies must also have stated that they used IS methods and have provided evidence of using IS methods/models/theories/frameworks. Studies must have had the full text of an empirical study available and have been published in English in a peer-reviewed journal between 2014 and June 2024. Studies were limited to tertiary hospital settings to enable comparison between similar settings, while studies published since 2014 were included to review contemporary literature reflecting current trends in methodology use and integration.
Exclusion criteria
Studies were excluded if they used IS theories/models/frameworks for diagnostic purposes only (for example, using IS theory to identify barriers and facilitators to the implementation of an intervention, without reporting the application of those findings in the implementation of the intervention). Review articles identified by the search were examined for snowballing of additional studies but were otherwise excluded from the analysis.
Study selection
Titles and abstracts were downloaded from the databases and screened against the inclusion criteria. Titles were divided and screened by six pairs of reviewers (MB paired with PH, SB, SW, SH, ZF and LAE) using Covidence software [10]. The full texts of articles whose abstracts met the inclusion criteria were then retrieved, divided and reviewed by four pairs (MB paired with SB, SW, ZF and SH), again using Covidence software. All disagreements were discussed as a group and resolved through team consensus. MB reviewed all titles and full texts to increase consistency and rigour.
Data extraction
Data were extracted from each eligible study and recorded in a purpose-designed Excel spreadsheet. Data included: citation; the location of the study (country and setting, e.g., hospital); the study design; the population studied (including staff or patients); data collection methods; the QI change initiative; the study aim; IS elements identified in the study; QI elements identified in the study; and the described process of integration of QI and IS elements. We also extracted whether ethics approval was sought or received, and whether studies described following a reporting guideline. Data were extracted from the included studies by MB and verified by one co-author (ZF). Disagreements or discrepancies were resolved by team consensus.
Quality appraisal
The Quality Improvement Minimum Quality Criteria Set (QI-MQCS) was used to critically appraise the reporting of the included studies. This tool guides the assessment of each study across 16 domains, or reporting standards, to determine whether the minimum criteria were met. For a study to be considered high quality, at least 14 of the 16 criteria must be met [11]. This tool was deemed appropriate given that all included studies identified as QI projects.
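As an illustration only (the QI-MQCS does not come with code, and the domain labels and example data below are hypothetical placeholders), the appraisal logic amounts to a simple tally of binary domain judgements, with 14 or more of the 16 domains met treated as the threshold for a high-quality report:

```python
# Hypothetical sketch of the QI-MQCS tally described above: each study
# receives a binary judgement on the 16 domains, and a study meeting
# 14 or more domains is treated as a high-quality report.
# Domain labels and example data are illustrative placeholders only.

TOTAL_DOMAINS = 16
HIGH_QUALITY_THRESHOLD = 14  # minimum number of domains met


def appraise(domains_met: dict) -> tuple:
    """Return (number of domains met, whether the high-quality threshold is reached)."""
    score = sum(bool(v) for v in domains_met.values())
    return score, score >= HIGH_QUALITY_THRESHOLD


# Example: a hypothetical study judged to meet 12 of the 16 domains.
example_study = {f"domain_{i}": (i <= 12) for i in range(1, TOTAL_DOMAINS + 1)}
score, high_quality = appraise(example_study)
print(f"QI-MQCS score: {score}/{TOTAL_DOMAINS}, high quality: {high_quality}")
```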
Data analysis and synthesis
After extracting key data, a frequency count of each QI or IS theory/tool/method used was conducted, along with a narrative synthesis [12] of the methods of QI and IS integration in the included studies. This narrative analysis identified why each tool/method/theory was used, for example, to identify barriers and facilitators (B&Fs) to implementation. Categorising the use of each tool in this way allowed the inductive identification of key study phases in which each of the tools and methods was used. These study phases were reviewed and defined by five reviewers (MB, PH, SW, SB and ZF) and agreed upon through team consensus. The frequency count was then used to identify how often QI or IS methods/tools/theories were used across the different study phases (a simple worked sketch of this counting step is shown after the list below). A fuller explanation of the analysis can be seen in Supplementary file 3. The key inductively identified study phases included:
- The System diagnostic phase, which we defined as an assessment of the extent and/or nature of an issue being targeted to improve performance or outcomes, and identification of B&Fs to implementation. This included QI methods/tools/theories used to identify B&Fs to implementation (e.g., process mapping, fishbone/cause-and-effect diagrams, Pareto charts, force field analysis, impact effort matrices, and histograms), and IS tools/theories used to identify B&Fs to implementation (e.g., COM-B, TDF, CFIR).
- The Intervention design phase, which typically involved the design, development and refinement of an intervention. This included QI/IS methods/tools/theories used to inform the design of the QI intervention.
- The Implementation of intervention phase, which typically included intervention testing and embedded strategies to implement the intervention. This included QI tools that guided implementation strategies (e.g., Plan-Do-Study-Act (PDSA) cycles, Audit and Feedback (A&F), and champions), IS tools/theories that guided implementation strategies, and feasibility and usability testing.
- The Scale/spread or sustainability phase, which included scale-up of the intervention to a larger or different team or setting, with consideration of the ongoing maintenance of the implementation of the intervention. This included IS tools/theories used to determine whether it was appropriate to upscale the intervention across the organisation.

In addition to these four phases, two further categories were included in the analysis: Methodology (methodologies applied across the entire span of the study, such as Lean Six Sigma) and Measurement tools (such as control charts and run charts).
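To make the counting step concrete, the sketch below shows one hypothetical way the frequency analysis could be implemented: each extracted record pairs a study with a phase and the tool/theory used there, and tool use is tallied per phase. The records, names and output format are illustrative assumptions, not the extracted review data or the authors' actual procedure.

```python
# Minimal sketch of the frequency analysis described above: tally how often
# each QI or IS tool/theory was used within each inductively identified
# study phase. The example records are invented for illustration and are
# not the extracted review data.
from collections import Counter, defaultdict

# Each record pairs a study with the phase and the tool/theory it used there.
records = [
    ("study_01", "System diagnostic", "Process mapping"),
    ("study_01", "Implementation of intervention", "PDSA"),
    ("study_02", "System diagnostic", "TDF"),
    ("study_02", "Intervention design", "COM-B"),
    ("study_02", "Implementation of intervention", "PDSA"),
]

counts_by_phase = defaultdict(Counter)
for _study, phase, tool in records:
    counts_by_phase[phase][tool] += 1

for phase, counts in counts_by_phase.items():
    print(phase, dict(counts))
```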
Results
Study selection
The five-database search returned 3,407 titles (Ovid MEDLINE (n = 1,384), Ovid Embase (n = 1,018), Ovid Emcare (n = 406), CINAHL (n = 137) and Web of Science (n = 462)). Duplicates were removed (n = 1,056), as well as studies published prior to 2014 (n = 616) and systematic reviews, scoping reviews and study protocols (n = 117). A total of 1,618 titles and abstracts were then screened in Covidence software, resulting in the exclusion of 1,469 studies that did not meet the inclusion criteria. Full-text screening was undertaken on the remaining 149 studies, and a further 137 studies were excluded. A total of 12 manuscripts met the inclusion criteria and were included in the final review [13,14,15,16,17,18,19,20,21,22,23,24] (Fig. 1). No additional studies were identified during the snowball analysis of included studies.
The six main reasons for exclusion were: 1) QI was stated but not described, which typically included studies that described the project as a QI project but did not clearly describe QI methods or tools (n = 74); 2) IS was stated but not described, which typically included studies that described the project as integrating IS elements or theories but did not clearly describe the IS theory or methods (n = 23); 3) IS was used for diagnostic purposes only, which typically included studies that used an IS theory, framework or model to inform their evaluation of barriers and facilitators to implementation but did not report the application of those findings (n = 64); 4) the study was not hospital- or tertiary care-based (n = 24); 5) the full-text search identified that the title referred to a conference abstract, preprint or thesis (n = 18); and 6) no empirical data were reported (including reviews) (n = 4). Some studies had multiple reasons for exclusion.
The interrater reliability between reviewer pairs was initially poor, with Cohen’s kappa scores [25] ranging from slight agreement (0.10–0.20) to fair agreement (0.21–0.40), reflecting the complexity of this review. As a result, all disagreements were discussed in regular team meetings, and consensus was reached as to whether a manuscript would be included or excluded, and why.
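For reference (the formula is not given in the paper), Cohen’s kappa adjusts the observed agreement between two reviewers for the agreement expected by chance:

\[ \kappa = \frac{p_o - p_e}{1 - p_e} \]

where \(p_o\) is the observed proportion of screening decisions on which a pair agreed and \(p_e\) is the proportion of agreement expected by chance given each reviewer’s include/exclude rates; the 0.10–0.20 and 0.21–0.40 ranges quoted above are the conventional “slight” and “fair” agreement bands.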
Critical appraisal
The QI-MQCS tool was used to critically appraise the 12 included studies. Only one quarter of the studies (n = 3, 25%) [13, 20, 24] met the QI-MQCS minimum standard for reporting, with a score of at least 14 of the 16 QI criteria [11] (Supplementary file 4). The mean QI-MQCS quality score was 11.8 (95% CI 10.97–12.70). All studies reported the following domains: Organisational motivation, Intervention rationale, Intervention description, Implementation, Data source, Timing and Limitations. The domains least often reported were: Spread (n = 3), Health outcomes (n = 3), Study design (n = 4), Penetration/Reach (n = 7), Sustainability (n = 7), Comparator (n = 7), Adherence/Fidelity (n = 8), Organisational readiness (n = 10) and Organisational characteristics (n = 10).
Study characteristics
Study design
Of the 12 included studies, over half described their study as a QI study without explicitly reporting a study design or methodology [14,15,16,19,22,23,24]. The remaining five studies provided details about their study design, describing their studies as a staggered, pre-post quasi-experimental implementation study [13], implementation research [17], a sequential explanatory mixed methods study [18], a participatory design methodology [20], and a participatory research study [21].
Study setting and topics
All studies were conducted in hospital settings, most commonly within the United States of America (USA) (n = 4), followed by Canada (n = 2) (with an additional study potentially based in Canada, although this was not explicitly described [19]), the United Kingdom (UK) (n = 2), Brazil (n = 1), Ghana (n = 1), and Uganda (n = 1). The QI project topics were mostly heterogeneous. Two studies focused on reducing sepsis, one in a Neonatal Intensive Care Unit (NICU) [16] and the other in adult patients [22], and two studies related to improving the appropriate use of laboratory tests, one in the Emergency Department (ED) [23] and one specifically reducing Blood Urea Nitrogen (BUN) ordering [18]. Other studies focused on enhancing vital sign collection [13], developing a virtual cardiac rehabilitation program [14], developing a standardised post-fall debrief tool [15], implementing a screening tool to improve pain management referrals [17], improving SpO2 maintenance in the NICU [19], developing an individualised performance data dashboard for clinicians [20], developing a care protocol for premature newborns in their first hour of life [21], and introducing an intradialytic exercise program for haemodialysis patients [24] (Table 1).
Study participants
All of the studies involved healthcare professionals (HCPs), while some studies also included administrators [14], managers [21], and quality and risk management staff [15]. The studies included a mostly heterogeneous set of patient cohorts with various health conditions: four studies included ‘hospitalised’ patients [13], including three studies of patients in the ED setting [18, 20, 23]; three studies included sick infants [16], including preterm infants [19] and babies and their mothers [21]; and other studies included cardiac rehabilitation patients [14], fall patients [15], children and young people with sickle cell disease [17], patients with sepsis [22], and patients on haemodialysis [24].
Study methods
The most commonly reported data collection methods were: surveys [13, 15, 16, 19, 20, 21, 24]; observations [13, 14, 16, 19, 21]; interviews and focus groups [13, 14, 18, 22, 24]; medical record and/or laboratory information system review [15, 18, 21, 23]; workshops [14, 20, 21]; and audits [19, 24] (Table 1). Of the 12 included studies, four reported receiving ethics approval and six reported receiving an ethics exemption. Only five reported using a reporting guideline [14, 18, 22, 23, 24], including three that reported using the Standards for Quality Improvement Reporting Excellence (SQUIRE) guidelines [14, 23, 24], one that reported using the Template for Intervention Description and Replication (TIDieR) guidelines [22], and another that reported using the Good Reporting of a Mixed Methods Study guidelines [18] (Table 1).
QI and IS components
Across the 12 included studies, 12 key QI methods/tools were utilised: Plan-Do-Study-Act (PDSA) cycles (n = 9), process mapping (n = 5), audit and feedback (A&F) (n = 5), QI champions (n = 4), fishbone/cause-and-effect diagrams (n = 2), Pareto charts (n = 1), force field analysis (n = 1), histograms (n = 1), impact effort matrices (n = 1), Lean Six Sigma (n = 1), control charts (n = 1) and run charts (n = 1) (Fig. 2, Table 1). Six IS theories and strategies were used: the Theoretical Domains Framework (TDF) (n = 5), the Behaviour Change Wheel (BCW) including Capability, Opportunity and Motivation (COM-B) (n = 5), the Consolidated Framework for Implementation Research (CFIR) (n = 3), the Interactive Systems Framework for Dissemination and Implementation (ISF) (n = 1), the Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM) framework (n = 1), and the Behaviour Change Technique Taxonomy (BCT) (n = 1) (Fig. 2, Table 1). The most commonly paired IS and QI methods were the BCW/COM-B, TDF and CFIR used with PDSA cycles, process mapping, and audit and feedback (Fig. 2).
QI and IS integration
The narrative synthesis identified that the process of QI and IS integration in the 12 studies typically followed one of two patterns: 1) IS theories/models were used to inform the initial development and design of the QI project (three studies [13, 17, 23]); or 2) IS theories/models were used to inform the modification of the QI initiative and its implementation through the identification of determinants (eight studies [13, 14, 17, 18, 20, 21, 22, 24]); with two studies following both patterns [13, 17] (Table 1). A concise synopsis of the integration of QI and IS tools/theories, developed from the narrative synthesis, can be seen in Table 1. These simplified steps highlight how QI and IS were utilised in each study.
The key QI and IS methods/tools and theories used across the 12 included studies (see Table 1) were categorised into the six inductively identified phases of QI and IS studies: the System diagnostic phase (process mapping, fishbone diagrams, Pareto charts, force field analysis, impact effort matrices, histograms, BCW/COM-B, TDF, CFIR); the Intervention design phase (BCW/COM-B, CFIR); the Implementation of intervention phase, which included intervention testing (PDSA) and embedded implementation strategies (audit and feedback, champions, BCW/COM-B, TDF, ISF, BCT); the Scale/spread phase (RE-AIM); Methodology (Lean Six Sigma); and Measurement tools (control charts and run charts). QI tools were used more in the System diagnostic, Intervention design and Implementation of intervention phases; however, these three phases also utilised IS tools (Fig. 3).
Discussion
This systematic review found 12 peer-reviewed studies that attempted to integrate QI and IS methods to implement a program in acute healthcare. The TDF, COM-B/BCW and CFIR were the most frequently used IS frameworks, and the PDSA cycle was the most frequently used QI tool. As highlighted in Table 1, QI and IS methods were used sequentially or in parallel with one another, in a stepwise process to inform each stage of the study; however, no studies combined the methodologies per se. The QI and IS methods/tools and theories were used in a distinct and independent manner across all of the included studies.
In addition to the 12 studies included in this review, the reasons for excluding studies during the full-text review may provide some insight into how QI and IS are being used in healthcare. Of the 149 studies that underwent full-text review, 65% (n = 97) were excluded because they described using QI or IS but did not provide explicit descriptions or evidence of the use of individual frameworks or tools. This emphasises the lack of consistent reporting and terminology within and between the QI and IS fields. This definitional problem has been highlighted previously in reviews and commentaries comparing and contrasting the two fields [26, 27]. For the 12 studies that were included, the use of research methodological standards was the exception rather than the rule (n = 5, 42%), which may also contribute to the lack of consistent terminology. Similarly, there was inconsistent use of reporting guidelines to support the presentation of findings. These findings advocate for greater use of the many guidelines currently available, such as SQUIRE [28], the Standards for Reporting Implementation Studies (StaRI) [29] or TIDieR [30], to enhance the rigour of QI and IS studies and to support more consistent terminology. Agreed upon and harmonised definitions in both fields regarding concepts such as context, determinants, frameworks, strategies, and interventions would allow methods and results to be more rigorously evaluated and learning to be shared [26].
Close to half of the studies at the full-text review stage were excluded (43%, n = 64/149) because they had used IS tools and theories only for “diagnostic” purposes; in other words, to understand the healthcare problem by identifying barriers and facilitators to implementation, rather than applying the findings to implement the intervention. An intervention applying these diagnostic findings may be reported in subsequent publications, but such publications were not identified by this review. This observation, that many studies use IS tools and theories solely for diagnostic purposes, aligns with previous findings from a systematic review on the use of the TDF to support healthcare clinician behaviour change: of the 60 studies in that review, just over half used the framework to identify barriers to implementation or to design interventions, but not to undertake the intervention [31]. The observation also links to one of the key findings of our study: that the use of QI and IS frameworks and tools differed across the phases of implementation. While both QI and IS were used in the System diagnostic and Intervention design phases, the Implementation phase tended to be dominated by PDSA cycles, a QI tool. More guidance may be required on using IS frameworks to integrate tools from QI into implementation and evaluation. A number of prominent authors have highlighted that more integration of PDSA tools into IS studies is warranted [26].
The choice of IS frameworks used in the 12 included studies may help to explain the variable application of IS in these studies. Of the 16 instances of IS frameworks used in the 12 included studies, 81% (n = 13/16) utilised the COM-B/BCW, TDF or CFIR. In Nilsen’s categorisation of IS theories, models and frameworks [26], these three are all used to assist with understanding or explaining what influences implementation outcomes. Unlike the Knowledge-to-Action framework [32], they do not describe and/or guide the process of translating research into practice. In other words, they provide frameworks for what to do, rather than a mechanism to test strategies and to respond or make changes. Greater guidance is needed to support the use of flexible IS methods and theories that can support rapid implementation of improvements within the context of a complex adaptive system such as healthcare [33].
Similarly, calls have been made to bring more theory into QI studies [34]. The results of our study bear this out: only three of the 12 studies used IS frameworks to inform the design phase. Designing interventions using both informal and formal theories supports the analysis and description of the rationale and assumptions about mechanisms of action, and the link between processes and outcomes [34]. In turn, these theories can inform an evaluation framework.
Overall, the review identified some integration of QI and IS across the design, system diagnostic and implementation phases; however, the domains of spread, reach and sustainability require further work. There was also minimal discussion of the impact of integrating QI and IS in the included studies.
Strengths and limitations
A strength of the review was the adherence to an international standard of systematic review methodology (PRISMA). Five databases were searched to maximise the opportunity for studies to be included. The reviewers were all experienced in the fields of IS and QI methods.
There are several limitations to this review. Firstly, the included IS studies tended to use the COM-B and CFIR frameworks; however, this was largely due to the use of those terms in the search string, which was not exhaustive. This was underpinned by an assumption that the term “implementation science” would yield studies using a broad range of frameworks. Future analyses using search terms reflecting other IS frameworks may be useful to extend these findings. Another limitation was that agreement between reviewers on which studies to include was variable. This reflected two issues: definitions for QI and IS studies are not harmonised, and studies may state that they fit under an IS or QI banner without explicitly describing the respective tools. To mitigate the low kappa scores, all disagreements were discussed as a team, and consensus was reached as to whether a manuscript would be included or excluded, and why. This review was also limited to studies set in a tertiary hospital setting and published since 2014, limiting comparison with other settings and with older literature. The review only included studies that clearly demonstrated and explained the QI and IS tools used, meaning that studies that did not clearly explain their use of QI or IS were excluded. The review also only included studies published in English.
Conclusion and implications for future research
QI and IS methodologies have been developed independently over time, but this review has identified studies where integration of the two approaches has been attempted. To encourage further integration of QI and IS, greater guidance is needed on the best approach to harmonising existing frameworks and using consistent terminology. These actions would help to move researchers beyond the diagnostic role often taken and encourage theory-informed action. There is a clear need for research guidance on how and when to select, justify, and integrate appropriate QI and IS methods and theory within healthcare studies, supported by greater use of reporting guidelines in QI and IS studies, to enhance the overall implementation and sustainability of improvement projects.
Data availability
The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.
Abbreviations
- BCT: Behaviour Change Technique Taxonomy
- BCW: Behaviour Change Wheel
- BUN: Blood Urea Nitrogen
- CFIR: Consolidated Framework for Implementation Research
- ED: Emergency Department
- HCPs: Healthcare professionals
- IS: Implementation Science
- ISF: Interactive Systems Framework for Dissemination and Implementation
- NICU: Neonatal Intensive Care Unit
- PDSA: Plan-Do-Study-Act
- PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
- QI: Quality Improvement
- QI-MQCS: QI Minimum Quality Criteria Set
- RE-AIM: Reach, Effectiveness, Adoption, Implementation, and Maintenance
- SQUIRE: Standards for Quality Improvement Reporting Excellence
- TIDieR: Template for Intervention Description and Replication
- TDF: Theoretical Domains Framework
- UK: United Kingdom
- USA: United States of America
References
1. Academy of Medical Royal Colleges. Quality improvement - Training for better outcomes. London: AoMRC; 2016.
2. Malone S, Newland J, Kudchadkar SR, Prewitt K, McKay V, Prusaczyk B, et al. Sustainability in pediatric hospitals: an exploration at the intersection of quality improvement and implementation science. Front Health Serv. 2022;2:1005802. https://doi.org/10.3389/frhs.2022.1005802.
3. Glasgow JM, Scott-Caziewell JR, Kaboli PJ. Guiding inpatient quality improvement: a systematic review of Lean and Six Sigma. Jt Comm J Qual Patient Saf. 2010;36(12):533-AP5. https://doi.org/10.1016/S1553-7250(10)36081-8.
4. Hibbert PD, Basedow M, Braithwaite J, Wiles LK, Clay-Williams R, Padbury R. How to sustainably build capacity in quality improvement within a healthcare organisation: a deep-dive, focused qualitative analysis. BMC Health Serv Res. 2021;21(1):588.
5. Eccles MP, Mittman BS. Welcome to Implementation Science. Implement Sci. 2006;1(1). https://doi.org/10.1186/1748-5908-1-1.
6. Barr J, Paulson SS, Kamdar B, Ervin JN, Lane-Fall M, Liu V, et al. The coming of age of implementation science and research in critical care medicine. Crit Care Med. 2021;49(8):1254–75. https://doi.org/10.1097/CCM.0000000000005131.
7. Check DK, Zullig LL, Davis MM, Davies L, Chambers D, Fleisher L, et al. Improvement science and implementation science in cancer care: identifying areas of synergy and opportunities for further integration. J Gen Intern Med. 2021;36:186–95. https://doi.org/10.1007/s11606-020-06138-w.
8. Birken SA, Powell BJ, Shea CM, Haines ER, Alexis Kirk M, Leeman J, et al. Criteria for selecting implementation science theories and frameworks: results from an international survey. Implement Sci. 2017;12:1–9. https://doi.org/10.1186/s13012-017-0656-y.
9. Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372. https://doi.org/10.1136/bmj.n71.
10. Veritas Health Innovation. Covidence systematic review software. Melbourne; 2024. Available at www.covidence.org.
11. Hempel S, Shekelle PG, Liu JL, Danz MS, Foy R, Lim Y, et al. Development of the Quality Improvement Minimum Quality Criteria Set (QI-MQCS): a tool for critical appraisal of quality improvement intervention publications. BMJ Qual Saf. 2015;24(12):796–804. https://doi.org/10.1136/bmjqs-2014-003151.
12. Popay J. Guidance on the Conduct of Narrative Synthesis in Systematic Reviews, in ESRC Methods Programme. Swindon: Economic and Social Research Council (ESRC); 2006.
13. Cummings MJ, Goldberg E, Mwaka S, Kabajaasi O, Vittinghoff E, Cattamanchi A, et al. A complex intervention to improve implementation of World Health Organization guidelines for diagnosis of severe illness in low-income settings: a quasi-experimental study from Uganda. Implement Sci. 2017;12(1):126. https://doi.org/10.1186/s13012-017-0654-0.
14. Duran AT, Keener-DeNoia A, Stavrolakes K, Fraser A, Blanco LV, Fleisch E, et al. Applying user-centered design and implementation science to the early-stage development of a telehealth-enhanced hybrid cardiac rehabilitation program: quality improvement study. JMIR Form Res. 2023;7:e47264. https://doi.org/10.2196/47264.
15. Farley H, Stepanek M, Aquino C, Whalen M. Creating a standardized post-fall debrief tool: a quality improvement project. J Nurs Care Qual. 2023;38(2):120–5. https://doi.org/10.1097/NCQ.0000000000000667.
16. Kallam B, Pettitt-Schieber C, Owen M, Agyare Asante R, Darko E, Ramaswamy R. Implementation science in low-resource settings: using the interactive systems framework to improve hand hygiene in a tertiary hospital in Ghana. Int J Qual Health Care. 2018;30(9):724–30. https://doi.org/10.1093/intqhc/mzy111.
17. Kingsley RA. A healthcare improvement initiative to increase multidisciplinary pain management referrals for youth with sickle cell disease. Pain Manag Nurs. 2020;21(5):403–9. https://doi.org/10.1016/j.pmn.2020.03.005.
18. Mathura P, Marini S, Hagtvedt R, Spalding K, Duhn L, Kassam N, et al. Factors of a physician quality improvement leadership coalition that influence physician behaviour: a mixed methods study. BMJ Open Quality. 2023;12(2):06. https://doi.org/10.1136/bmjoq-2022-002016.
19. Middleton K, Williams C, Bernard D, Gautham KS, Shivananda S. Designing behavioral interventions using the capability-opportunity-motivation-behavior model and the theoretical domains framework to optimize oxygen saturation maintenance by NICU providers. Can J Respir Ther. 2022;58:77–83. https://doi.org/10.29390/cjrt-2021-075.
20. Patel S, Pierce L, Jones M, Lai A, Cai M, Sharpe BA, et al. Using participatory design to engage physicians in the development of a provider-level performance dashboard and feedback system. Jt Comm J Qual Patient Saf. 2022;48(3):165–72. https://doi.org/10.1016/j.jcjq.2021.10.003.
21. Silva ESD, Primo CC, Gimbel S, Almeida MVS, Oliveira NS, Lima EFA. Elaboration and implementation of a protocol for the Golden Hour of premature newborns using an Implementation Science lens. Rev Lat Am Enfermagem. 2023;31:e3956. https://doi.org/10.1590/1518-8345.6627.3957.
22. Steinmo SH, Michie S, Fuller C, Stanley S, Stapleton C, Stone SP. Bridging the gap between pragmatic intervention design and theory: using behavioural science tools to modify an existing quality improvement programme to implement “Sepsis Six.” Implement Sci. 2016;11:14. https://doi.org/10.1186/s13012-016-0376-8.
23. Vanstone JR, Patel S, Degelman ML, Abubakari IW, McCann S, Parker R, et al. Development and implementation of a clinician report to reduce unnecessary urine drug screen testing in the ED: a quality improvement initiative. Emerg Med J. 2022;39(6):471–8. https://doi.org/10.1136/emermed-2020-210009.
24. Young HML, Jeurkar S, Churchward DR, Dungey M, Stensel DJ, Bishop NC, et al. Implementing a theory-based intradialytic exercise programme in practice: a quality improvement project. Clin Kidney J. 2018;11(6):832–40. https://doi.org/10.1093/ckj/sfy050.
25. Byrt T, Bishop J, Carlin JB. Bias, prevalence and kappa. J Clin Epidemiol. 1993;46(5):423–9.
26. Nilsen P, Thor J, Bender M, Leeman J, Andersson-Gäre B, Sevdalis N. Bridging the silos: a comparative analysis of implementation science and improvement science. Front Health Serv. 2022;1:817750. https://doi.org/10.3389/frhs.2021.817750.
27. Koczwara B, Stover AM, Davies L, Davis MM, Fleisher L, Ramanadhan S, et al. Harnessing the synergy between improvement science and implementation science in cancer: a call to action. Am Soc Clin Oncol. 2018;14(6):335–40.
28. Ogrinc G, Davies L, Goodman D, Batalden P, Davidoff F, Stevens D. SQUIRE 2.0 (Standards for QUality Improvement Reporting Excellence): revised publication guidelines from a detailed consensus process. J Contin Educ Nurs. 2015;46(11):501–7. https://doi.org/10.3928/00220124-20151020-02.
29. Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, et al. Standards for reporting implementation studies (StaRI) statement. BMJ. 2017;356. https://doi.org/10.1136/bmj.i6795.
30. Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348. https://doi.org/10.1136/bmj.g1687.
31. Dyson J, Cowdell F. How is the Theoretical Domains Framework applied in designing interventions to support healthcare practitioner behaviour change? A systematic review. Int J Qual Health Care. 2021;33(3):mzab106. https://doi.org/10.1093/intqhc/mzab106.
32. Graham ID, Logan J, Harrison MB, Straus SE, Tetroe J, Caswell W, et al. Lost in knowledge translation: time for a map? J Contin Educ Health Prof. 2006;26(1):13–24. https://doi.org/10.1002/chp.47.
33. Rapport F, Smith J, Hutchinson K, Clay-Williams R, Churruca K, Bierbaum M, et al. Too much theory and not enough practice? The challenge of implementation science application in healthcare practice. J Eval Clin Pract. 2022;28(6):991–1002. https://doi.org/10.1111/jep.13600.
34. Davidoff F, Dixon-Woods M, Leviton L, Michie S. Demystifying theory and its use in improvement. BMJ Qual Saf. 2015;24(3):228–38. https://doi.org/10.1136/bmjqs-2014-003627.
Acknowledgements
The authors would like to thank the librarian at Macquarie University who provided support regarding the syntax for the database searches.
Funding
This research was funded by The Flinders Foundation (MB).
Author information
Authors and Affiliations
Contributions
The study was designed by M.B., S.B., S.W., Z.F., and P.H. The database searches were conducted by M.B. Title and abstract reviews were conducted by M.B., S.B., S.W., Z.F., S.H., L.A.E., and P.H. Full text reviews were conducted by M.B., S.B., S.W., Z.F., S.H., and P.H. Discussion and analysis of the study findings were conducted by M.B., S.B., S.W., Z.F., S.H., L.A.E., A.G., R.P., and P.H. The manuscript was drafted by M.B., S.B., S.W., Z.F., and P.H. All authors reviewed and amended drafts of the manuscript and approved the final manuscript (M.B., S.B., S.W., Z.F., S.H., L.A.E., A.G., R.P., and P.H.).
Corresponding author
Ethics declarations
Ethics approval and consent to participate
None. Ethical approval to conduct the systematic review was not required.
Consent for publication
Not applicable.
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary Information
Supplementary Material 3. Explanation and examples of the analysis for Fig. 3.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Bierbaum, M., Best, S., Williams, S. et al. The integration of quality improvement and implementation science methods and frameworks in healthcare: a systematic review. BMC Health Serv Res 25, 558 (2025). https://doi.org/10.1186/s12913-025-12730-9
DOI: https://doi.org/10.1186/s12913-025-12730-9