
Reporting on the Strategies Needed to Implement Proven Interventions: An Example From a “Real-World” Cross-Setting Implementation Study

Open Access | Published: April 22, 2016 | DOI: https://doi.org/10.1016/j.mayocp.2016.03.014

      Abstract

      The objective of this study was to empirically demonstrate the use of a new framework for describing the strategies used to implement quality improvement interventions and provide an example that others may follow. Implementation strategies are the specific approaches, methods, structures, and resources used to introduce and encourage uptake of a given intervention's components. Such strategies have not been regularly reported in descriptions of interventions' effectiveness, or in assessments of how proven interventions are implemented in new settings. This lack of reporting may hinder efforts to successfully translate effective interventions into “real-world” practice. A recently published framework was designed to standardize reporting on implementation strategies in the implementation science literature. We applied this framework to describe the strategies used to implement a single intervention in its original commercial care setting, and when implemented in community health centers from September 2010 through May 2015. Per this framework, the target (clinic staff) and outcome (prescribing rates) remained the same across settings; the actor, action, temporality, and dose were adapted to fit local context. The framework proved helpful in articulating which of the implementation strategies were kept constant and which were tailored to fit diverse settings, and simplified our reporting of their effects. Researchers should consider consistently reporting this information, which could be crucial to the success or failure of implementing proven interventions effectively across diverse care settings.

Trial Registration: ClinicalTrials.gov Identifier NCT02299791

      Abbreviations and Acronyms:

ALL (a system-level QI intervention designed to increase the percentage of patients with diabetes appropriately prescribed cardioprotective medications—Aspirin, Lovastatin (any statin), and Lisinopril (any angiotensin-converting enzyme inhibitor/angiotensin receptor blocker)), CHC (community health center), KP (Kaiser Permanente), QI (quality improvement)
Implementation science involves “methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice … to improve the quality and effectiveness of health services. It includes the study of influences on healthcare professional and organisational behaviour.”[1]
Such inquiry can involve assessing which approaches to implementation are most effective in different settings. These approaches, often called “implementation strategies,” have been defined as “methods or techniques used to enhance the adoption, implementation, and sustainability of a clinical program or practice … the specific means or methods for adopting … interventions.”[2]
      These aspects of implementation are typically underreported. This article empirically demonstrates the value of reporting on implementation strategies applied in a cross-setting implementation study, using a recently proposed reporting framework.
Reporting on implementation research commonly addresses how intervention components (intervention elements considered key to impacting outcomes in their setting of origin; eg, scripted outreach calls, automated electronic health record–based alerts, and dedicated staff time for patient follow-up) are implemented in new settings. Such reporting illuminates how interventions can be adapted in such settings while still achieving targeted effects. The implementation strategies used to support adoption of the intervention components also likely affect an intervention's success in new settings, yet these strategies are less commonly reported, and adaptations made to them are rarely mentioned.[3-8]

Successful cross-setting implementation of effective interventions likely requires consideration of both intervention components and implementation strategies,[2,6] so lack of reporting on how implementation strategies were reproduced or adapted in new settings creates a barrier to future implementation.[7,8]

One reason why implementation strategies may be underreported is that implementation science has no widely accepted taxonomy for differentiating intervention components from implementation strategies, and until recently lacked specific guidelines for reporting on implementation strategies. To address these gaps, Proctor et al[2] proposed standards for reporting on implementation strategies. Their framework lists 7 reportable domains of implementation strategies: actor, action, target of the action, temporality, dose, outcomes affected, and justification (Table 1). The authors define implementation strategies as a distinct group of factors to be recognized and reported but note that some factors could be defined as either intervention components or implementation strategies, which complicates reporting.
Table 1. Proctor et al's 7 Domains of an Implementation Strategy[2]
Domain | Explanation | Measurement
Actor | Who delivers the strategy | Qualitative
Action | Steps to be taken to carry out the strategy | Qualitative
Target of the action | Who/what the actors are attempting to impact, based on conceptual models of implementation; multiple targets possible | Qualitative
Temporality | When does the strategy take place; what is the order of the strategies | Quantitative or qualitative
Dose | Frequency and intensity | Quantitative or qualitative
Outcomes affected | What will the strategy change | Quantitative or qualitative
Justification | Basis for the strategy in research or practice | Qualitative
      We applied Proctor et al's framework to report on the strategies used to implement a diabetes quality improvement (QI) intervention, proven effective in an integrated care system, in 11 primary care community health centers (CHCs). Our goal was to demonstrate the framework's utility for reporting practical information on factors needed to implement a proven intervention into a new setting, and provide a concrete example of such reporting. Some elements in this example could be considered either intervention components or implementation strategies, such as the automated alerts. For illustrative purposes, we define intervention components as the tools provided to the CHCs and implementation strategies as the methods used to support the uptake of these tools.
Proctor et al suggest that implementation strategies may be described at varying levels of granularity.[9] We demonstrate how we applied Proctor et al's framework to the overarching, multifaceted implementation strategies used in both settings; we also demonstrate use of the framework at a more granular level, by applying it to the discrete implementation elements within the overarching practice facilitation strategy used in the CHCs.
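To make these two levels of description concrete, the illustrative sketch below encodes the seven framework domains as a simple record and compares the overarching KP and CHC strategies domain by domain. The field values are paraphrased from Table 3; the record type and comparison helper are hypothetical conveniences for illustration, not anything used in the study.

```python
from dataclasses import dataclass, fields

@dataclass
class StrategyReport:
    """One implementation strategy, described per Proctor et al's 7 domains."""
    actor: str
    action: str
    target: str
    temporality: str
    dose: str
    outcomes_affected: str
    justification: str

# Overarching strategies, condensed and paraphrased from Table 3 (illustrative only).
kp_top_down = StrategyReport(
    actor="Regional health plan leadership; regional ALL champions with protected time",
    action="Top-down directive: ALL prescribing set as the standard of care; staff incentives tied to performance",
    target="Providers prescribe ALL medications for patients who meet criteria",
    temporality="One-time rollout, then ongoing monitoring and incentives",
    dose="One-time directive",
    outcomes_affected="Appropriate ALL prescribing; improved diabetes care quality",
    justification="Existing communication and incentive structures supported directives",
)

chc_practice_facilitation = StrategyReport(
    actor="Clinic ALL champions; onsite practice facilitators; study research staff",
    action="Practice facilitation: training, technical assistance, patient lists, feedback reports, staff engagement",
    target="Providers prescribe ALL medications for patients who meet criteria",
    temporality="3-4 years of postimplementation facilitation and support",
    dose="Ongoing intensive practice facilitation",
    outcomes_affected="Appropriate ALL prescribing; improved diabetes care quality",
    justification="Practice facilitation literature supports this approach in lower-resource settings",
)

def changed_domains(a: StrategyReport, b: StrategyReport) -> list[str]:
    """Return the framework domains whose descriptions differ between settings."""
    return [f.name for f in fields(StrategyReport) if getattr(a, f.name) != getattr(b, f.name)]

print(changed_domains(kp_top_down, chc_practice_facilitation))
# -> ['actor', 'action', 'temporality', 'dose', 'justification']; target and outcomes unchanged
```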

      The Intervention: Kaiser Permanente's ALL Initiative

The ALL initiative is a system-level QI intervention designed to increase the percentage of patients with diabetes appropriately prescribed cardioprotective medications—Aspirin, Lovastatin (any statin), and Lisinopril (any angiotensin-converting enzyme inhibitor/angiotensin receptor blocker). The ALL initiative was implemented at Kaiser Permanente (KP) on the basis of compelling evidence for these medicines' effectiveness.[10-12]
Adult KP patients who took the ALL medications had notably reduced risks of cardiovascular disease hospitalization; overall rates of myocardial infarction also declined substantially.[13]
The strong underlying evidence, and the considerable impact of the ALL initiative at KP, indicated the potential benefit of implementing the ALL initiative in CHCs.
KP's intervention components were electronic health record–based tools designed to streamline both identification of patients missing indicated medications and prescribing of those medications (Table 2). KP's implementation strategies (Table 3) were not reported in formal publications; our understanding of these strategies was gained through extensive communication with KP leadership. The strategies were selected because they harnessed existing infrastructure.[12]
Table 2. Summary of ALL Intervention Components in KP and as Adapted for CHCs
Component | Purpose | At KP | At CHCs
Automated EHR point-of-care alerts | Support real-time identification of patients indicated for but not prescribed ALL medication(s) | Added to KP tool that identifies multiple care gaps | Separate EHR alert for this “care gap” only
EHR registries | Support identification of patients indicated for but not prescribed ALL medication(s), to facilitate outreach | In patient panel tool that identifies ALL-indicated patients in addition to other care gaps | Stand-alone ALL-specific rosters
EHR order sets | Facilitate prescribing ALL medication(s) | One-click preprogrammed prescription order sets | Order sets with commonly prescribed dosages/medications
Patient education materials | Increase patient knowledge about, adherence to ALL medications | No | Examination room posters, patient handouts in 3 languages
Patient adherence tracking and outreach | Improve patient adherence to prescribed medication(s) | Reminder calls to patients, if prescriptions were not picked up | No standardized adherence tracking; outreach varied between clinics
ACE = angiotensin-converting enzyme; ALL = a system-level QI intervention designed to increase the percentage of patients with diabetes appropriately prescribed cardioprotective medications—Aspirin, Lovastatin (any statin), and Lisinopril (any ACE inhibitor/angiotensin receptor blocker); EHR = electronic health record; KP = Kaiser Permanente; QI = quality improvement.
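For illustration only, the sketch below shows the kind of care-gap check that such a point-of-care alert or registry query might perform: given a patient with diabetes, it flags any ALL medication class (aspirin, any statin, any ACE inhibitor/ARB) without an active prescription. The medication lists, field names, and indication criteria here are simplified, hypothetical placeholders, not KP's or the CHCs' actual EHR rules.

```python
# Hypothetical sketch of an ALL "care gap" check; not the actual KP/CHC EHR logic.
ALL_CLASSES = {
    "aspirin": {"aspirin"},
    "statin": {"lovastatin", "simvastatin", "atorvastatin", "pravastatin", "rosuvastatin"},
    "ace_inhibitor_or_arb": {"lisinopril", "enalapril", "ramipril", "losartan", "valsartan"},
}

def missing_all_classes(has_diabetes: bool, active_medications: list[str]) -> list[str]:
    """Return the ALL medication classes with no active prescription for an indicated patient.

    A real implementation would also apply exclusions (eg, allergies, contraindications,
    age criteria), which are omitted in this simplified sketch.
    """
    if not has_diabetes:
        return []  # the alert applies only to patients with diabetes in this sketch
    meds = {m.lower() for m in active_medications}
    return [cls for cls, drugs in ALL_CLASSES.items() if not (meds & drugs)]

# Example: a patient with diabetes on a statin only would be flagged for the other two classes.
print(missing_all_classes(True, ["Atorvastatin"]))
# -> ['aspirin', 'ace_inhibitor_or_arb']
```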
Table 3. Implementation Strategies Used, per Proctor et al's Framework
Framework domain | In KP (overarching strategy: top-down) | In CHCs (overarching strategy: practice facilitation)
Actor | National/regional health plan leadership, and regional ALL “champions” identified to encourage local uptake, with protected time to do so | Clinic/service organization ALL “champions” identified to encourage local uptake; site coordinators/practice facilitators; study research staff
Action | Champions receive protected time. Organizational structure supports top-down practice change directives; regional directives say such prescribing is the expected standard of care. Providers informed of new policies, expectations; oriented to ALL and its underlying evidence at department meetings, and through other existing mechanisms in place to support communication related to such directives. Adherence incentivized by linking staff incentives to performance, enabled by existing reimbursement structures; augmented with quarterly performance reports on ALL prescribing | Staff oriented to ALL/the underlying evidence at department meetings. Encourage uptake by providing intensive support: onsite study staff provided practice facilitation; trained on intervention components, underlying evidence; implementation oversight; technical assistance; created lists of indicated patients for individual providers and monthly performance reports. Intensive staff engagement: clinic staff asked for feedback on intervention tools and how they fit in workflows; tools adapted on the basis of feedback; monthly meetings between study team and clinic staff
Target of the action (this domain was unchanged across settings) | Change prescribing for indicated patients: providers to prescribe ALL medications for patients who meet criteria | Same as in KP
Temporality | One-time rollout; ongoing monitoring and incentivizing | 3-4 y of postimplementation practice facilitation and support
Dose | One-time directive | Ongoing intensive practice facilitation
Outcomes affected (this domain was unchanged across settings) | Appropriate prescribing of ALL medications to indicated patients; goal is improvement in diabetes care quality | Same as in KP
Justification | KP used existing communication mechanisms to encourage uptake of ALL practice changes | Practice facilitation literature supported this approach in diverse organizational settings with fewer resources
ACE = angiotensin-converting enzyme; ALL = a system-level QI intervention designed to increase the percentage of patients with diabetes appropriately prescribed cardioprotective medications—Aspirin, Lovastatin (any statin), and Lisinopril (any ACE inhibitor/angiotensin receptor blocker); CHC = community health center; KP = Kaiser Permanente; QI = quality improvement.

      Research Into Practice: Implementing the ALL Intervention in CHCs

In Portland, Oregon, 11 CHCs participated in a randomized trial testing the feasibility of implementing the ALL intervention in the CHC setting (ClinicalTrials.gov Identifier NCT02299791; NHLBI 1R18HL095481). In Table 2, we show how the intervention components from KP's ALL initiative (defined as the specific intervention tools) were adapted when implemented in the CHCs.[14,15]
In brief, both settings received electronic health record–based alerts, registries, and order sets, all of which were adapted somewhat to fit local resources. At KP, the tools supported outreach to enhance patient adherence; in the CHCs, they included patient education materials. We showed a substantial increase in guideline-concordant prescribing in the CHCs, indicating that the intervention was successfully implemented; results are reported elsewhere.[14]

      Using Proctor et al's Framework to Report on Implementation Strategies

Adaptations made to the implementation strategies as used in the CHCs are presented in Table 3. We applied the domains of Proctor et al's framework to the KP and CHC implementation strategies during our study analyses to refine our understanding of how implementation strategies differed across sites, and how to report on these differences. Table 3 outlines how we applied the framework to the multifaceted, overarching implementation strategies used at KP and the CHCs, describing the specific components within these larger approaches. Overall, KP used a top-down strategy; the CHCs used a practice facilitation strategy. The affected target (clinic staff) and outcome (prescribing rates) were the same in both settings. Differences between KP and the CHCs in resources and organizational structure, however, necessitated adaptations to the strategies' actor, action, temporality, and dose. These adaptations, and their justifications, are described below. To further demonstrate potential uses of this framework, Table 4 presents how it could be applied at a more granular level, to describe the discrete elements within the CHCs' implementation strategy in more detail.
Table 4. Application of Proctor et al's Reporting Framework to the Specific Elements Within the Overarching Implementation Strategy Used in the CHCs
Each element of the CHCs' practice facilitation strategy is described by the framework domains (Actor, Action, Target of the action, Temporality, Dose, Outcomes affected, Justification). Where a domain is not repeated for a row, it carries over from the preceding row of the same element (merged cell in the original table).

Element: Engagement of clinic leadership during preimplementation planning process
• Actor: Study team. Action: Identify clinic champions—MDs interested in quality improvement, diabetes care; often in leadership role. Target: Build ownership and acceptance of the intervention among clinic leadership; prepare site for implementation. Temporality: Presubmission of proposal. Dose: One time. Outcomes affected: Improved staff trust, understanding, uptake of intervention. Justification: Structural, staff engagement, culture (CFIR).
• Actor: Study team/clinic champions/clinic leadership. Action: Design implementation process. Temporality: Preimplementation. Dose: Ongoing discussions, first 9 mo of study. Justification: Design quality and packaging, planning, engaging (CFIR).
• Actor: Clinic leadership. Action: Hire practice facilitator—current clinic staff with interest in quality improvement, diabetes care (final selection: nurse, panel managers, quality improvement specialist). Temporality/Dose: One time within first 9 mo of study. Justification: Networks and communication (CFIR).
• Actor: Study team. Action: Train clinic champions and practice facilitators. Temporality/Dose: Multiple informal trainings, and information provided as requested. Outcomes affected: Enable peer-to-peer training and coaching. Justification: Knowledge and beliefs, self-efficacy (CFIR).

Element: Communication of organizational support for the intervention
• Actor: Clinic champion. Action: Communicate expectations of behavior change related to the intervention. Target: Build knowledge and acceptance of the intervention among clinic staff. Temporality: Explicitly at start of implementation, then as needed. Dose: 1-h meeting at each clinic, then informally as needed. Outcomes affected: Improved staff trust, understanding, uptake of intervention. Justification: Structural, networks and communication, culture (CFIR).
• Action: Share evidence underlying intervention with colleagues/other clinic staff. Temporality: Annually at start of implementation years 1 and 2. Dose: 1-h meeting at each clinic. Justification: Evidence strength and quality, engaging, relative advantage (CFIR).

Element: Provision of intensive implementation support (various combinations of the listed actions were applied in different sites; this was intentional, to allow flexibility to meet the needs of each site)
• Actor: Practice facilitators. Action: Provide formal clinicwide staff training on intervention components and underlying evidence (often in conjunction with clinic champion). Target: Build knowledge and acceptance of the intervention among clinic staff. Temporality: Annually at start of implementation years 1 and 2. Dose: 1-h meeting at each clinic. Outcomes affected: Improved staff trust, understanding, uptake of intervention. Justification: Evidence strength and quality, relative advantage, staff engagement, knowledge and beliefs, self-efficacy (CFIR).
• Action: Lead care team–based trainings with a focus on details of the intervention tools and the implications for clinic workflows. Target: Facilitate use of the intervention tools by clinic staff within varied care team workflows. Temporality: Annually at start of implementation years 1 and 2. Dose: Half hour with each team. Outcomes affected: Improved ability to use the intervention tools in existing workflows; improved staff trust and use of the tools. Justification: Structural; staff engagement; knowledge and beliefs; self-efficacy (CFIR).
• Action: Lead clinic staff in Plan-Do-Study-Act cycles related to use of intervention tools in clinic workflow. Temporality: Iteratively throughout implementation years 1-3. Dose: As needed/requested. Justification: Adaptability, trialability (CFIR).
• Action: Be the go-to person for intervention assistance—available onsite to answer questions, provide technical assistance. Temporality: As needed; 4 y postimplementation, first wave of clinics; 3 y, second wave. Dose: As needed/requested. Justification: Networks and communication, knowledge and belief (CFIR).
• Action: Check in with clinic staff to ask about problems with, concerns about use of the intervention tools. Temporality/Dose: Regularly; variability in dose by site. Justification: Individual stage of change, knowledge and beliefs, staff engagement (CFIR).
• Action: Share any barriers to uptake of the tools/potential fixes with study team. Temporality: Monthly. Dose: 1-h meeting. Outcomes affected: Improved intervention tools.
• Action: Use reporting tools to create and provide lists of target patients to individual providers. Target: Give clinic providers knowledge of which of their patients lacked an indicated prescription. Temporality: 4 y postimplementation, first wave of clinics; 3 y, second wave. Dose: Varied by site; ranged from every 6 wk to 1 time over course of study. Outcomes affected: Appropriately prescribe for identified patients. Justification: Reflecting and evaluating, executing (CFIR); audit and feedback.[16,17]
• Action: Use reporting tools to create and provide panel-level monthly performance metrics to individual providers. Target: Give clinic providers information about care gaps on their panel. Outcomes affected: Investigate care gaps, leading to appropriate prescribing.

Element: Ongoing engagement of clinic staff
• Actor: Study team. Action: Provide forum for clinic leadership/staff feedback on intervention tools, and their fit in workflows. Target: Give clinic staff an opportunity to ask for any needed changes to the intervention tools. Temporality: Monthly for 4 y postimplementation, first wave of clinics; 3 y, second wave. Dose: 1-h meeting. Outcomes affected: Enhance staff trust in and use of the tools. Justification: Reflecting and evaluating, adaptability, engaging (CFIR).
• Action: Iterate/update the intervention tools as requested by clinic staff, as possible. Target: Support use of intervention tools by entire care team. Temporality: Monthly throughout implementation year 1. Dose: Several minor adaptations; one major adaptation made at end of implementation year 1. Outcomes affected: Enhance staff trust in and use of the tools. Justification: Reflecting and evaluating, intervention source, adaptability (CFIR).
• Action: Provide clinic-level monthly performance metrics to clinic leadership. Target: Give clinic leadership feedback on uptake/impact of the intervention. Temporality: Monthly for 4 y postimplementation, first wave of clinics; 3 y, second wave. Dose: 1 report per clinic. Outcomes affected: Sustain organizational support for the intervention. Justification: Reflecting and evaluating, executing (CFIR); audit and feedback.[16,17]
• Action: Share updates on intervention and relevant clinical evidence. Target: Give clinic staff a refresher on the intervention, the adaptations made recently, and its impact. Temporality: Annually, study years 2-3. Dose: 1-h meeting. Outcomes affected: Improved staff trust, understanding, uptake of intervention. Justification: Knowledge, staff engagement (CFIR).

CFIR = Consolidated Framework for Implementation Research[4]; CHC = community health center; MD = doctor of medicine.
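The audit-and-feedback elements above center on one outcome metric: the share of indicated patients on a provider's panel with guideline-concordant ALL prescribing. The sketch below illustrates how such a panel-level rate might be computed for a monthly report; the record layout and field names are hypothetical simplifications, not the study's actual reporting tool.

```python
# Hypothetical sketch of a panel-level ALL prescribing rate for audit-and-feedback reports.
from collections import defaultdict

# Each record: (provider_id, patient_id, indicated_for_all, prescribed_all_three_classes)
panel_records = [
    ("provider_a", "p1", True, True),
    ("provider_a", "p2", True, False),
    ("provider_a", "p3", False, False),  # not indicated; excluded from the denominator
    ("provider_b", "p4", True, True),
]

def monthly_prescribing_rates(records):
    """Per provider: indicated patients with all three ALL classes prescribed / indicated patients."""
    indicated = defaultdict(int)
    concordant = defaultdict(int)
    for provider, _patient, is_indicated, has_all_three in records:
        if is_indicated:
            indicated[provider] += 1
            if has_all_three:
                concordant[provider] += 1
    return {p: concordant[p] / indicated[p] for p in indicated}

print(monthly_prescribing_rates(panel_records))
# -> {'provider_a': 0.5, 'provider_b': 1.0}
```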

      Differences in Main Actor, Action

      In KP's top-down strategy, the main actor was regional health plan leadership and the main action was identifying ALL as KP's standard of care and offering provider incentives for appropriately prescribing the ALL medications; in brief, provider bonuses were tied to performance on a number of quality measures, including those targeted by ALL. In contrast, the practice facilitation strategy in the CHCs emphasized staff engagement, practice facilitation, and direct support. The CHCs chose current clinic employees (eg, nurses, panel managers, and QI specialists) to be practice facilitators; the study paid for their time. These facilitators provided an on-the-ground link between clinic staff and study team, trained other staff on the evidence behind ALL and the intervention tools, tested the tools, oversaw implementation, and solicited staff feedback. Intervention components were adapted and refined throughout the first implementation year (eg, tailoring training materials) on the basis of this feedback. Thus, in the CHCs' implementation of ALL, the main actor was the onsite practice facilitator and the main action involved providing information, practical tools, encouragement, hands-on assistance, ongoing support, and actively seeking feedback. The research team's presence in the clinics (for study meetings and qualitative data collection) provided another forum for staff interaction; thus, the researchers were an additional actor in this implementation. In some of the CHCs, these actions/actors were augmented by the concurrent introduction of changes to the clinic's diabetes standard of care—an additional action wherein changes targeted by the intervention were presented to staff as part of this new standard.

      Differences in Temporality, Dose

      Temporality and dose differed between the 2 settings. At KP, the intervention tools were designed by KP leadership, and then broadly implemented, followed by ongoing feedback reporting and incentives. In addition to a one-time directive regarding providers' prescribing practices, each KP region identified clinician “champions” to encourage uptake of QI initiatives, including ALL, and protected champions' time for related activities. Kaiser Permanente then monitored providers' adherence to the new practices as part of its ongoing quality assessment processes.
      In the CHCs, the first step involved staff engagement, followed by implementation, then ongoing follow-up and feedback reporting. The CHCs identified clinician champions at each organization, and the research grant paid for 5% of their time during the 5-year study. Unlike KP, however, the practice facilitators provided additional intensive staff engagement and support throughout the intervention's implementation and follow-up processes.

      Differences in Justification

In both settings, the overarching justification for the chosen implementation strategy was its fit within each organization's culture and capacity. Strategies used at KP to direct and incentivize uptake of the ALL initiative harnessed KP's resources, communication mechanisms, and leadership structures. At the CHCs, local context was assessed a priori on the basis of insider knowledge (the study team included CHC staff) and initial findings from a qualitative process evaluation.[15] The CHCs' organizational structure emphasized collaborative processes and provider autonomy, and they lacked the resources to provide financial incentives; thus, practice facilitation and clinic staff engagement were a better fit in the CHCs.

      Discussion

This article is one of the first[18] to demonstrate the application of Proctor et al's framework for reporting on strategies used to implement an intervention across care settings. This framework is the first to explicitly establish implementation strategies as a distinct group of factors to be recognized and reported. In doing so, it builds on earlier efforts to advance implementation science,[5,19] and on previous work to guide reporting on practice change/QI efforts, such as the Standards for QUality Improvement Reporting Excellence (SQUIRE) guidelines[20,21] and the Workgroup for Intervention Development and Evaluation Research (WIDER) recommendations for reporting on behavior change interventions.[22]
Proctor et al's framework helped organize our description of how implementation strategy elements at KP were adapted in the CHCs. This helped us differentiate between the strategies and articulate which were modified, improving our understanding of their effects. For example, KP's culture and resources enabled establishing care guidelines and financially rewarding providers who met them (per Proctor et al's framework, the main actor and action). The CHCs, however, emphasized personal engagement, reflecting their collaborative approach to practice change, and more limited fiscal capabilities. Specifying the justification for the two different approaches, and examining our findings in light of that specification, helped us understand the characteristics of the CHC practice facilitators (actors) that were most effective (ie, they were trusted by clinic staff, and received dedicated time for their work in this role) and helped explain some of the diversity in results by site. Members of the research team, often present in the clinics, provided another opportunity for engagement (secondary actors). Such intensive person-to-person engagement (action) is likely impractical outside of a research context, particularly in underresourced settings.
We encountered some challenges in applying the framework. The implementation strategy's components did not always fit neatly within the framework's domains. For example, given the strategy's multifaceted, deliberately flexible process, it could be challenging to determine the main drivers of change (eg, was the main actor the practice facilitator, and how is the role of research team involvement best described?). In addition, we used the framework to guide our description of the overarching, multifaceted implementation strategies used at KP and the CHCs; for example, we define the “action” of the CHCs' overarching strategy as the provision of support and resources. This demonstrated the essential cohesiveness of the overall implementation approach, and supported brevity, but meant that potentially important details were omitted. Thus, we also present Table 4 to demonstrate how the framework domains (actor, action, etc) could be applied to each specific element within these implementation strategies. Future users of the framework will need to determine, case by case, what level of granularity to report on, keeping in mind that more granular reporting is needed to serve the field.
Similar challenges may be faced by others attempting to report on implementation strategies, which often include multiple components within an overarching strategy. We suggest that authors explicitly state the level of granularity at which they chose to apply Proctor et al's framework (the overarching approach, or discrete components within that approach). More granular reporting would make it possible to justify the choices underlying each component of a multifaceted implementation strategy, and to describe the impact of each component on its targeted outcome. Future iterations of the framework could provide further guidance about how to clearly differentiate between intervention components and implementation strategies, and how to indicate whether the framework is applied to an overarching strategy or to its component elements.

      The Importance of Reporting on Implementation Strategies

Consistent reporting on implementation strategies, including details about which strategies contribute to an intervention's success and how they can be adapted for diverse settings, should be encouraged.[8]

Proctor et al's framework for reporting could help ensure that interventions proven effective in controlled research settings can be successfully implemented in real-world practice. Standardized reporting may be particularly important for interventions that allow for flexibility in implementation, as is often necessary to meet local needs.[23]
      Furthermore, “real-world” clinicians seeking to replicate effective interventions need evidence about which intervention components are critical, and which strategies may best support effective implementation in new settings. If specific strategies are essential to such implementation, failure to report on them means they may not be applied in future work. Restrictions on manuscript length may inhibit such reporting; journal editors could address this by requiring reporting on implementation strategies, or relaxing length restrictions for articles that include such reports.
Careful specification when reporting on implementation strategies should be encouraged, both to support replication of proven implementation strategies and to build a body of research comparing the effectiveness of specific strategies, including through meta-analysis. This should involve authors clearly naming the discrete or component implementation strategies used, ideally using established definitions such as those in Proctor et al's framework. Although there are challenges to doing so, as noted above, such standardization would greatly serve the field of implementation science.

      Conclusion

An important barrier to the effective cross-setting implementation of successful interventions is a lack of knowledge about how best to conduct context-specific implementation. Proctor et al's framework[2] provides guidelines that could improve how implementation strategies are documented. This, in turn, could address barriers to the dissemination of effective interventions and help “real-world” practices implement them. This article illustrates the value of this framework in reporting on context-specific adaptations made to implementation strategies.

      Acknowledgments

      We gratefully acknowledge the OCHIN practice-based research network health centers. We also acknowledge the contributions of Lisa Fox and Wiley Chan.


      References

1. Eccles MP, Mittman BS. Welcome to implementation science. Implement Sci. 2006;1:1.
2. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139.
3. Stange KC, Glasgow RE. Considering and Reporting Important Contextual Factors in Research on the Patient-Centered Medical Home. AHRQ Publication No. 13-0045-EF. Rockville, MD: Agency for Healthcare Research and Quality; 2013.
4. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.
5. Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Adm Policy Ment Health. 2009;36:24-34.
6. Slaughter SE, Hill JN, Snelgrove-Clarke E. What is the extent and quality of documentation and reporting of fidelity to implementation strategies: a scoping review. Implement Sci. 2015;10:129.
7. Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21.
8. Powell BJ, Beidas RS, Lewis CC, et al. Methods to improve the selection and tailoring of implementation strategies [published online August 21, 2015]. J Behav Health Serv Res.
9. Powell BJ, McMillen JC, Proctor EK, et al. A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res Rev. 2012;69:123-157.
10. Wong W, Jaffe M, Wong M, Dudl RJ. Community implementation and translation of Kaiser Permanente's cardiovascular disease risk-reduction strategy. Perm J. 2011;15:36-41.
11. Pettay HS, Branthaver B, Cristobal K, Wong M. The Care Management Institute: harvesting innovation, maximizing transfer. Perm J. 2005;9:37-39.
12. Dudl RJ, Wang MC, Wong M, Bellows J. Preventing myocardial infarction and stroke with a simplified bundle of cardioprotective medications. Am J Manag Care. 2009;15:e88-e94.
13. Yeh RW, Sidney S, Chandra M, Sorel M, Selby JV, Go AS. Population trends in the incidence and outcomes of acute myocardial infarction. N Engl J Med. 2010;362:2155-2165.
14. Gold R, Nelson C, Cowburn S, et al. Feasibility and impact of implementing a private care system's diabetes quality improvement intervention in the safety net: a cluster-randomized trial. Implement Sci. 2015;10:83.
15. Bunce AE, Gold R, Davis JV, et al. Ethnographic process evaluation in primary care: explaining the complexity of implementation. BMC Health Serv Res. 2014;14:607.
16. Ivers N, Jamtvedt G, Flottorp S, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;6:CD000259.
17. Ivers NM, Grimshaw JM, Jamtvedt G, et al. Growing literature, stagnant science? Systematic review, meta-regression and cumulative analysis of audit and feedback interventions in health care. J Gen Intern Med. 2014;29:1534-1541.
18. Bunger AC, Hanson RF, Doogan NJ, Powell BJ, Cao Y, Dunn J. Can learning collaboratives support implementation by rewiring professional networks? Adm Policy Ment Health. 2016;43:79-92.
19. Proctor E, Carpenter C, Brown CH, et al. Advancing the science of dissemination and implementation: three “6th NIH Meetings” on training, measures, and methods. Implement Sci. 2015;10:A13.
20. Ogrinc G, Davies L, Goodman D, et al. SQUIRE 2.0 (Standards for QUality Improvement Reporting Excellence): revised publication guidelines from a detailed consensus process. J Contin Educ Nurs. 2015;46:501-507.
21. Ogrinc G, Mooney SE, Estrada C, et al. The SQUIRE (Standards for QUality Improvement Reporting Excellence) guidelines for quality improvement reporting: explanation and elaboration. Qual Saf Health Care. 2008;17:i13-i32.
22. Albrecht L, Archibald M, Arseneau D, Scott SD. Development of a checklist to assess the quality of reporting of knowledge translation interventions using the Workgroup for Intervention Development and Evaluation Research (WIDER) recommendations. Implement Sci. 2013;8:52.
23. Simpson KM, Porter K, McConnell ES, et al. Tool for evaluating research implementation challenges: a sense-making protocol for addressing implementation challenges in complex research settings. Implement Sci. 2013;8:2.