Evaluate Implementation

Implementation Science Frameworks Facilitate Evaluation

Implementation science frameworks can be used to identify what to assess and how to assess it. Two widely used evaluation models are:

  • The Proctor model (see Proctor et al 2011), which includes:
    • Implementation Outcomes (acceptability, adoption, appropriateness, costs, feasibility, fidelity, penetration, sustainability)
    • Service Outcomes (efficiency, safety, effectiveness, equity, patient-centeredness, timeliness)
    • Client Outcomes (satisfaction, function, symptomatology)
  • Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM, see Glasgow et al 2019)

For example, Manalili and colleagues (2020) used RE-AIM to create a quantitative data collection and evaluation plan.

Stover and colleagues (2020) compared how implementation was evaluated across four case studies of patient-reported measures used in routine care settings. They provide example measures of acceptability, appropriateness, feasibility, adoption, reach/penetration, fidelity, cost, and sustainability.
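The studies above all map framework dimensions to concrete measures. As a rough, hypothetical sketch of how a RE-AIM-structured quantitative evaluation plan might be organized in code, consider the example below; the dimension names come from RE-AIM, but the specific measures, definitions, and data sources are illustrative assumptions, not those used by Manalili et al. or Stover et al.

```python
# Hypothetical sketch: a quantitative evaluation plan keyed by RE-AIM dimension.
# All measures, definitions, and data sources below are illustrative placeholders.
from dataclasses import dataclass

@dataclass
class Measure:
    name: str          # what is measured
    definition: str    # how it is calculated
    data_source: str   # where the data come from

evaluation_plan = {
    "Reach": [
        Measure("PRO completion rate",
                "patients completing a PRO / eligible patients",
                "ePRO platform logs"),
    ],
    "Effectiveness": [
        Measure("Symptom score change",
                "mean change in PRO symptom score from baseline",
                "ePRO responses"),
    ],
    "Adoption": [
        Measure("Clinic uptake",
                "clinics administering PROs / clinics invited",
                "implementation records"),
    ],
    "Implementation": [
        Measure("Fidelity to workflow",
                "visits where PRO results were reviewed / visits with a completed PRO",
                "chart audit"),
    ],
    "Maintenance": [
        Measure("Sustained use",
                "completion rate 12 months after go-live",
                "ePRO platform logs"),
    ],
}

for dimension, measures in evaluation_plan.items():
    for m in measures:
        print(f"{dimension}: {m.name} = {m.definition} (source: {m.data_source})")
```

Structuring a plan this way makes it easy to check that every framework dimension has at least one measure with a feasible data source before data collection begins.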

Evaluation Case Studies

  • The Implementing Patient-Reported Outcome Measures in Clinical Practice: A Companion Guide to the ISOQOL User’s Guide describes how 10 healthcare systems formally and informally evaluated the value of patient-reported outcome assessment (see Section 9). Learn more>>
  • Sisodia and colleagues (2020) evaluated implementation across a large healthcare system.
  • In a 2018 review, Anatchkova and colleagues used the 2015 ISOQOL User’s Guide for Implementing Patient-Reported Outcomes Assessment in Clinical Practice to create a framework for evaluating publications on the implementation of PROs in oncology clinical practice. Their systematic literature review covered 36 publications, most of which (n=29, 81%) were reports on intervention or feasibility research. Only three studies described ongoing routine PRO assessment used to manage patient care outside the scope of a research study. Aims for PRO collection varied, including improving symptom monitoring, patient-provider communication, patient-centeredness of care, and quality of care. Only about half of the studies reported intervention results on patient outcomes. The authors also reported a need for guidelines for interpreting PRO scores and for strategies to address concerning PRO scores “…particularly when the identified needs of patients extend beyond the expertise or training found in a routine oncology clinical practice such as depression or lack of social support.” Learn more>>

Evaluation Resources

  • An example of metrics with data definitions that can be used for monitoring implementation is available through the ePROs in Clinical Care website. See ePRO Sample Implementation Monitoring Plan. (A hypothetical sketch of how such metric definitions might be structured appears after this list.)
  • The RE-AIM website (RE-AIM.org) includes numerous resources such as measures and checklists, planning tools, key articles, and guidance.
  • The National Implementation Research Network’s Active Implementation Hub is a free, online resource to build knowledge and improve the performance of people engaged in implementation.
  • Lewis CC, Fischer S, Weiner BJ, Stanick C, Kim M, Martinez RG. Outcomes for implementation science: an enhanced systematic review of instruments using evidence-based rating criteria. Implement Sci. 2015 Nov 4;10:155. doi: 10.1186/s13012-015-0342-x. PMID: 26537706; PMCID: PMC4634818.
  • Kessler RS, Purcell EP, Glasgow RE, Klesges LM, Benkeser RM, Peek CJ. What does it mean to "employ" the RE-AIM model? Eval Health Prof. 2013 Mar;36(1):44-66. doi: 10.1177/0163278712446066. Epub 2012 May 21. PMID: 22615498.
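As a companion to the monitoring-plan resource noted above, the sketch below shows one hypothetical way to capture implementation-monitoring metrics with explicit data definitions. The metric names, numerators, denominators, targets, and reporting intervals are assumptions for illustration only and are not taken from the ePRO Sample Implementation Monitoring Plan.

```python
# Hypothetical sketch of monitoring metrics with explicit data definitions.
# Every field value below is an illustrative assumption.
from dataclasses import dataclass

@dataclass
class MonitoringMetric:
    name: str
    numerator: str
    denominator: str
    data_source: str
    reporting_interval: str
    target: float  # target proportion, 0-1

metrics = [
    MonitoringMetric(
        name="PRO assignment rate",
        numerator="encounters where a PRO was assigned",
        denominator="eligible encounters",
        data_source="ePRO platform",
        reporting_interval="monthly",
        target=0.90,
    ),
    MonitoringMetric(
        name="PRO completion rate",
        numerator="assigned PROs completed before the visit",
        denominator="assigned PROs",
        data_source="ePRO platform",
        reporting_interval="monthly",
        target=0.75,
    ),
]

def flag_underperforming(observed: dict) -> list:
    """Return names of metrics whose observed proportion falls below target."""
    return [m.name for m in metrics if observed.get(m.name, 0.0) < m.target]

# Example: completion rate is below its illustrative target, so it is flagged.
print(flag_underperforming({"PRO assignment rate": 0.95, "PRO completion rate": 0.60}))
```

Writing down numerators, denominators, and data sources up front makes it straightforward to compute the same rates consistently at each reporting interval and to flag metrics that fall below their targets.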