Introduction

Criminal and juvenile justice systems are increasingly training staff in evidence-based practices and programs (EBPs) to enhance public safety.[1] EBPs incorporate objective and reliable research and data to guide policy and practice decisions, with the aim of improving outcomes.[2] State and federal legislatures continue to expand their support for integrating EBPs through the provision of funding.[3] While EBPs can contribute to increased public safety, reduced recidivism, and increased accountability, they must be implemented with fidelity—adherence to their core components, supported by regular training—and accompanied by plans to ensure sustainability.

Despite the promise of EBPs, their success has been varied, limited by a lack of organizational capacity to effectively implement and sustain them. An evidence-based approach is needed not only in selecting the EBP, but also in implementing it successfully, with both short- and long-term sustainability plans. Implementation science examines how EBPs can best be implemented and how implementation affects immediate and future outcomes.

Policymakers, researchers, and practitioners must also focus on maintaining fidelity to the core components of EBPs and on how, beyond training, the implementing agency intends to move toward full implementation and long-term sustainability. When translating research into practice, real-world outcomes and benefits are significantly shaped by program quality, organizational development, policy alignment, and quality of implementation, or how the new policy, program, or practice is integrated into an organization, funded, executed, and evaluated.[4]

What are Evidence-Based Practices?

EBPs, while newer to criminal justice and the social sciences in general, have long been used in other fields of study. EBPs originated in medicine, where they were applied to distinguish effective from ineffective medical practices through scientific methods, statistical analysis, research, and patient outcomes.[5] Business, production and manufacturing, education, public health, mental health, and foster care and child well-being services, to name a few, have all moved toward using EBPs.

Beginning around the late 1970s, criminal justice researchers sought to develop a systematic process to identify effective and ineffective criminal justice programs and practices; assess study quality of the current literature; and analyze how the current literature’s methodological quality supports, or does not support, its outcomes (i.e., the strength and accuracy of outcomes).[6]

Although the terms evidence-based practices and evidence-based programs are frequently used interchangeably, they have slightly different meanings. Evidence-based practices are skills, techniques, strategies, policy initiatives, or core intervention components that have accumulated a significant amount of supporting research through high-quality process and outcome evaluations.[7] Evidence-based programs are structured, multi-faceted interventions, composed of coordinated services or practices, designed to target complex client/consumer problems.[8] Essentially, evidence-based programs provide the framework for evidence-based practices. For example, Aggression Replacement Training (ART) is an evidence-based program that helps increase prosocial behavior in chronically violent and aggressive youth and adolescents.[9] Evidence-based practices are incorporated within ART's three components: social skills training, anger control, and moral reasoning.[10]

To determine whether a program or practice is effective, evaluations ideally employ a randomized experimental design, which uses random assignment of participants to either the experimental group (in the program) or the control group (not in the program). A comparison is then made of the impact of that program or practice on similarly situated individuals.[11] Although these randomized controlled trials (RCTs) are the “gold standard,” they can be difficult to employ in social science; ethical and methodological design considerations must always be observed when employing an RCT design.[12] When an RCT is not an option, a quasi-experimental design can be employed, using a comparison group that was not randomly assigned.[13] Ultimately, these studies must demonstrate causal evidence linking the practice or program with the desired outcomes, while ruling out factors other than the program or practice that may contribute to or influence outcomes, as well as factors that may contribute to differences between groups prior to treatment.[14]
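To make the logic of random assignment concrete, the sketch below is illustrative only: the participant pool, outcome generator, baseline rate, and effect size are hypothetical and not drawn from any study cited here. It simply shows how randomly splitting an eligible pool into program and control groups supports a straightforward comparison of outcomes.

```python
# Illustrative sketch (hypothetical data): random assignment and group comparison.
import random

random.seed(42)

# Hypothetical pool of 200 eligible participants, identified only by ID.
participants = list(range(200))
random.shuffle(participants)

# Random assignment: half to the program (experimental), half to the control group.
experimental = set(participants[:100])
control = set(participants[100:])

def reoffended(person_id: int, in_program: bool) -> bool:
    """Stand-in outcome generator for this sketch; a real study would use
    observed outcomes (e.g., rearrest within 12 months), not simulated ones."""
    base_rate = 0.40                      # hypothetical baseline recidivism rate
    program_effect = -0.10 if in_program else 0.0
    return random.random() < base_rate + program_effect

exp_rate = sum(reoffended(p, True) for p in experimental) / len(experimental)
ctl_rate = sum(reoffended(p, False) for p in control) / len(control)

print(f"Experimental group recidivism: {exp_rate:.2%}")
print(f"Control group recidivism:      {ctl_rate:.2%}")
print(f"Estimated program effect:      {exp_rate - ctl_rate:+.2%}")
# Because assignment was random, pre-existing differences between the groups are
# expected to balance out, so (subject to sampling error) the difference in rates
# can be attributed to the program rather than to who was selected into it.
```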

Though there is no specific, agreed-upon number of studies required for a program or practice to be considered evidence-based, the Centers for Disease Control and Prevention (CDC) developed a guide to facilitate a common understanding of the continuum of evidence of effectiveness, which can help guide practitioners, policymakers, researchers, and other decision-makers in criminal justice.[15] Another guide, the Standards of Evidence, developed by the Society for Prevention Research, provides criteria for efficacy, effectiveness, and dissemination.[16] Generally, for something to be deemed evidence-based, the research and its outcomes should be rigorous, replicable, generalizable, and objective (Table 1).[17]

Table 1

Defining Evidence-Based Practices

Overall Effect | Requirements | Terminology
No Effect | Little or no evidence exists through the use of reliable, rigorous, replicable, and generalizable research indicating the programs achieve what they are intended to achieve.[18] | Not Evidence-Informed or Evidence-Based
Promising | Some evidence exists through the use of reliable, rigorous, replicable, and generalizable research indicating the programs achieve what they set out to achieve.[19] | Evidence-Informed
Effective | Strong evidence exists through the use of reliable, rigorous, replicable, and generalizable research indicating the programs achieve what they set out to achieve.[20] | Evidence-Based

National Resources

Once an evidence-based program or practice is identified, planning for implementation and sustainability prior to training is fundamental to the success of the organization, its staff, and its clients. It is important to consider that even the most empirically sound programs and practices can produce outcomes that are inconsistent, unsustainable, harmful, or generally undesirable when poorly implemented.[21] Research underscores that fidelity and high-quality implementation are necessary to successfully implement and sustain EBPs.[22]

Implementation Science and Evidence-Based Programs and Practices

Before implementing an EBP, it is important to gauge an organization’s development and capacity to implement a new program.[23] This provides insight into organizational culture (the shared values and beliefs that govern staff within an organization); climate (how staff experience that culture); leadership; communication and decision-making within the organization; alignment of policies and practices with the potential adoption of a new EBP; alignment of policies and practices with the organization’s mission; and the organization’s goals and strategic plan.[24]

Measuring Readiness: How Ready is your Organization to Make a Change?

Assessment is one way to measure an organization’s readiness for change, which can affect the implementation of EBPs and their success.[25] Readiness includes preparing staff at every level for the implementation and sustainability of the program or practice, as well as aligning policies and practices to support the staff and the organization in using the EBP.[26] Organizational readiness is an essential precursor to successful implementation and sustainability—without it, change is more difficult to make and may ultimately result in a failed program.[27] Successfully implementing change can be difficult, however, with many organizational barriers impeding the process.[28]

Several factors are associated with readiness for change, including motivational readiness, institutional resources, staff attributes, and organizational climate (Figure 1). Lower levels of staff cynicism toward change, favorable perceptions of leadership, a supportive environment within the organization, and an increase in interagency networks can influence an organization’s readiness for change.[29]

Figure 1

Organizational readiness factors

Source: Texas Christian University’s Organizational Readiness for Change Tool; Lehman, W. E. K., Greener, J. M., & Simpson, D. D. (2002). Assessing organizational readiness for change. Journal of Substance Abuse Treatment, 22, 197-209.

Organizations can measure their readiness for change using assessment tools such as Texas Christian University’s Organizational Readiness for Change (ORC) instrument (see the Figure 1 source).
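As a rough illustration of how readiness survey results might be summarized, the sketch below averages hypothetical 1–5 staff ratings within the readiness domains shown in Figure 1 (motivational readiness, institutional resources, staff attributes, and organizational climate). The data, field names, and cut-off are invented for this example; instruments such as the TCU ORC have their own items and scoring procedures.

```python
# Illustrative sketch (hypothetical data and threshold): summarizing a
# readiness-for-change survey by domain, loosely following Figure 1.
from statistics import mean

# Each staff respondent rates items on a 1-5 scale, grouped by readiness domain.
responses = {
    "motivational_readiness": [4, 3, 5, 4, 2],
    "institutional_resources": [2, 3, 2, 3, 3],
    "staff_attributes": [4, 4, 5, 3, 4],
    "organizational_climate": [3, 2, 3, 4, 3],
}

READY_THRESHOLD = 3.5  # hypothetical cut-off for flagging a domain as a strength

for domain, scores in responses.items():
    avg = mean(scores)
    flag = "strength" if avg >= READY_THRESHOLD else "needs attention before implementation"
    print(f"{domain:<25} {avg:.2f}  ({flag})")
```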

Are Your Staff Sufficiently Informed of the Change?

In order for organizations to prepare for implementation, all staff should have a clear understanding of:

  • Who is in charge of managing the implementation and the EBP itself.
  • What the program or practice is—which requires educating staff and stakeholders on the why and how of the program or practice.
  • The essential functions of the program or practice and how those functions are operationalized.
  • What the strategic plan looks like for short- and long-term implementation and the sustainability of the program or practice.
  • The process for communicating questions, comments, or concerns about the program or practice.
  • The quality assurance process—or tracking of program process and outcomes on a continual basis—that assesses fidelity to the program or practice. This also helps identify any obstacles or barriers, areas that may need modification, and any gaps in information or services.[31]

Does your Organization have an Implementation Team?

A cross-section of staff and stakeholders should comprise an organization’s implementation team to most successfully implement and sustain an EBP.[32] The implementation team can support implementation, sustainability, and the process of scaling up (i.e., increasing the capacity for or use of) the EBP.[33] Stakeholders and all organization staff should understand that implementation and sustainability are an ongoing, multi-stage process that often includes barriers and resistance to change. Further, support from organization leaders and upper-level staff and management, including involvement in the training process and practice alongside line staff in the use of the program or practice, helps support broader acceptance of organizational change.[34] In particular, the implementation process can be significantly less stressful for staff at all levels when implementation efforts address the individual staff, agency, and system levels, making the process more inclusive and transparent from the bottom up and the top down.[35] One way to do this is by using NIRN’s Hexagon Tool,[36] which helps organizations appropriately select and assess elements of a program or practice.

What are your Organization’s Strengths and Weaknesses?

In addition, an organization should examine its current capacity, assessing both strengths and weaknesses to understand the organization’s landscape for implementation. One way to do this is through a SWOT analysis—a group process to compile and analyze the organization’s strengths, weaknesses, opportunities, and threats.[37] Such analysis can help inform strategic planning by identifying potential strategies for implementation.

During the planning process, an organization should think “big picture” about the program or practice goal to account for how change may impact the organization as a whole. Implementing change may require a holistic look at organizational policies, regulations, guidelines, procedures, and practices, and how they align with the new change.[38] In particular, organizations should consider how:

  • The EBP connects to other parts of the organization as well as to organizational and systemic goals.
  • Current policies (organizational, local, county, state, federal) may support or conflict with the EBP—consider new policies to support EBPs while discontinuing ineffective or conflicting policies in order to align policy and practice.
  • Communication will occur within and outside the organization, promoting honest and open discussion.
  • The use of data can drive decision-making. This allows for continual assessments of the organization and its goals, policies, practices, and outcomes.[39]

Strategic Planning

Assessing Implementation Drivers for Change

In addition to organizational readiness assessments, strategic planning requires assessing key “drivers” for organizational change prior to implementation and on an ongoing basis.

Organizations should assess the components that drive change within an organization, or drivers. Drivers are categorized as competency drivers, organization drivers, and leadership drivers (Figure 2).[40] These implementation drivers support organizational capacity for creating “program, practice, and systems level changes needed to achieve improved population outcomes.”[41] These drivers are compensatory and integrated in nature, working together to enhance quality, fidelity, and consistency in program implementation to improve outcomes.[42]

Figure 2

Implementation drivers of organizational change

Figure Source: Bertram, R. M., Blase, K. A., & Fixsen, D. L. (2015). Improving programs and outcomes: Implementation frameworks and organization change. Research on Social Work Practice, 25(4), 477-487.

Competency Drivers

Competency drivers assist in the development and enhancement of competency and confidence among staff.[43] These drivers are vital for selecting, teaching, understanding, supporting, and assessing the use of the EBP.

Staff selection refers to hiring qualified staff at all levels of the organization to train, coach, assess, and execute EBPs. Research suggests that individuals who are open to learning and honing new or previously learned skills may be more willing to integrate new ways of working into their current responsibilities.[44] Further, findings from an evaluation of staff quality and program effectiveness in 64 community-based correctional facilities and halfway houses in Ohio indicated that 28 percent of program variation was explained by staff characteristics, training, and supervision; positive staff characteristics and training were associated with substantial recidivism reduction.[45]

Training, both pre-service (initial) and in-service (continuing), is the impetus for staff behavioral change in implementing and assessing a new EBP. Training is the first step in transferring knowledge from research to practice. Training should cover the EBP’s theory, philosophy, and values; the rationale for its key components and for the organization’s adoption of the EBP; and opportunities for staff to listen, watch, practice, and receive feedback on new skills.[46]

Ongoing coaching and consultation integrate newly learned concepts into practice and can greatly increase staff competency and confidence in using the new program or practice.[47] For example, in a meta-analysis of 21 studies on motivational interviewing, skills eroded for staff who were not provided coaching and feedback post-training compared to those who were.[48] An estimated 10 percent of information is ultimately transferred from training to practice.[49] Further, in a study of probation and parole officers trained in an evidence-based supervision model, officers who were trained and coached monthly used the practices more proficiently than those who were not coached after the initial training.[50]

Staff performance assessments can evaluate staff use of and fidelity to the EBP, as well as outcomes related to those processes.[51] Staff performance assessments should incorporate staff use of, competency in, and fidelity to program or practice components.[52] This helps practitioners gain better understanding of their strengths and areas for improvement and organizations gain better understanding of implementation progress and efforts.[53]

Organization Drivers

Organization drivers help create supportive environments that increase the accessibility and efficacy of staff selection, training, coaching, and performance via a supportive and welcoming administration, as well as access to funding and resources.[54] In a study of EBPs in substance use treatment, researchers found that a supportive environment for new programs and practices, training, resources, and interagency networks was related to increased use of EBPs.[55]

Decision support data systems are central to identifying and assessing key aspects of organizational performance, such as fidelity, outcomes, and quality improvement information, to support the continued improvement and efficacy of EBP implementation and sustainability.[56] This includes continuous quality improvement (CQI)[57] as a way to monitor organizational processes and outcomes to ensure that EBPs are delivered as intended; CQI ultimately helps organizations improve the performance of current and/or new practices.[58] In addition, a quality assurance process is instrumental, as audits help identify and rectify staff deviation from EBP policy or protocol.[59]
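As one hedged illustration of how a decision support data system might feed CQI, the sketch below flags hypothetical fidelity scores that fall below an agreed floor and reports quarterly averages. The record format, staff identifiers, and threshold are assumptions made for this example, not features of any specific system cited here.

```python
# Illustrative sketch (hypothetical records and threshold): a minimal decision
# support check that flags fidelity drift for CQI review. A real system would
# pull these scores from the organization's case management or audit data.
from collections import defaultdict
from statistics import mean

# Each record: (staff member, quarter, fidelity score on a 0-100 observation checklist)
fidelity_records = [
    ("officer_a", "2024Q1", 88), ("officer_a", "2024Q2", 74),
    ("officer_b", "2024Q1", 92), ("officer_b", "2024Q2", 90),
    ("officer_c", "2024Q1", 70), ("officer_c", "2024Q2", 65),
]

FIDELITY_FLOOR = 80  # hypothetical minimum acceptable fidelity score

by_quarter = defaultdict(list)
for staff, quarter, score in fidelity_records:
    by_quarter[quarter].append(score)
    if score < FIDELITY_FLOOR:
        # Flag individual deviations so coaching can be targeted, not punitive.
        print(f"Review: {staff} scored {score} in {quarter} (below {FIDELITY_FLOOR})")

for quarter in sorted(by_quarter):
    print(f"{quarter}: average fidelity {mean(by_quarter[quarter]):.1f}")
```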

Facilitative administrative supports are an organization’s policies, protocols, structures, culture, and climate that enhance, support, and facilitate organizational changes to align with staff needs.[60] To do this, leadership provides support to the organization as a whole, using organizational data to inform decision-making, keep staff on track, and remain focused on the successful development of the skills needed to implement and successfully use a new EBP (or enhance a previously implemented EBP).[61]

Systems interventions are strategies that help analyze system-level factors that support outcomes, such as the ever-changing context of federal, state, and organizational funding resources and availability, human resources, and community-level policies and practices.[62]

Leadership Drivers

Implementation of any EBP requires effective leadership to support the staff as well as the organization as a whole in obtaining desired outcomes. Leadership drivers help identify appropriate leadership skills, capabilities, and strategies in order to institute, repurpose, modify/adjust, and monitor both competency and organization drivers throughout implementation and further sustainability.[63]

Technical leadership is most common with traditional management styles that identify, clearly and precisely, strategies, problems, and solutions.[64] Technical leadership is most synonymous with effective management. It primarily deals with problems for which there is an agreed upon understanding of the nature of the problem and how to resolve it.[65]

Adaptive leadership tends to be needed when problems arise from more complex organizational conditions in which organizational actors are not all in agreement on the problem and its solution. Generally, adaptive leadership incorporates a group of people who work collaboratively to identify the problem and possible solutions.[66] The value of leadership drivers lies in knowing when each type of leadership is most appropriate.[67]

The use of implementation frameworks and implementation drivers throughout the process of implementation and future sustainability is vital to the quality and effective delivery of an EBP.[68] These implementation drivers are integrated in nature; a change or adjustment in one implementation driver might necessitate change in other implementation drivers.[69] These drivers also are compensatory in nature; where there is less of one driver, another may be used to supplement. However, this should be done only after careful consideration of each implementation driver.[70]

The Challenge of Adapting an EBP to Meet Local Needs and Capabilities

One challenge in implementing EBPs is knowing when or how to adapt the EBP to reflect local differences. While implementing an EBP as designed is the ideal, no two localities, counties, or states operate the same way; differing local policies, capacity, target populations, and staff skill levels can create barriers to implementing an EBP with fidelity.[71] Adaptations may be necessary, but it is recommended to first implement with fidelity to the original EBP; research indicates that adaptations made after implementation were more successful than those made prior to implementation.[72] Further, adaptations must be structured around the essential functions (or core components) of the EBP in order to avoid compromising the effectiveness of the program or practice.[73] Adaptations can be made with high or low fidelity, aligning with or drifting from the essential functions of the EBP.[74] Adaptation may be appropriate only up to a certain point; too much change may result in program “drift,” transforming the EBP into something that is no longer an EBP.[75]

In a 2013 study of the sustainability of evidence-based school, community/family-focused, and family treatment programs, researchers noted that sustainability suffered when many changes were made to fit the program to the population, setting, and needs.[76] Most frequently, programs identified adaptations to program procedures, dose, and content.[77] The top five reasons for these adaptations were:

  • Lack of time (80 percent).
  • Limited resources (72 percent).
  • Participant retention difficulty (71 percent).
  • Resistance from implementers (64 percent).
  • Difficulty recruiting participants (61 percent).[78]

Most adaptations were reactive in nature, related to logistical fit (i.e., compatibility issues related to implementers’ or target populations’ capacity—resources, time, skills, knowledge, and schedules),[79] and were negatively aligned with—or had drifted from—the program goals and theory.[80] Another study of school-based substance use prevention programs found that, on average, teachers made more than three adaptations to the program—63 percent of which were negative adaptations (i.e., drifted from the original program theory and goals), increasing the likelihood of null or harmful effects on the youth served.[81]

In a study of evidence-based adolescent and family programs at several sites, a majority of program modifications were related to philosophical issues—misalignment of organizational or participant values or philosophy (58 percent)—while the deletion of program components was most often due to logistical issues—misalignment of organizational and program capacity or context (77 percent).[82] Forty-two percent of additions and changes to the program were as likely to be aligned with the program’s theory or logic model (positively aligned) as they were to be misaligned with it (negatively aligned), whereas deletion of program components and functions was most often negatively aligned with the program’s theory and goals (82 percent).[83] This suggests that program modifications involving the removal of parts of the program are more likely to misalign with the original program theory and logic model—that is, with the components that are most effective within a program. This research suggests that adaptations to EBPs should be made in an objective manner, based on technical, theoretical, and rigorous evidence for such adaptations rather than on subjective stances or individual beliefs.[84]

Conclusion

Implementation is a complex, continuous process; EBPs should be continually monitored and evaluated for efficacy and fidelity as they relate to both process and outcomes. Adopting a comprehensive, multifaceted program or practice requires strategically translating it into a complex, ever-changing organizational system, with interplay among EBP characteristics, providers, and organizational and service delivery settings.[85] Because criminal justice organizations are dynamic, influenced by internal and external political, social, moral, and economic pressures, it is important for organizations to build the internal capacity to deliver the program or practice and to adapt or modify it with the ebb and flow of the organization. This upfront planning can greatly improve not only the outcomes of the EBP, but also how staff receive these changes and ultimately use them in their day-to-day activities.


  1. Lipsey, M. W. (2010). Improving the effectiveness of juvenile justice programs: A new perspective on evidence-based practice. Washington, DC: Georgetown University, Center for Juvenile Justice Reform.; Przybylski, R. (2012). Introduction: Special issue on evidence-based policy and practice. Justice Research and Policy, 14(1), 1–15. ↩︎

  2. Lum, C., Koper, C., & Telep, C. (2015). Evidence-based policing matrix. Fairfax, VA: George Mason University, Center for Evidence-Based Crime Policy.; Lum, C., Koper, C. S., & Telep, C. W. (2011). The evidence-based policing matrix. Journal of Experimental Criminology, 7, 3-26.; Lum, C., & Koper, C. S. (2015). Evidence-based policing. In R. Dunham and G. Alpert (Eds.) Critical Issues in Policing (7th ed.). Long Grove, IL: Waveland Press. ↩︎

  3. Taxman, F. S., Pattavina, A., & Caudy, M. (2014). Justice reinvestment in the United States: An empirical assessment of the potential impact of increased correctional programming and recidivism. Victims & Offenders: An International Journal of Evidence-Based Research, Policy, and Practice, 9(1), 50-75. ↩︎

  4. Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M. & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231). ↩︎

  5. Williams-Taylor, L. (2007). Evidence-based programs and practices: What does it all mean? Boynton Beach, FL: Research Review, Children’s Services Council of Palm Beach County. Retrieved from http://bit.ly/2gQW1Yq ; Przybylski, R. & Orchowsky, S. (2015). EBPs. Washington, DC: Justice Research and Statistics Association. Retrieved from http://bit.ly/2todZmB. ↩︎

  6. Williams-Taylor, L. (2007). Evidence-based programs and practices: What does it all mean? Boynton Beach, FL: Research Review, Children’s Services Council of Palm Beach County. Retrieved from http://bit.ly/2gQW1Yq ; Przybylski, R. & Orchowsky, S. (2015). EBPs. Washington, DC: Justice Research and Statistics Association. Retrieved from http://bit.ly/2todZmB. ↩︎

  7. Andrews, D. A., & Bonta, J. (2012). The psychology of criminal conduct (5th ed.). Newark, NJ: LexisNexis.; EPISCenter. (N.d.) Defining evidence based programs. University Park, PA: Author. Retrieved from http://www.episcenter.psu.edu/ebp/definition. ↩︎

  8. Andrews, D. A., & Bonta, J. (2012). The psychology of criminal conduct (5th ed.). Newark, NJ: LexisNexis.; EPISCenter. (N.d.) Defining evidence based programs. University Park, PA: Author. Retrieved from http://bit.ly/2kWLRaF.; Williams-Taylor, L. (2007). Evidence-based programs and practices: What does it all mean? Boynton Beach, FL: Research Review, Children’s Services Council of Palm Beach County. Retrieved from http://bit.ly/2gQW1Yq ; Przybylski, R. & Orchowsky, S. (2015). EBPs. Washington, DC: Justice Research and Statistics Association. Retrieved from http://bit.ly/2todZmB. ↩︎

  9. Glick, B. (1996). Aggression Replacement Training in Children and Adolescents. Hatherleigh Guide to Child and Adolescent Therapy, 5, 191–226. ↩︎

  10. Glick, B. (1996). Aggression Replacement Training in Children and Adolescents. Hatherleigh Guide to Child and Adolescent Therapy, 5, 191–226. ↩︎

  11. Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental design for generalized causal inference. Belmont, CA: Wadsworth, Cengage Learning. ↩︎

  12. This includes how individuals are chosen—or not chosen—to participate in the experimental group, particularly because many of the programs are intertwined with legal and procedural decisions; when randomization should be carried out; and a lack of popularity among criminal justice agencies in employing RCT designs. See Asscher, J. J., Dekovic, M., van der Laan, P. H., Prins, P. J. M., & van Arum, S. (2007). Implementing randomized experiments in criminal justice settings: An evaluation of multisystemic therapy in the Netherlands. Journal of Experimental Criminology, 3(2), 113-129. ↩︎

  13. Trochim, W. M. K., & Donnelly, J. P. (2007). Research methods knowledge base. Mason, OH: Thompson Custom Publishing. ↩︎

  14. Gendreau, P. (1996). The principles of effective intervention with offenders. In A. T. Harland (Ed.), Choosing correctional interventions that work: Defining the demand and evaluating the supply (pp. 117-130). Newbury Park, CA: Sage.; Andrews, D. A., & Bonta, J. (2012). The psychology of criminal conduct (5th ed.). Newark, NJ: LexisNexis.; Orchowsky, S. (2014). An introduction to evidence-based practices. Justice Research and Statistics Association. ↩︎

  15. Puddy, R. W., & Wilkins, N. (2011). Understanding evidence part 1: Best available research evidence. A guide to the continuum of evidence of effectiveness. Atlanta, GA: Centers for Disease Control and Prevention. ↩︎

  16. Flay, B. R., Biglan, A., Boruch, R. F., Castro, F. G., Gottfredson, D., Kellam, S., … Ji, P. (2005). Standards of evidence: Criteria for efficacy, effectiveness and dissemination. Prevention Science, 6(3), 151-175. http://bit.ly/2yyWrtw. ↩︎

  17. Trochim, W. M. K., & Donnelly, J. P. (2007). Research methods knowledge base. Mason, OH: Thompson Custom Publishing. ↩︎

  18. Crimesolutions.gov. (2011). Glossary. Washington, DC: National Institute of Justice, Office of Justice Programs. Retrieved from http://bit.ly/2oieUTg.; Blueprints Program Model. (2012-2016). Program criteria. Boulder, CO: Blueprints for Healthy Youth Development. Retrieved from http://bit.ly/2yqORS3. ↩︎

  19. Crimesolutions.gov. (2011). Glossary. Washington, DC: National Institute of Justice, Office of Justice Programs. Retrieved from http://bit.ly/2oieUTg.; Blueprints Program Model. (2012-2016). Program criteria. Boulder, CO: Blueprints for Healthy Youth Development. Retrieved from http://bit.ly/2yqORS3. ↩︎

  20. Crimesolutions.gov. (2011). Glossary. Washington, DC: National Institute of Justice, Office of Justice Programs. Retrieved from http://bit.ly/2oieUTg.; Blueprints Program Model. (2012-2016). Program criteria. Boulder, CO: Blueprints for Healthy Youth Development. Retrieved from http://bit.ly/2yqORS3. ↩︎

  21. Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M. & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231). ↩︎

  22. Lowenkamp, C. T., Flores, A. W., Holsinger, A. M., Makarios, M. D., & Latessa, E. J. (2010). Intensive supervision programs: Does program philosophy and the principles of effective intervention matter? Journal of Criminal Justice, 38, 368-375. Schweitzer, M., Kishimoto, E., Latessa, E. J., & Rogalski-Davis, L. (2015). Implementing an evidence-based program model: A real world approach to effective correctional treatment. Offender Programs Report, 19(3), 33-48. ↩︎

  23. Among other sources, the National Implementation Research Network (NIRN), based at the University of North Carolina at Chapel Hill’s FPG Child Development Institute, provides important resources regarding implementation science and how to most effectively implement and sustain EBPs in a variety of fields, including criminal justice. Through NIRN’s amalgamation and consolidation of research and literature, it has created many quick planning tools in addition to an Active Implementation Hub that provides resources including cases and examples; research, publications, and presentations; and learning modules. Organizations can use the concepts and frameworks as they consider, and later implement, EBPs within their organization. ↩︎

  24. Crime and Justice Institute and Guevara, M., Loeffler-Cobia, J., Rhyne, C., & Sachwald, J. (2011). Putting the pieces together: Practical strategies for implementing evidence-based practices. Washington, DC: National Institute of Corrections. ↩︎

  25. Lerch, J., Viglione, J., Eley, E., James-Andrews, S., & Taxman, F. (2011). Organizational readiness in corrections. Federal Probation, 75(5), 5-10. ↩︎

  26. Weiner, B. J. (2009). A theory of organizational readiness for change. Implementation Science, 4(67), 1-9. ↩︎

  27. Weiner, B. J. (2009). A theory of organizational readiness for change. Implementation Science, 4(67), 1-9. ↩︎

  28. Taxman, F. S., Henderson, C., Young, D., & Farrell, J. (2014). The impact of training interventions on organizational readiness to support innovations in juvenile justice offices. Administration and Policy in Mental Health, 41(2), 177-188. ↩︎

  29. Farrell, J. L., Young, D. W., & Taxman, F. S. (2011). Effects of organizational factors on use of juvenile supervision practices. Criminal Justice and Behavior, 38(6), 565-583.; Henderson, C. E., Taxman, F. S., & Young, D. W. (2007). A Rasch model analysis of evidence-based treatment practices used in the criminal justice system. Drug and Alcohol Dependence, 93, 163-175. ↩︎

  30. Aarons, G. A., Ehrhart, M. G., Torres, E. M., Finn, N. K., & Roesch, S. C. (2016). Validation of the implementation leadership scale (ILS) in substance use disorder treatment organizations. Journal of Substance Abuse Treatment, 68, 31-35. ↩︎

  31. Van Dyke, M., Blasé, K., Sims, B., & Fixsen, D. (2013). Implementation drivers: Team review and planning. Chapel Hill, NC: University of North Carolina Chapel Hill, National Implementation Research Network (NIRN), Frank Porter Graham Child Development Institute. ↩︎

  32. Gotham, H. J., White, M. K., Bergethon, H. S., Feeney, T., Cho, D. W., & Keehn, B. (2008). An implementation story: Moving the GAIN from pilot project to statewide use. Journal of Psychoactive Drugs, 40(1), 97. ↩︎

  33. The National Implementation Research Network’s Active Implementation Hub. (2013-2015). Retrieved from http://unc.live/2x5sbmy ↩︎

  34. Gotham, H. J., White, M. K., Bergethon, H. S., Feeney, T., Cho, D. W., & Keehn, B. (2008). An implementation story: Moving the GAIN from pilot project to statewide use. Journal of Psychoactive Drugs, 40(1), 97. ↩︎

  35. Gotham, H. J., White, M. K., Bergethon, H. S., Feeney, T., Cho, D. W., & Keehn, B. (2008). An implementation story: Moving the GAIN from pilot project to statewide use. Journal of Psychoactive Drugs, 40(1), 97. ↩︎

  36. While the tool is framed for use in schools, it is appropriate for criminal and juvenile justice with the modification of the language in the tool to reflect criminal and juvenile justice. ↩︎

  37. The NCJA Center for Justice Planning (n.d.). SWOT Analysis. Washington, DC: US Department of Justice, Bureau of Justice Assistance, and the National Criminal Justice Association. Retrieved from http://bit.ly/2gkcLUV. ↩︎

  38. Crime and Justice Institute and Guevara, M., Loeffler-Cobia, J., Rhyne, C., & Sachwald, J. (2011). Putting the pieces together: Practical strategies for implementing evidence-based practices. Washington, DC: National Institute of Corrections. ↩︎

  39. Crime and Justice Institute and Guevara, M., Loeffler-Cobia, J., Rhyne, C., & Sachwald, J. (2011). Putting the pieces together: Practical strategies for implementing evidence-based practices. Washington, DC: National Institute of Corrections. ↩︎

  40. Fixsen, D.L., Blase, K.A., Naoom, S.F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19(5), 531-540.; Bertram, R. M., Blase, K. A., & Fixsen, D. L. (2015). Improving programs and outcomes: Implementation frameworks and organization change. Research on Social Work Practice, 25(4), 477-487. ↩︎

  41. Bertram, R. M., Blasé, K. A., & Fixsen, D. L. (2013). Improving programs and outcomes: Implementation frameworks 2013. Houston, TX.: Draft for Building the Research & Practice Gap Symposium in Houston, April 5th-6th, 2013. ↩︎

  42. Fixsen, D.L., Blase, K.A., Naoom, S.F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19(5), 531-540.; Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M. & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231). ↩︎

  43. Bertram, R. M., Blase, K. A., & Fixsen, D. L. (2015). Improving programs and outcomes: Implementation frameworks and organization change. Research on Social Work Practice, 25(4), 477-487. ↩︎

  44. Alexander, M. (2011). Applying implementation research to improve community corrections: Making sure that “new” thing sticks. Federal Probation, 75(2), 47-51. ↩︎

  45. Makarios, M., Lovins, L., Latessa, E. J., & Smith, P. (2016). Staff quality and treatment effectiveness: An examination of the relationship between staff factors and the effectiveness of correctional programming. Justice Quarterly, 33(2), 348-367. ↩︎

  46. Bertram, R. M., Blase, K. A., & Fixsen, D. L. (2015). Improving programs and outcomes: Implementation frameworks and organization change. Research on Social Work Practice, 25(4), 477-487.; Fixsen, D.L., Blase, K.A., Naoom, S.F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19(5), 531-540. ↩︎

  47. Schoenwald, S. K., Sheidow, A. J., & Letourneau, E. J. (2004). Toward effective quality assurance in evidence-based practice: Links between expert consultation, therapist fidelity, and child outcomes. Journal of Clinical Child and Adolescent Psychology, 33(1), 94-104. ↩︎

  48. Schwalbe, C. S., Oh, H. Y., & Zweben, A. (2014). Sustaining motivational interviewing: A meta-analysis of training studies. Addiction, 109, 1287-1294. ↩︎

  49. Rogers, R. W., Wellins, R. S., & Connor, D. R. (2002). White Paper – The power of realization. Retrieved from http://bit.ly/2yxwKsN.; Paparozzi, M. A., & Guy, R. (2013). The trials and tribulations of implementing what works: Training rarely trumps values. Federal Probation, 77(2). Retrieved from http://bit.ly/2yzLcBi. ↩︎

  50. Labrecque, R. M., & Smith, P. (2015). Does training and coaching matter? An 18-month evaluation of a community supervision model. Victims & Offenders, 12, 233-252.; Bertram, R. M., Blase, K. A., & Fixsen, D. L. (2015). Improving programs and outcomes: Implementation frameworks and organization change. Research on Social Work Practice, 25(4), 477-487. ↩︎

  51. Bertram, R. M., Blase, K. A., & Fixsen, D. L. (2015). Improving programs and outcomes: Implementation frameworks and organization change. Research on Social Work Practice, 25(4), 477-487. ↩︎

  52. Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M. & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231). ↩︎

  53. Fixsen, D.L., Blase, K.A., Naoom, S.F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19(5), 531-540. ↩︎

  54. Bertram, R. M., Blase, K. A., & Fixsen, D. L. (2015). Improving programs and outcomes: Implementation frameworks and organization change. Research on Social Work Practice, 25(4), 477-487.; Fixsen, D.L., Blase, K.A., Naoom, S.F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19(5), 531-540. ↩︎

  55. Henderson, C. E., Taxman, F. S., & Young, D. W. (2007). A Rasch model analysis of evidence-based treatment practices used in the criminal justice system. Drug and Alcohol Dependence, 93, 163-175.; Farrell, J. L., Young, D. W., & Taxman, F. S. (2011). Effects of organizational factors on use of juvenile supervision practices. Criminal Justice and Behavior, 38(6), 565-583. ↩︎

  56. Fixsen, D.L., Blase, K.A., Naoom, S.F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19(5), 531-540.; Bertram, R. M., Blase, K. A., & Fixsen, D. L. (2015). Improving programs and outcomes: Implementation frameworks and organization change. Research on Social Work Practice, 25(4), 477-487. ↩︎

  57. This is just one resource an organization can use to develop CQI in order to integrate frequent reporting and information related to organizational and programmatic functioning. The Plan-Do-Study-Act (PDSA) is also a great resource for implementing quality assurance planning and processes. ↩︎

  58. Fixsen, D.L., Blase, K.A., Naoom, S.F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19(5), 531-540.; Bertram, R. M., Blase, K. A., & Fixsen, D. L. (2015). Improving programs and outcomes: Implementation frameworks and organization change. Research on Social Work Practice, 25(4), 477-487. ↩︎

  59. Carey, M. (2010). Coaching packet: Continuous quality improvement. Silver Springs, MD: The Center for Effective Public Policy. ↩︎

  60. Fixsen, D.L., Blase, K.A., Naoom, S.F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19(5), 531-540.; Bertram, R. M., Blase, K. A., & Fixsen, D. L. (2015). Improving programs and outcomes: Implementation frameworks and organization change. Research on Social Work Practice, 25(4), 477-487. ↩︎

  61. Fixsen, D.L., Blase, K.A., Naoom, S.F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19(5), 531-540. ↩︎

  62. Bertram, R. M., Blase, K. A., & Fixsen, D. L. (2015). Improving programs and outcomes: Implementation frameworks and organization change. Research on Social Work Practice, 25(4), 477-487. ↩︎

  63. Bertram, R. M., Blase, K. A., & Fixsen, D. L. (2015). Improving programs and outcomes: Implementation frameworks and organization change. Research on Social Work Practice, 25(4), 477-487. ↩︎

  64. Bertram, R. M., Blase, K. A., & Fixsen, D. L. (2015). Improving programs and outcomes: Implementation frameworks and organization change. Research on Social Work Practice, 25(4), 477-487. ↩︎

  65. National Implementation Research Network. (n.d.). Leadership. Chapel Hill, NC: Author. Retrieved from https://nirn.fpg.unc.edu/module-2/leadership-drivers. ↩︎

  66. National Implementation Research Network. (n.d.). Leadership. Chapel Hill, NC: Author. Retrieved from https://nirn.fpg.unc.edu/module-2/leadership-drivers. ↩︎

  67. National Implementation Research Network. (n.d.). Leadership. Chapel Hill, NC: Author. Retrieved from https://nirn.fpg.unc.edu/module-2/leadership-drivers. ↩︎

  68. Fixsen, D.L., Blase, K.A., Naoom, S.F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19(5), 531-540.; Bertram, R. M., Blase, K. A., & Fixsen, D. L. (2015). Improving programs and outcomes: Implementation frameworks and organization change. Research on Social Work Practice, 25(4), 477-487. ↩︎

  69. Fixsen, D.L., Blase, K.A., Naoom, S.F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19(5), 531-540.; Bertram, R. M., Blase, K. A., & Fixsen, D. L. (2015). Improving programs and outcomes: Implementation frameworks and organization change. Research on Social Work Practice, 25(4), 477-487. ↩︎

  70. Bertram, R. M., Schaffer, P., & Charnin, L. (2014). Changing organization culture: Data driven participatory evaluation and revision of wraparound implementation. Journal of Evidence-Based Social Work, 11(1-2), 18-29.; Fixsen, D.L., Blase, K.A., Naoom, S.F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19(5), 531-540.; Bertram, R. M., Blase, K. A., & Fixsen, D. L. (2015). Improving programs and outcomes: Implementation frameworks and organization change. Research on Social Work Practice, 25(4), 477-487. ↩︎

  71. Kemp, L. (2016). Adaptation and fidelity: A recipe analogy for achieving both in population scale implementation. Prevention Science, 17, 429-438. ↩︎

  72. Winter, S. G., & Szulanski, G. (2001). Replication as strategy. Organization Science, 12(6), 730-743. ↩︎

  73. McHugh, R. K., Murray, H. W., & Barlow, D. H. (2009). Balancing fidelity and adaptation in the dissemination of empirically-supported treatments: The promise of transdiagnostic interventions. Behavior Research and Therapy, 47(11), 946-953. ↩︎

  74. Moore, J. E., Bumbarger, B. K., & Rhoades Cooper, B. (2013). Examining adaptations of evidence-based programs in natural contexts. Journal on Primary Prevention, 34, 147-161. ↩︎

  75. Kemp, L. (2016). Adaptation and fidelity: A recipe analogy for achieving both in population scale implementation. Prevention Science, 17, 429-438. ↩︎

  76. Moore, J. E., Bumbarger, B. K., & Rhoades Cooper, B. (2013). Examining adaptations of evidence-based programs in natural contexts. Journal on Primary Prevention, 34, 147-161. ↩︎

  77. Moore, J. E., Bumbarger, B. K., & Rhoades Cooper, B. (2013). Examining adaptations of evidence-based programs in natural contexts. Journal on Primary Prevention, 34, 147-161.; Kemp, L. (2016). Adaptation and fidelity: A recipe analogy for achieving both in population scale implementation. Prevention Science, 17, 429-438. ↩︎

  78. Moore, J. E., Bumbarger, B. K., & Rhoades Cooper, B. (2013). Examining adaptations of evidence-based programs in natural contexts. Journal on Primary Prevention, 34, 147-161. ↩︎

  79. Moore, J. E., Bumbarger, B. K., & Rhoades Cooper, B. (2013). Examining adaptations of evidence-based programs in natural contexts. Journal on Primary Prevention, 34, 147-161. ↩︎

  80. Moore, J. E., Bumbarger, B. K., & Rhoades Cooper, B. (2013). Examining adaptations of evidence-based programs in natural contexts. Journal on Primary Prevention, 34, 147-161. ↩︎

  81. Dusenbury, L., Brannigan, R., Hansen, W. B., Walsh, J., & Falco, M. (2005). Quality of implementation: Developing measures crucial to understanding the diffusion of preventive interventions. Health Education Research, 20, 308-313.; Dusenbury, L., Brannigan, R., Falco, M., & Hansen, W. B. (2003). A review of research on fidelity of implementation: Implications for drug abuse prevention in school settings. Health Education Research, 18, 237-256. ↩︎

  82. Cooper, B. R., Shrestha, G., Hyman, L., & Hill, L. (2016). Adaptations in a community-based family intervention: Replication of two coding schemes. The Journal of Primary Prevention, 37(1), 33-52.; Supplee, L., & Metz, A. (2015). Opportunities and challenges in evidence-based social policy. Social Policy Report, 28 (4). ↩︎

  83. Cooper, B. R., Shrestha, G., Hyman, L., & Hill, L. (2016). Adaptations in a community-based family intervention: Replication of two coding schemes. The Journal of Primary Prevention, 37(1), 33-52.; Supplee, L., & Metz, A. (2015). Opportunities and challenges in evidence-based social policy. Social Policy Report, 28 (4). ↩︎

  84. Kemp, L. (2016). Adaptation and fidelity: A recipe analogy for achieving both in population scale implementation. Prevention Science, 17, 429-438. ↩︎

  85. Backer, T. (2002). Finding the balance: Program fidelity and adaptation in substance abuse prevention: A state-of-the-art review. Washington, DC: Center for Substance Abuse Prevention, Substance Abuse and Mental Health Services Administration, U.S. Department of Health and Human Services.; Henderson, C. E., Taxman, F. S., & Young, D. W. (2007). A Rasch model analysis of evidence-based treatment practices used in the criminal justice system. Drug and Alcohol Dependence, 93, 163-175.; Liddle, H. A., Rowe, C. L., Quille, T. J., Dakof, G. A., Mills, D. S., Sakran, E., & Biaggi, H. (2002). Transporting a research-based adolescent drug treatment into practice. Journal of Substance Abuse Treatment, 22(4), 231-243.; Schoenwald, S. K., & Hoagwood, K. (2001). Effectiveness, transportability, and dissemination of interventions: What matters when? Psychiatric Services, 52(9) 1190-1197. ↩︎