Dimensions of Treatment Integrity Overview
Historically, treatment integrity has been defined as the implementation of an intervention as planned (Gresham, 1989). More recently, treatment integrity has been reconceptualized as multidimensional (Dane & Schneider, 1998). Within this conceptualization, four dimensions are especially relevant to practice: (a) exposure (dosage), (b) adherence, (c) quality of delivery, and (d) student responsiveness.
Exposure (dosage) refers to the amount (frequency and duration) of an intervention that a student receives. For example, a student’s supplemental reading support may call for the intervention to occur three times per week for 30 minutes per session. If delivery is less frequent or sessions are shorter, then the student’s exposure to the intervention is less than optimal and the outcome may be compromised.
Adherence is the most commonly measured dimension of treatment integrity (Sanetti, Chafouleas, Christ, & Gritter, 2009). It is the extent to which those responsible for implementing an intervention are doing so as prescribed. Most interventions are multicomponent packages, and some are quite complex. Adherence is usually measured by recording whether each specific feature of an intervention occurred as planned.
Quality of delivery is the degree to which the implementation is executed with enthusiasm and sincerity. This dimension is underrepresented in the scholarly literature, primarily because of its subjective nature; however, it is an important facet of treatment integrity and warrants more research. Consider the following: Many interventions for challenging behavior include praising students when they are behaving appropriately. Some teachers are effusive with their praise and vary it in many ways so that it does not become rote. Others may praise in a monotone, rote manner. These differences in how praise is delivered are likely to influence the impact of the intervention, even if both individuals are praising with 100% adherence to the intervention protocol.
Student responsiveness is the degree to which the student is engaged during the intervention. This dimension is somewhat controversial. Some argue that it should not be part of treatment integrity measures because it reflects student behavior, whereas measures of treatment integrity should reflect what adult educators are doing. The counterargument is that even with high integrity for exposure, adherence, and quality of delivery, a student’s lack of engagement may still undermine the intervention’s effects. For example, a student receiving an intervention to improve fluency in basic math may participate minimally in instruction even though the intervention is implemented with high integrity across all other dimensions. This poor participation may be a function of placing the student in the instructional program at his or her failure level. Conversely, a student placed in the instructional program at his or her mastery level may not be engaged because the instruction is boring. Student responsiveness to an intervention can therefore be an important indicator of the appropriateness of the instructional program.
Each of these dimensions can influence the impact of an intervention, but it is also important to be mindful of the interactions among them. Consider the previously mentioned intervention protocol that calls for a student to receive supplemental reading support three times a week for 30 minutes each session. Both of these parameters are part of the exposure dimension. If the student receives only one 30-minute session per week, then he or she is exposed to the intervention for a third of the prescribed time. Similarly, the student could receive the reading intervention three times a week but for only 10 minutes each session; the student is still exposed to the intervention for a third of the prescribed time. To complicate matters, even if the instructor implements the intervention perfectly during each session (adherence), the outcome is likely to be degraded because exposure was limited. This example highlights the importance of measuring all of the dimensions of treatment integrity and not just adherence. If only adherence is assessed and the outcome is less than desired, it might be concluded that the intervention was ineffective even though adherence was high, when in fact limited exposure was responsible. Failure to consider all dimensions of treatment integrity can result in errors in decision making.
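To make the arithmetic concrete, the following is a minimal illustrative sketch (the schedule constants, function name, and values are assumptions drawn from the example above, not part of any cited protocol) showing how delivered exposure can be expressed as a fraction of the prescribed dosage:

```python
# Illustrative sketch only: comparing delivered exposure with prescribed
# exposure for the hypothetical supplemental reading intervention above.

PRESCRIBED_SESSIONS_PER_WEEK = 3
PRESCRIBED_MINUTES_PER_SESSION = 30

def exposure_ratio(sessions_delivered: int, minutes_per_session: int) -> float:
    """Fraction of the prescribed weekly dosage the student actually received."""
    prescribed = PRESCRIBED_SESSIONS_PER_WEEK * PRESCRIBED_MINUTES_PER_SESSION  # 90 minutes
    delivered = sessions_delivered * minutes_per_session
    return delivered / prescribed

# One 30-minute session instead of three: one third of the prescribed dosage.
print(exposure_ratio(1, 30))  # 0.333...

# Three 10-minute sessions: also one third of the prescribed dosage.
print(exposure_ratio(3, 10))  # 0.333...

# Even with 100% adherence within each session, limited exposure can degrade
# outcomes, which is why both dimensions need to be monitored.
```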
References
Dane, A. V., & Schneider, B. H. (1998). Program integrity in primary and early secondary prevention: Are implementation effects out of control? Clinical Psychology Review, 18(1), 23–45.
Gresham, F. M. (1989). Assessment of treatment integrity in school consultation and prereferral intervention. School Psychology Review, 18(1), 37–50.
Sanetti, L. M. H., Chafouleas, S. M., Christ, T. J., & Gritter, K. L. (2009). Extending use of direct behavior rating beyond student assessment: Applications to treatment integrity assessment within a multitiered model of school-based intervention delivery. Assessment for Effective Intervention, 34(4), 251–258.
Publications
To produce better outcomes for students, two things are necessary: (1) effective, scientifically supported interventions and (2) implementation of those interventions with high integrity. Typically, much greater attention has been given to identifying effective practices. This review focuses on features of high-quality implementation.
Detrich, R. (2014). Treatment integrity: Fundamental to education reform. Journal of Cognitive Education and Psychology, 13(2), 258-271.
Schools are often expected to implement innovative instructional programs. These initiatives most often fail because what is known from implementation science is not considered as part of implementing the initiative. This chapter reviews the contributions implementation science can make to improving outcomes for students.
Detrich, R. Innovation, Implementation Science, and Data-Based Decision Making: Components of Successful Reform. Handbook on Innovations in Learning, 31.
Reform efforts tend to come and go very quickly in education. This paper makes the argument that the sustainability of programs is closely related to how well those programs are implemented.
Detrich, R., Keyworth, R. & States, J. (2010). Treatment Integrity: A Fundamental Unit of Sustainable Educational Programs. Journal of Evidence-Based Practices for Schools, 11(1), 4-29.
Presentations
This book is compiled from the proceedings of the sixth summit entitled “Performance Feedback: Using Data to Improve Educator Performance.” The 2011 summit topic was selected to help answer the following question: What basic practice has the potential for the greatest impact on changing the behavior of students, teachers, and school administrative personnel?
States, J., Keyworth, R. & Detrich, R. (2013). Introduction: Proceedings from the Wing Institute’s Sixth Annual Summit on Evidence-Based Education: Performance Feedback: Using Data to Improve Educator Performance. In Education at the Crossroads: The State of Teacher Preparation (Vol. 3, pp. ix-xii). Oakland, CA: The Wing Institute.
Data Mining
One of the primary goals of implementation science is to ensure that programs are implemented with integrity. This paper presents an integrated model of implementation that emphasizes treatment integrity.
Berkel, C., Mauricio, A. M., Schoenfelder, E., Sandler, I. N., & Collier-Meek, M. (2011). Putting the pieces together: An integrated model of program implementation. Prevention Science, 12, 23–33.
Previous research indicates that, in situations in which extinction cannot be implemented, manipulating dimensions of reinforcement during differential reinforcement of alternative behavior (DRA) is a potential approach for treating destructive behavior.
Briggs, A. M., Dozier, C. L., Lessor, A. N., Kamana, B. U., & Jess, R. L. (2019). Further investigation of differential reinforcement of alternative behavior without extinction for escape-maintained destructive behavior. Journal of Applied Behavior Analysis, 52(4), 956–973.
Dane and Schneider propose treatment integrity as a multidimensional construct and describe five dimensions that constitute it.
Dane, A. V., & Schneider, B. H. (1998). Program integrity in primary and early secondary prevention: Are implementation effects out of control? Clinical Psychology Review, 18(1), 23–45.
To produce better outcomes for students, two things are necessary: (1) effective, scientifically supported interventions and (2) implementation of those interventions with high integrity. Typically, much greater attention has been given to identifying effective practices. This review focuses on features of high-quality implementation.
Detrich, R. (2014). Treatment integrity: Fundamental to education reform. Journal of Cognitive Education and Psychology, 13(2), 258-271.
Reform efforts tend to come and go very quickly in education. This paper makes the argument that the sustainability of programs is closely related to how well those programs are implemented.
Detrich, R., Keyworth, R. & States, J. (2010). Treatment Integrity: A Fundamental Unit of Sustainable Educational Programs. Journal of Evidence-Based Practices for Schools, 11(1), 4-29.
This article introduces implicit bias, an aspect of the new science of unconscious mental processes that has substantial bearing on discrimination law.
Greenwald, A. G., & Krieger, L. H. (2006). Implicit bias: Scientific foundations. California Law Review, 94(4), 945-967.
This study evaluated how estimates of treatment integrity differ depending on which dimensions of it are measured.
Hagermoser Sanetti, L. M., & Fallon, L. M. (2011). Treatment Integrity Assessment: How Estimates of Adherence, Quality, and Exposure Influence Interpretation of Implementation. Journal of Educational & Psychological Consultation, 21(3), 209-232.
This paper suggests a model for selecting interventions that match the context of classrooms.
Harn, B., Parisi, D., & Stoolmiller, M. (2013). Balancing fidelity with flexibility and fit: What do we really know about fidelity of implementation in schools? Exceptional Children, 79(2), 181–193.
This paper reviews perspectives on treatment integrity from psychotherapy, substance abuse treatment, prevention science, and school psychology.
Schulte, A. C., Easton, J. E., & Parker, J. (2009). Advances in Treatment Integrity Research: Multidisciplinary Perspectives on the Conceptualization, Measurement, and Enhancement of Treatment Integrity. School Psychology Review, 38(4), 460-475.
The authors argue that measuring treatment integrity improves the quality of decisions about the effectiveness of treatments. Measuring the different dimensions of treatment integrity allows researchers and practitioners to understand the relationships among those dimensions.
Yeaton, W. H., & Sechrest, L. (1981). Critical Dimensions in the Choice and Maintenance of Successful Treatments: Strength, Integrity, and Effectiveness. Journal of Consulting & Clinical Psychology, 49(2), 156-167.