Treatment integrity is a core component of data-based decision making (Detrich, 2013). The usual approach is to consider student data when making decisions about an intervention; however, if there are no data about how well the intervention was implemented, then meaningful judgments cannot be made about effectiveness.
This book is compiled from the proceedings of the sixth summit entitled “Performance Feedback: Using Data to Improve Educator Performance.” The 2011 summit topic was selected to help answer the following question: What basic practice has the potential for the greatest impact on changing the behavior of students, teachers, and school administrative personnel?
States, J., Keyworth, R. & Detrich, R. (2013). Introduction: Proceedings from the Wing Institute’s Sixth Annual Summit on Evidence-Based Education: Performance Feedback: Using Data to Improve Educator Performance. In Education at the Crossroads: The State of Teacher Preparation (Vol. 3, pp. ix-xii). Oakland, CA: The Wing Institute.
This study evaluated the effects of performance feedback to increase the implementation of skills taught during in-service training.
Auld, R. G., Belfiore, P. J., & Scheeler, M. C. (2010). Increasing Pre-service Teachers’ Use of Differential Reinforcement: Effects of Performance Feedback on Consequences for Student Behavior. Journal of Behavioral Education, 19(2), 169-183.
Significant dollars are spent each school year on professional development programs to improve teachers’ effectiveness. This study assessed the integrity with which pre-service teachers used a differential reinforcement of alternate behavior (DRA) strategy taught to them during their student teaching experience.
Auld, R. G., Belfiore, P. J., & Scheeler, M. C. (2010). Increasing pre-service teachers’ use of differential reinforcement: Effects of performance feedback on consequences for student behavior. Journal of Behavioral Education, 19(2), 169-183.
This study evaluates the effects of performance feedback as part of professional development across three studies.
Barton, E. E., Pribble, L., & Chen, C.-I. (2013). The Use of E-Mail to Deliver Performance-Based Feedback to Early Childhood Practitioners. Journal of Early Intervention, 35(3), 270-297.
One of the primary goals of implementation science is to ensure that programs are implemented with integrity. This paper presents an integrated model of implementation that emphasizes treatment integrity.
Berkel, C., Mauricio, A. M., Schoenfelder, E., Sandler, I. N., & Collier-Meek, M. (2011). Putting the pieces together: An Integrated Model of program implementation. Prevention Science, 12, 23-33.
We reviewed parametric analyses of treatment integrity levels published in 10 behavior analytic journals through 2017. We discuss the general findings from the identified literature as well as directions for future research.
Brand, D., Henley, A. J., Reed, F. D. D., Gray, E., & Crabbs, B. (2019). A review of published studies involving parametric manipulations of treatment integrity. Journal of Behavioral Education, 28(1), 1-26.
Treatment fidelity data (descriptive and statistical) are critical to interpreting and generalizing outcomes of intervention research. Despite recommendations for treatment fidelity reporting from funding agencies and researchers, past syntheses have found treatment fidelity is frequently unreported in educational interventions and fidelity data are seldom used to analyze its relation to student outcomes.
Capin, P., Walker, M. A., Vaughn, S., & Wanzek, J. (2018). Examining how treatment fidelity is supported, measured, and reported in K–3 reading intervention research. Educational Psychology Review, 30(3), 885-919.
Incidental teaching is often a component of early childhood intervention programs. This study evaluated the use of graphical feedback to increase the use of incidental teaching.
Casey, A. M., & McWilliam, R. A. (2008). Graphical Feedback to Increase Teachers’ Use of Incidental Teaching. Journal of Early Intervention, 30(3), 251-268.
One of the challenges for increasing treatment integrity is finding effective methods for doing so. This study evaluated the use of checklist-based training to increase treatment integrity.
Casey, A. M., & McWilliam, R. A. (2011). The impact of checklist-based training on teachers’ use of the zone defense schedule. Journal of Applied Behavior Analysis, 44(2), 397-401.
This study evaluated the impact of performance feedback on how well problem-solving teams implemented a structured decision-making protocol. Teams performed better when feedback was provided.
Codding, R. S., & Smyth, C. A. (2008). Using Performance Feedback To Decrease Classroom Transition Time And Examine Collateral Effects On Academic Engagement. Journal of Educational & Psychological Consultation, 18(4), 325-345.
This study investigated the effects of performance feedback to increase treatment integrity.
Codding, R. S., Feinberg, A. B., & Dunn, E. K. (2005). Effects of Immediate Performance Feedback on Implementation of Behavior Support Plans. Journal of Applied Behavior Analysis, 38(2), 205-219.
This study evaluated the effects of performance feedback in increasing treatment integrity. It also evaluated the possible reactivity effects of being observed.
Codding, R. S., Livanis, A., Pace, G. M., & Vaca, L. (2008). Using Performance Feedback to Improve Treatment Integrity of Classwide Behavior Plans: An Investigation of Observer Reactivity. Journal of Applied Behavior Analysis, 41(3), 417-422.
Reviews of treatment outcome literature indicate treatment integrity is not regularly assessed. In consultation, two levels of treatment integrity (i.e., consultant procedural integrity [CPI] and intervention treatment integrity [ITI]) provide relevant implementation data.
Collier-Meek, M. A., & Sanetti, L. M. (2014). Assessment of consultation and intervention implementation: A review of conjoint behavioral consultation studies. Journal of Educational and Psychological Consultation, 24(1), 55-73.
Collecting treatment integrity data is critical for (a) strengthening internal validity within a research study, (b) determining the impact of an intervention on student outcomes, and (c) assessing the need for implementation supports. Although researchers have noted the increased inclusion of treatment integrity data in published articles, there has been limited attention to how treatment integrity is assessed.
Collier-Meek, M. A., Fallon, L. M., & Gould, K. (2018). How are treatment integrity data assessed? Reviewing the performance feedback literature. School Psychology Quarterly, 33(4), 517.
This study examines obstacles encountered by 33 educators along with suggested interventions to overcome impediments to effective delivery of classroom management interventions or behavior support plans. Having the right classroom management plan isn’t enough if you can’t deliver the strategies to the students in the classroom.
Collier‐Meek, M. A., Sanetti, L. M., & Boyle, A. M. (2019). Barriers to implementing classroom management and behavior support plans: An exploratory investigation. Psychology in the Schools, 56(1), 5-17.
The study compared basic and elaborated corrections within the context of otherwise identical computer-assisted instruction (CAI) programs that taught reasoning skills. Twelve learning disabled and 16 remedial high school students were randomly assigned to either the basic-corrections or elaborated-corrections treatment. Criterion-referenced test scores were significantly higher for the elaborated-corrections treatment on both the post and maintenance tests and on the transfer test. Time to complete the program did not differ significantly for the two groups.
Collins, M., Carnine, D., & Gersten, R. (1987). Elaborated corrective feedback and the acquisition of reasoning skills: A study of computer-assisted instruction. Exceptional Children, 54(3), 254-262.
This study evaluated the effects of video modeling on staff implementation of a problem solving intervention.
Collins, S., Higbee, T. S., & Salzberg, C. L. (2009). The effects of video modeling on staff implementation of a problem-solving intervention with adults with developmental disabilities. Journal of Applied Behavior Analysis, 42(4), 849-854.
Dane and Schneider propose treatment integrity as a multi-dimensional construct and describe five dimensions that constitute the construct.
Dane, A. V., & Schneider, B. H. (1998). Program integrity in primary and early secondary prevention: Are implementation effects out of control? Clinical Psychology Review, 18(1), 23-45.
The authors examined the extent to which program integrity (i.e., the degree to which programs were implemented as planned) was verified and promoted in evaluations of primary and early secondary prevention programs published between 1980 and 1994.
Dane, A. V., & Schneider, B. H. (1998). Program integrity in primary and early secondary prevention: Are implementation effects out of control? Clinical Psychology Review, 18(1), 23-45.
This study evaluated the effects of allowing teachers to “test drive” interventions and then select the intervention they most preferred. The result was an increase in treatment integrity.
Dart, E. H., Cook, C. R., Collins, T. A., Gresham, F. M., & Chenier, J. S. (2012). Test Driving Interventions to Increase Treatment Integrity and Student Outcomes. School Psychology Review, 41(4), 467-481.
To prevent academic failure and promote long-term success, response-to-intervention (RtI) is designed to systematically increase the intensity of delivering research-based interventions. Interventions within an RtI framework must not only be effective but also be implemented with treatment fidelity and delivered with the appropriate level of treatment intensity to improve student mathematics achievement.
DeFouw, E. R., Codding, R. S., Collier-Meek, M. A., & Gould, K. M. (2019). Examining dimensions of treatment intensity and treatment fidelity in mathematics intervention research for students at risk. Remedial and Special Education, 40(5), 298-312.
The impact of an intervention is influenced by how well it fits into the context of a classroom. This paper suggests a number of variables to consider and how they might be measured prior to the development of an intervention.
Detrich, R. (1999). Increasing treatment fidelity by matching interventions to contextual variables within the educational setting. School Psychology Review, 28(4), 608-620.
To produce better outcomes for students, two things are necessary: (1) effective, scientifically supported interventions and (2) those interventions implemented with high integrity. Typically, much greater attention has been given to identifying effective practices. This review focuses on features of high quality implementation.
Detrich, R. (2014). Treatment integrity: Fundamental to education reform. Journal of Cognitive Education and Psychology, 13(2), 258-271.
Presentation by the Wing Institute with four goals: make the case that treatment integrity monitoring is a necessary part of service delivery; describe the dimensions of treatment integrity; suggest methods for increasing treatment integrity; and place treatment integrity within a systems framework.
Detrich, R. (2015). Treatment integrity: A wicked problem and some solutions. Missouri Association for Behavior Analysis 2015 Conference. http://winginstitute.org/2015-MissouriABA-Presentation-Ronnie-Detrich
Over the last fifty years, there have been many educational reform efforts, most of which have had a relatively short lifespan and failed to produce the promised results. One possible reason is that, for the most part, these innovations have been poorly implemented. In this chapter, the author proposes a data-based decision making approach to assuring high quality implementation.
Detrich, R. Innovation, Implementation Science, and Data-Based Decision Making: Components of Successful Reform. In M. Murphy, S. Redding, and J. Twyman (Eds.), Handbook on Innovations in Learning, 31. Charlotte, NC: Information Age Publishing.
Reform efforts tend to come and go very quickly in education. This paper makes the argument that the sustainability of programs is closely related to how well those programs are implemented.
Detrich, R., Keyworth, R. & States, J. (2010). Treatment Integrity: A Fundamental Unit of Sustainable Educational Programs. Journal of Evidence-Based Practices for Schools, 11(1), 4-29.
In this conceptualization of treatment integrity, there are four dimensions relevant to practice: (a) exposure (dosage), (b) adherence, (c) quality of delivery, and (d) student responsiveness. It is important to understand that these dimensions do not stand alone but rather interact to impact the ultimate effectiveness of an intervention.
Detrich, R., States, J., & Keyworth, R. (2017). Dimensions of Treatment Integrity Overview. Oakland, CA: The Wing Institute.
The usual approach to determining if an intervention is effective for a student is to review student outcome data; however, this is only part of the task. Student data can only be understood if we know something about how well the intervention was implemented. Student data without treatment integrity data are largely meaningless because without knowing how well an intervention has been implemented, no judgments can be made about the effectiveness of the intervention. Poor outcomes can be a function of an ineffective intervention or poor implementation of the intervention. Without treatment integrity data, there is a risk that an intervention will be judged as ineffective when, in fact, the quality of implementation was so inadequate that it would be unreasonable to expect positive outcomes.
Detrich, R., States, J., & Keyworth, R. (2017). Treatment Integrity in the Problem Solving Process. Oakland, CA: The Wing Institute.
For the best chance of producing positive educational outcomes for all children, two conditions must be met: (a) adopting effective empirically supported (evidence-based) practices and (b) implementing those practices with sufficient quality that they make a difference (treatment integrity).
Detrich, R., States, J., & Keyworth, R. (2017). Overview of Treatment Integrity. Oakland, CA: The Wing Institute.
This study compared goal setting and feedback about student performance with a package of daily written feedback about student performance, feedback about accuracy of implementation, and cancelling meetings if an integrity criterion was met.
DiGennaro, F. D., Martens, B. K., & Kleinmann, A. E. (2007). A comparison of performance feedback procedures on teachers' treatment implementation integrity and students' inappropriate behavior in special education classrooms. Journal of Applied Behavior Analysis, 40(3), 447-461.
This study evaluated the impact of allowing teachers to miss coaching meetings if their treatment integrity scores met or exceeded criterion.
DiGennaro, F. D., Martens, B. K., & McIntyre, L. L. (2005). Increasing Treatment Integrity Through Negative Reinforcement: Effects on Teacher and Student Behavior. School Psychology Review, 34(2), 220-231.
This study evaluated the effects of video modeling on how well teachers implemented interventions. There was an increase in integrity, but it remained variable. More stable patterns of implementation were observed when teachers were given feedback about their performance.
Digennaro-Reed, F. D., Codding, R., Catania, C. N., & Maguire, H. (2010). Effects of video modeling on treatment integrity of behavioral interventions. Journal of Applied Behavior Analysis, 43(2), 291-295.
This study evaluated the impact of public feedback in RtI team meetings on the quality of implementation. Feedback improved poor implementation and maintained high levels of implementation.
Duhon, G. J., Mesmer, E. M., Gregerson, L., & Witt, J. C. (2009). Effects of public feedback during RTI team meetings on teacher implementation integrity and student academic performance. Journal of School Psychology, 47(1), 19-37.
The first purpose of this review is to assess the impact of implementation on program outcomes, and the second purpose is to identify factors affecting the implementation process.
Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41(3-4), 327-350.
This paper summarizes survey results about the acceptability of different methods for monitoring treatment integrity and performance feedback.
Easton, J. E., & Erchul, W. P. (2011). An Exploration of Teacher Acceptability of Treatment Plan Implementation: Monitoring and Feedback Methods. Journal of Educational & Psychological Consultation, 21(1), 56-77. Retrieved from http://www.tandfonline.com/doi/abs/10.1080/10474412.2011.544949?journalCode=hepc20.
Presents 4 case studies demonstrating an innovative approach for studying and promoting treatment integrity in a manner acceptable to consultees and related to treatment success.
Ehrhardt, K. E., Barnett, D. W., Lentz Jr, F. E., Stollar, S. A., & Reifin, L. H. (1996). Innovative methodology in ecological consultation: Use of scripts to promote treatment acceptability and integrity. School Psychology Quarterly, 11(2), 149.
A growing number of evidence-based psychotherapies hold the promise of substantial benefits for children, families, and society. For the benefits of evidence-based programs to be realized on a scale sufficient to be useful to individuals and society, evidence-based psychotherapies need to be put into practice outside of controlled clinical trials.
Fixsen, D. L., Blase, K. A., Duda, M. A., Naoom, S. F., & Van Dyke, M. (2010). Implementation of evidence-based treatments for children and adolescents: Research findings and their implications for the future.
This paper discusses common elements of successfully sustaining effective practices across a variety of disciplines.
Fixsen, D. L., Blase, K. A., Duda, M., Naoom, S. F., & Van Dyke, M. (2010). Sustainability of evidence-based programs in education. Journal of Evidence-Based Practices for Schools, 11(1), 30-46.
This is a comprehensive literature review of the topic of Implementation examining all stages beginning with adoption and ending with sustainability.
Fixsen, D. L., Naoom, S. F., Blase, K. A., & Friedman, R. M. (2005). Implementation research: A synthesis of the literature.
Quality indicators of prereferral interventions (i.e., behavioral definition, direct measure, step-by-step plan, treatment integrity, graphing of results, and direct comparison to baseline) were investigated as predictors of prereferral intervention outcomes with a sample of regular education teachers and related services personnel on the same 312 students.
Flugum, K. R., & Reschly, D. J. (1994). Prereferral interventions: Quality indices and outcomes. Journal of School Psychology, 32(1), 1-14.
This paper describes the scaling up and dissemination of a parent training program in Norway while maintaining fidelity of implementation.
Forgatch, M. S., & DeGarmo, D. S. (2011). Sustaining fidelity following the nationwide PMTO implementation in Norway. Prevention Science, 12(3), 235-246. Retrieved from http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3153633
Garbacz, L., Brown, D., Spee, G., Polo, A., & Budd, K. (2014). Establishing Treatment Fidelity in Evidence-Based Parent Training Programs for Externalizing Disorders in Children and Adolescents. Clinical Child & Family Psychology Review, 17(3).
To help states and districts make informed decisions about the PD they implement to improve reading instruction, the U.S. Department of Education commissioned the Early Reading PD Interventions Study to examine the impact of two research-based PD interventions for reading instruction: (1) a content-focused teacher institute series that began in the summer and continued through much of the school year (treatment A) and (2) the same institute series plus in-school coaching (treatment B).
Garet, M. S., Cronen, S., Eaton, M., Kurki, A., Ludwig, M., Jones, W., ... Zhu, P. (2008). The impact of two professional development interventions on early reading instruction and achievement. NCEE 2008-4030. Washington, DC: National Center for Education Evaluation and Regional Assistance.
This study examined general education teachers’ implementation of a peer tutoring intervention for five elementary students referred for consultation and intervention due to academic concerns. Treatment integrity was assessed via permanent products produced by the intervention.
Gilbertson, D., Witt, J. C., Singletary, L. L., & VanDerHeyden, A. (2007). Supporting teacher use of interventions: Effects of response dependent performance feedback on teacher implementation of a math intervention. Journal of Behavioral Education, 16(4), 311-326.
Organizations house many individuals, many of whom are responsible for implementing the same practice. If organizations are to meet their goals, it is important that they have systems for assuring high levels of treatment integrity.
Gottfredson, D. C. (1993). Strategies for Improving Treatment Integrity in Organizational Consultation. Journal of Educational & Psychological Consultation, 4(3), 275.
Many students with autism spectrum disorder (ASD) receive behavioral interventions to improve academic and prosocial functioning and remediate current skill deficits. Sufficient treatment integrity is necessary for these interventions to be successful.
Gould, K. M., Collier-Meek, M., DeFouw, E. R., Silva, M., & Kleinert, W. (2019). A systematic review of treatment integrity assessment from 2004 to 2014: Examining behavioral interventions for students with autism spectrum disorder. Contemporary School Psychology, 23(3), 220-230.
Technical issues (specification of treatment components, deviations from treatment protocols and amount of behavior change, and psychometric issues in assessing treatment integrity) involved in the measurement of treatment integrity are discussed.
Gresham, F. M. (1989). Assessment of treatment integrity in school consultation and prereferral intervention. School Psychology Review, 18(1), 37-50.
The concept of treatment fidelity (integrity) is important across a diversity of fields that are involved with providing treatments or interventions to individuals. Despite variations in terminology across these diverse fields, the concern that treatments or interventions are delivered as prescribed or intended is of paramount importance to document that changes in individuals’ functioning (medical, nutritional, psychological, or behavioral) are due to treatments and not from uncontrolled, extraneous variables.
Gresham, F. M. (2016). Features of fidelity in schools and classrooms: Constructs and measurement. In Treatment fidelity in studies of educational intervention (pp. 30-46). Routledge.
The concept of treatment integrity is an essential component to data-based decision making within a response-to-intervention model. Although treatment integrity is a topic receiving increased attention in the school-based intervention literature, relatively few studies have been conducted regarding the technical adequacy of treatment integrity assessment methods.
Gresham, F. M., Dart, E. H., & Collins, T. A. (2017). Generalizability of Multiple Measures of Treatment Integrity: Comparisons Among Direct Observation, Permanent Products, and Self-Report. School Psychology Review, 46(1), 108-121.
This study reviewed all intervention studies published between 1980 and 1990 in the Journal of Applied Behavior Analysis in which children were the subjects. The authors found that treatment integrity was reported in only 16% of the studies.
Gresham, F. M., Gansle, K. A., & Noell, G. H. (1993). Treatment integrity in applied behavior analysis with children. Journal of Applied Behavior Analysis, 26(2), 257-263.
The authors reviewed three learning disabilities journals from 1995 to 1999 to determine what percentage of the intervention studies reported measures of treatment integrity. Only 18.5% reported treatment integrity measures.
Gresham, F. M., MacMillan, D. L., Beebe-Frankenberger, M. E., & Bocian, K. M. (2000). Treatment integrity in learning disabilities intervention research: Do we really know how treatments are implemented? Learning Disabilities Research & Practice, 15(4), 198-205.
This review examines the treatment integrity data of literacy interventions for students with emotional and/or behavioral disorders (EBD). Findings indicate that studies focusing on literacy interventions for students with EBD included clear operational definitions and data on treatment integrity to a higher degree than have been found in other disciplines.
Griffith, A. K., Duppong Hurley, K., & Hagaman, J. L. (2009). Treatment integrity of literacy interventions for students with emotional and/or behavioral disorders: A review of literature. Remedial and Special Education, 30(4), 245-255.
This study evaluated differences in estimates of treatment integrity obtained by measuring different dimensions of it.
Hagermoser Sanetti, L. M., & Fallon, L. M. (2011). Treatment Integrity Assessment: How Estimates of Adherence, Quality, and Exposure Influence Interpretation of Implementation. Journal of Educational & Psychological Consultation, 21(3), 209-232.
The paper describes the Implementation Planning protocol as an approach for increasing treatment integrity.
Hagermoser Sanetti, L. M., Collier-Meek, M. A., Long, A. C. J., Byron, J., & Kratochwill, T. R. (2015). Increasing teacher treatment integrity of behavior support plans through consultation and Implementation Planning. Journal of School Psychology, 53(3).
This study evaluated the relative benefits of verbal feedback and verbal plus graphic feedback as a means for increasing treatment integrity. The verbal plus graphic feedback was more effective than verbal feedback alone.
Hagermoser Sanetti, L. M., Luiselli, J. K., & Handler, M. W. (2007). Effects of Verbal and Graphic Performance Feedback on Behavior Support Plan Implementation in a Public Elementary School. Behavior Modification, 31(4), 454-465. https://doi.org/10.1177/0145445506297583
Determining whether or not the treatment or innovation under study is actually in use, and if so, how it is being used, is essential to the interpretation of any study. The concept of Levels of Use of the Innovation (LoU) permits an operational, cost-feasible description and documentation of whether or not an innovation or treatment is being implemented.
Hall, G. E., & Loucks, S. F. (1977). A developmental model for determining whether the treatment is actually implemented. American Educational Research Journal, 14(3), 263-276.
This study examines adoption and implementation of the U.S. Department of Education's new policy, the "Principles of Effectiveness," from a diffusion of innovations theoretical framework. In this report, we evaluate adoption in relation to Principle 3: the requirement to select research-based programs.
Hallfors, D., & Godette, D. (2002). Will the “principles of effectiveness” improve prevention practice? Early findings from a diffusion study. Health Education Research, 17(4), 461–470.
For decades, schools have taught children the strategies of struggling readers, using a theory about reading that cognitive scientists have repeatedly debunked. And many teachers and parents don't know there's anything wrong with it.
Hanford, E. (2019). At a loss for words: How a flawed idea is teaching millions of kids to be poor readers. APM Reports. https://www.apmreports.org/story/2019/08/22/whats-wrong-how-schools-teach-reading
This paper suggests a model for selecting interventions that match the context of classrooms.
Harn, B., Parisi, D., & Stoolmiller, M. (2013). Balancing fidelity with flexibility and Fit: What do we really know about fidelity of implementation in schools?. Exceptional Children, 79(2), 181-193.
Validated a measure of clinical supervision practices, further validated a measure of therapist adherence, and examined the association between supervisory practices and therapist adherence to an evidence-based treatment model (i.e., multisystemic therapy [MST]) in real-world clinical settings.
Henggeler, S. W., Schoenwald, S. K., Liao, J. G., Letourneau, E. J., & Edwards, D. L. (2002). Transporting efficacious treatments to field settings: The link between supervisory practices and therapist fidelity in MST programs. Journal of Clinical Child and Adolescent Psychology, 31(2), 155-167.
This article discusses the current focus on using teacher observation instruments as part of new teacher evaluation systems being considered and implemented by states and districts.
Hill, H., & Grossman, P. (2013). Learning from teacher observations: Challenges and opportunities posed by new teacher evaluation systems. Harvard Educational Review, 83(2), 371-384.
Used a direct observation-based approach to identify behavioral conditions in sending (i.e., special education) and in receiving (i.e., regular education) classrooms and to identify targets for intervention that might facilitate mainstreaming of behavior-disordered (BD) children.
Hoier, T. S., McConnell, S., & Pallay, A. G. (1987). Observational assessment for planning and evaluating educational transitions: An initial analysis of template matching. Behavioral Assessment.
This document presents a set of criteria to be used in evaluating treatment guidelines that have been promulgated by health care organizations, government agencies, professional associations, or other entities. The purpose of treatment guidelines is to educate health care professionals and health care systems about the most effective treatments available.
Hollon, D., Miller, I. J., & Robinson, E. (2002). Criteria for evaluating treatment guidelines. American Psychologist, 57(12), 1052-1059.
The purposes of this manuscript are to propose core features that may apply to any practice or set of practices that proposes to be evidence-based in relation to School-wide Positive Behavior Support (SWPBS).
Horner, R. H., Sugai, G., & Anderson, C. M. (2010). Examining the evidence base for school-wide positive behavior support. Focus on Exceptional Children, 42(8), 1.
“Contextual fit” is based on the premise that the match between an intervention and local context affects both the quality of intervention implementation and whether the intervention actually produces the desired outcomes for children and families.
Horner, R., Blitz, C., & Ross, S. (2014). The importance of contextual fit when implementing evidence-based interventions. Washington, DC: U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation. https://aspe.hhs.gov/system/files/pdf/77066/ib_Contextual.pdf
This piece describes the widely held perception among education leaders that we already know how to help teachers improve, and that we could achieve our goal of great teaching in far more classrooms if we just applied what we know more widely.
Jacob, A., & McGovern, K. (2015). The mirage: Confronting the hard truth about our quest for teacher development. Brooklyn, NY: TNTP. https://tntp.org/assets/documents/TNTP-Mirage_2015.pdf.
This study examined the effects of performance feedback on treatment integrity.
Jones, K. M., Wickstrom, K. F., & Friman, P. C. (1997). The effects of observational feedback on treatment integrity in school-based behavioral consultation. School Psychology Quarterly, 12(4).
The Good Behavior Game (GBG) is a well-documented group contingency designed to reduce disruptive behavior in classroom settings. However, few studies have evaluated the GBG with students who engage in severe problem behavior in alternative schools, and there are few demonstrations of training teachers in those settings to implement the GBG.
Joslyn, P. R., & Vollmer, T. R. (2020). Efficacy of teacher‐implemented Good Behavior Game despite low treatment integrity. Journal of Applied Behavior Analysis, 53(1), 465-474.
This book provides research as well as case studies of successful professional development strategies and practices for educators.
Joyce, B. R., & Showers, B. (2002). Student achievement through staff development. ASCD.
Considers design issues and strategies raised by comparative outcome studies, including the conceptualization, implementation, and evaluation of alternative treatments; assessment of treatment-specific processes and outcomes; and evaluation of the results. It is argued that addressing these and other issues may increase the yield from comparative outcome studies and may attenuate controversies regarding the adequacy of the demonstrations.
Kazdin, A. E. (1986). Comparative outcome studies of psychotherapy: Methodological issues and strategies. Journal of Consulting and Clinical Psychology, 54(1), 95.
The authors proposed a preliminary feedback intervention (FI) theory (FIT) and tested it with moderator analyses. The central assumption of FIT is that FIs change the locus of attention among 3 general and hierarchically organized levels of control: task learning, task motivation, and meta-task (including self-related) processes.
Kluger, A. N., & DeNisi, A. (1996). The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, 119(2), 254.
This book examines the use of video recording to improve teacher performance. The book shows how every classroom can easily benefit from setting up a camera and hitting “record”.
Knight, J. (2013). Focus on teaching: Using video for high-impact instruction. (Pages 8-14). Thousand Oaks, CA: Corwin.
Scientists have discredited claims that listening to classical music enhances intelligence, yet this so-called "Mozart Effect" has actually exploded in popularity over the years.
Krakovsky, M. (2005). Dubious “Mozart effect” remains music to many Americans’ ears. Stanford, CA: Stanford Report.
An old disagreement over how to teach children to read -- whole-language versus phonics -- has re-emerged in California, in a new form. Previously confined largely to education, the dispute is now a full-fledged political issue there, and is likely to become one in other states.
Lemann, N. (1997). The reading wars. The Atlantic Monthly, 280(5), 128–133.
This study evaluated the effects of an intensive training program for paraeducators responsible for implementing a group contingency intervention for classroom behavior.
Maggin, D. M., Fallon, L. M., Hagermoser Sanetti, L. M., & Ruberto, L. M. (2012). Training Paraeducators to Implement a Group Contingency Protocol: Direct and Collateral Effects. Behavioral Disorders, 38(1), 18-37.
McIntosh, K., Chard, D. J., Boland, J. B., & Horner, R. H. (2006). Demonstration of combined efforts in school-wide academic and behavioral systems and incidence of reading and behavior challenges in early elementary grades. Journal of Positive Behavior Interventions, 8(3), 146-154.
This paper examines school-based experimental studies with individuals aged 0 to 18 years published between 1991 and 2005. Only 30% of the studies provided treatment integrity data. Nearly half of the studies (45%) were judged to be at high risk for treatment inaccuracies.
McIntyre, L. L., Gresham, F. M., DiGennaro, F. D., & Reed, D. D. (2007). Treatment integrity of school‐based interventions with children in the Journal of Applied Behavior Analysis 1991–2005. Journal of Applied Behavior Analysis, 40(4), 659–672.
We should stop being so embarrassed by uncertainty and embrace it as a strength rather than a weakness of scientific reasoning.
McIntyre, L. (2019, May 22). How to reverse the assault on science. Scientific American. https://blogs.scientificamerican.com/observations/how-to-reverse-the-assault-on-science1/
This position paper contends that the whole language approach to reading instruction has been disproved by research and evaluation but still pervades textbooks for teachers, instructional materials for classroom use, some states' language-arts standards and other policy documents, teacher licensing requirements and preparation programs, and the professional context in which teachers work.
Moats, L. C. (2000). Whole language lives on: The illusion of “balanced” reading instruction. Washington, DC: DIANE Publishing.
Fidelity of treatment in outcome research refers to confirmation that the manipulation of the independent variable occurred as planned. Verification of fidelity is needed to ensure that fair, powerful, and valid comparisons of replicable treatments can be made.
Moncher, F. J., & Prinz, R. J. (1991). Treatment fidelity in outcome studies. Clinical psychology review, 11(3), 247-266.
This study evaluated the effects of performance feedback on the implementation of a classroom intervention.
Mortenson, B. P., & Witt, J. C. (1998). The use of weekly performance feedback to increase teacher implementation of a prereferral academic intervention. School Psychology Review, 613-627.
This study investigated treatment fidelity in social work research. The authors systematically reviewed all articles published in five prominent social work journals over a 5-year period.
Naleppa, M. J., & Cagle, J. G. (2010). Treatment fidelity in social work intervention research: A review of published studies. Research on Social Work Practice, 20(6), 674-681.
This study compared the effects of discussing issues of implementation challenges and performance feedback on increasing the integrity of implementation. Performance feedback was more effective than discussion in increasing integrity.
Noell, G. H., & Witt, J. C. (2000). Increasing intervention implementation in general education following consultation: A comparison of two follow-up strategies. Journal of Applied Behavior Analysis, 33(3), 271.
This study contributes to the database on the use of performance feedback to increase treatment integrity.
Noell, G. H., Duhon, G. J., Gatti, S. L., & Connell, J. E. (2002). Consultation, Follow-up, and Implementation of Behavior Management Interventions in General Education. School Psychology Review, 31(2), 217.
This study examined the impact of three levels of treatment integrity on students' responding on mathematics tasks.
Noell, G. H., Gresham, F. M., & Gansle, K. A. (2002). Does treatment integrity matter? A preliminary investigation of instructional implementation and mathematics performance. Journal of Behavioral Education, 11(1), 51-67.
This study evaluated the impact of training on treatment integrity. After finding that positive effects lasted 2-4 days, performance feedback was used to increase treatment integrity.
Noell, G. H., Witt, J. C., Gilbertson, D. N., Ranier, D. D., & Freeland, J. T. (1997). Increasing teacher intervention implementation in general education settings through consultation and performance feedback. School Psychology Quarterly, 12(1), 77.
This study evaluated three approaches to behavioral consultation and their impact on treatment integrity. Performance feedback was associated with superior treatment implementation and child behavioral outcomes.
Noell, G. H., Witt, J. C., Slider, N. J., Connell, J. E., Gatti, S. L., Williams, K. L., . . . Duhon, G. J. (2005). Treatment Implementation Following Behavioral Consultation in Schools: A Comparison of Three Follow-up Strategies. School Psychology Review, 34(1), 87-106.
Naomi Oreskes offers a bold and compelling defense of science, revealing why the social character of scientific knowledge is its greatest strength—and the greatest reason we can trust it.
Oreskes, N. (2019). Why trust science? Princeton, NJ: Princeton University Press.
This study evaluated the effects of a pyramidal training model to improve teachers’ implementation of functional analysis components. An expert trained a group of teachers who then trained another group of teachers. All teachers improved their ability to conduct functional analyses.
Pence, S. T., St. Peter, C. C., & Giles, A. F. (2014). Teacher Acquisition of Functional Analysis Methods Using Pyramidal Training. Journal of Behavioral Education, 23(1), 132-149.
The purpose of the current investigation was to assess the relationship between the integrity with which social skills interventions were implemented in early childhood special education classrooms and 3 factors: teacher ratings of intervention acceptability, consultative support for implementation, and individual child outcomes.
Peterson, C. A., & McConnell, S. R. (1996). Factors related to intervention integrity and child outcome in social skills interventions. Journal of Early Intervention, 20(2), 146-164.
This study evaluated the effects of performance feedback to pre-service teachers to increase their rates of positive and negative communication with students.
Rathel, J. M., Drasgow, E., & Christle, C. C. (2008). Effects of Supervisor Performance Feedback on Increasing Preservice Teachers’ Positive Communication Behaviors With Students With Emotional and Behavioral Disorders. Journal of Emotional & Behavioral Disorders, 16(2), 67-77.
The focus of the book is on essential concepts in educational statistics, understanding when to use various statistical tests, and how to interpret results. The book introduces education students and practitioners to the use of statistics in education, and basic concepts in statistics are explained in clear language.
Ravid, R. (2019). Practical statistics for educators. Rowman & Littlefield Publishers.
This study evaluated the impact of coaching on the implementation of an intervention. Coaching with higher rates of performance feedback resulted in the highest level of treatment integrity.
Reinke, W., Stormont, M., Herman, K., & Newcomer, L. (2014). Using Coaching to Support Teacher Implementation of Classroom-based Interventions. Journal of Behavioral Education, 23(1), 150-167.
This page describes conflicts of interest in research and what should be done about them.
Resources for Research Ethics Education. (2001). What is a conflict of interest? San Diego, CA: University of California, San Diego. http://research-ethics.org/topics/conflicts-of-interest/
An alternating treatments design was used to evaluate the effectiveness of an educational program that combined timings (via chess clocks), peer tutoring (i.e., peer-delivered immediate feedback), positive-practice overcorrection, and performance feedback on mathematics fluency (i.e., speed of accurate responding) in four elementary students with mathematics skills deficits.
Rhymer, K. N., Dittmer, K. I., Skinner, C. H., & Jackson, B. (2000). Effectiveness of a multi-component treatment for improving mathematics fluency. School Psychology Quarterly, 15(1), 40.
It is proposed in this paper that interventions are most likely to be implemented when they draw from existing practices in a classroom.
Riley-Tillman, T. C., & Chafouleas, S. M. (2003). Using Interventions That Exist in the Natural Environment to Increase Treatment Integrity and Social Influence in Consultation. Journal of Educational & Psychological Consultation, 14(2), 139-156.
First Step to Success is an empirically supported intervention for young elementary school students that can be implemented by public school teachers with training. This study evaluated the effects of coaching feedback on teachers who did not effectively implement First Step following training.
Rodriguez, B. J., Loman, S. L., & Horner, R. H. (2009). A preliminary analysis of the effects of coaching feedback on teacher implementation fidelity of First Step to Success. Behavior Analysis in Practice, 2(2), 11-21.
This book looks at how new ideas spread via communication channels over time. Such innovations are initially perceived as uncertain and even risky. To overcome this uncertainty, most people seek out others like themselves who have already adopted the new idea. Thus the diffusion process typically takes months or years. But there are exceptions: use of the Internet in the 1990s, for example, may have spread more rapidly than any other innovation in the history of humankind.
Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York, NY: Free Press.
This review will briefly address the nature of conflicts of interest in research, including the importance of both financial and non-financial conflicts, and the potential effectiveness and limits of various strategies for managing such conflicts.
Romain, P. L. (2015). Conflicts of interest in research: Looking out for number one means keeping the primary interest front and center. Current Reviews in Musculoskeletal Medicine, 8(2), 122–127.
The purpose of this chapter is to explain the role of treatment integrity assessment within the “implementing solutions” stage of a problem-solving model.
Sanetti, L. H., & Kratochwill, T. R. (2005). Treatment integrity assessment within a problem-solving model. Assessment for intervention: A problem-solving approach, 314-325.
This paper describes a multi-tier system of supports for teachers as they implement an intervention.
Sanetti, L. M. H., & Collier-Meek, M. A. (2015). Data-Driven Delivery of Implementation Supports in a Multi-Tiered Framework: A Pilot Study. Psychology in the Schools, 52(8), 815-828.
This study evaluated the Treatment Integrity Planning Protocol as a means for increasing treatment integrity.
Sanetti, L. M. H., & Kratochwill, T. R. (2009). Treatment Integrity Assessment in the Schools: An Evaluation of the Treatment Integrity Planning Protocol. School Psychology Quarterly, 24(1), 24-35.
This paper describes treatment integrity assessment and intervention for practicing school psychologists.
Sanetti, L. M. H., & Kratochwill, T. R. (2011). An Evaluation of the Treatment Integrity Planning Protocol and Two Schedules of Treatment Integrity Self-Report: Impact on Implementation and Report Accuracy. Journal of Educational & Psychological Consultation, 21(4), 284-308.
This paper reviews options for treatment integrity measurement emphasizing how direct behavior rating technology might be incorporated within a multi-tiered model of intervention delivery.
Sanetti, L. M. H., Chafouleas, S. M., Christ, T. J., & Gritter, K. L. (2009). Extending Use of Direct Behavior Rating Beyond Student Assessment. Assessment for Effective Intervention, 34(4), 251-258.
This study evaluated four methods for teachers to self-report how well they implemented an intervention.
Sanetti, L. M. H., Chafouleas, S. M., O’Keeffe, B. V., & Kilgus, S. P. (2013). Treatment Integrity Assessment of a Daily Report Card Intervention: A Preliminary Evaluation of Two Methods and Frequencies. Canadian Journal of School Psychology, 28(3), 261-276.
Both student outcomes and treatment fidelity data are necessary to draw valid conclusions about intervention effectiveness. Reviews of the intervention outcome literature in related fields, and prior reviews of the school psychology literature, suggest that many researchers failed to report treatment fidelity data.
Sanetti, L. M. H., Charbonneau, S., Knight, A., Cochrane, W. S., Kulcyk, M. C., & Kraus, K. E. (2020). Treatment fidelity reporting in intervention outcome studies in the school psychology literature from 2009 to 2016. Psychology in the Schools, 57(6), 901-922.
This paper evaluated the impact of Implementation Planning on teacher level of treatment integrity.
Sanetti, L. M. H., Collier-Meek, M. A., Long, A. C. J., Kim, J., & Kratochwill, T. R. (2014). Using implementation planning to increase teachers' adherence and quality to behavior support plans. Psychology in the Schools, 51(8), 879-895.
The authors reviewed all intervention studies published in the Journal of Positive Behavior Interventions between 1999 and 2009 to determine the percent of those studies that reported a measure of treatment integrity. Slightly more than 40% reported a measure of treatment integrity.
Sanetti, L. M. H., Dobey, L. M., & Gritter, K. L. (2012). Treatment Integrity of Interventions With Children in the Journal of Positive Behavior Interventions: From 1999 to 2009. Journal of Positive Behavior Interventions, 14(1), 29-46.
If educational programs are to be effective they must be implemented with sufficient integrity to assure benefits. To have a significant impact on schools, solutions must be scalable. This study evaluated the effects of using existing school personnel to provide performance feedback to teachers regarding the quality of implementation.
Sanetti, L. M. H., Fallon, L. M., & Collier-Meek, M. A. (2013). Increasing teacher treatment integrity through performance feedback provided by school personnel. Psychology in the Schools, 50(2), 134-150.
The authors reviewed four school psychology journals from 1995 to 2008 to estimate the percent of intervention studies that reported some measure of treatment integrity. About 50% reported a measure of treatment integrity.
Sanetti, L. M. H., Gritter, K. L., & Dobey, L. M. (2011). Treatment integrity of interventions with children in the school psychology literature from 1995 to 2008. School Psychology Review, 40(1), 72-84.
This study utilized a “bug in the ear” device to provide immediate feedback on implementation of specific teaching practices.
Scheeler, M. C., Congdon, M., & Stansbery, S. (2010). Providing Immediate Feedback to Co-Teachers Through Bug-in-Ear Technology: An Effective Method of Peer Coaching in Inclusion Classrooms. Teacher Education & Special Education, 33(1).
This paper describes the use of wireless technology to give feedback to students in a teacher prep program about their integrity of implementation.
Scheeler, M. C., McAfee, J. K., & Ruhl, K. L. (2006). Effects of Corrective Feedback Delivered via Wireless Technology on Preservice Teacher Performance and Student Behavior. Teacher Education & Special Education, 29(1).
A commonly used research design in applied behavior analysis involves comparing two or more independent variables. This review found some consistencies across such comparison studies, with half resulting in equivalent outcomes across comparisons. In addition, most studies employed an alternating treatments or multi-element single-subject design and compared a teaching methodology.
Shabani, D. B., & Lam, W. Y. (2013). A review of comparison studies in applied behavior analysis. Behavioral Interventions, 28(2), 158-183.
This paper outlines the best practices for researchers and practitioners translating research to practice as well as recommendations for improving the process.
Shriver, M. D. (2007). Roles and responsibilities of researchers and practitioners for translating research to practice. Journal of Evidence-Based Practices for Schools, 8(1), 1-30.
The effects of two error-correction procedures on oral reading errors and a control condition were compared in an alternating treatments design with three students who were moderately mentally retarded. The two procedures evaluated were word supply and sentence repeat.
Singh, N. N. (1990). Effects of two error-correction procedures on oral reading errors: Word supply versus sentence repeat. Behavior Modification, 14(2), 188-199.
This literature review was conducted to evaluate the current state of evidence supporting communication interventions for individuals with severe intellectual and developmental disabilities. Many researchers failed to report treatment fidelity or to assess basic aspects of intervention effects, including generalization, maintenance, and social validity.
Snell, M. E., Brady, N., McLean, L., Ogletree, B. T., Siegel, E., Sylvester, L., ... & Sevcik, R. (2010). Twenty years of communication intervention research with individuals who have severe intellectual and developmental disabilities. American Journal on Intellectual and Developmental Disabilities, 115(5), 364-380.
This meta-analysis extracted and aggregated data from single-case studies that used performance feedback (PF) in school settings to increase teachers' use of classroom-based interventions. The overall result was that performance feedback had moderate effects on treatment integrity.
Solomon, B. G., Klein, S. A., & Politylo, B. C. (2012). The Effect of Performance Feedback on Teachers’ Treatment Integrity: A Meta-Analysis of the Single-Case Literature. School Psychology Review, 41(2), 160-175.
Observational measurement of treatment adherence has long been considered the gold standard. However, little is known about either the generalizability of the scores from extant observational instruments or the sampling needed. Results suggested that reliable cognitive–behavioral therapy adherence studies require at least 10 sessions per patient, assuming 12 patients per therapist and two coders—a challenging threshold even in well-funded research. Implications, including the importance of evaluating alternatives to observational measurement, are discussed.
Southam-Gerow, M. A., Bonifay, W., McLeod, B. D., Cox, J. R., Violante, S., Kendall, P. C., & Weisz, J. R. (2020). Generalizability and decision studies of a treatment adherence instrument. Assessment, 27(2), 321-333.
This book is written for school administrators, staff developers, behavior specialists, and instructional coaches to offer guidance in implementing research-based practices that establish effective classroom management in schools. The book provides administrators with practical strategies to maximize the impact of professional development.
Sprick et al. (2010). Coaching Classroom Management: Strategies & Tools for Administrators & Coaches. Pacific Northwest Publishing.
This paper examines a range of education failures: common mistakes in how new practices are selected, implemented, and monitored. The goal is not a comprehensive listing of all education failures but rather to provide education stakeholders with an understanding of the importance of vigilance when implementing new practices.
States, J., & Keyworth, R. (2020). Why Practices Fail. Oakland, CA: The Wing Institute. https://www.winginstitute.org/roadmap-overview
Inattention to treatment integrity is a primary cause of failure during implementation. Treatment integrity is defined as the extent to which an intervention is executed as designed, and the accuracy and consistency with which the intervention is implemented.
States, J., Detrich, R. & Keyworth, R. (2017). Treatment Integrity Strategies. Oakland, CA: The Wing Institute. https://www.winginstitute.org/effective-instruction-treatment-integrity-strategies.
The debate about the safety of calcium-channel antagonists provided an opportunity to study financial conflicts of interest in medicine. This project was designed to examine the relation between authors' published positions on the safety of calcium-channel antagonists and their financial interactions with the pharmaceutical industry.
Stelfox, H. T., Chua, G., O'Rourke, K., & Detsky, A. S. (1998). Conflict of interest in the debate over calcium-channel antagonists. New England Journal of Medicine, 338(2), 101–106.
This study compared indirect training and direct training methods as a means of impacting levels of treatment integrity. Direct training methods produced better outcomes.
Sterling-Turner, H. E., Watson, T. S., & Moore, J. W. (2002). The effects of direct training and treatment integrity on treatment outcomes in school consultation. School Psychology Quarterly, 17(1).
The present study was conducted to investigate the relationship between training procedures and treatment integrity.
Sterling-Turner, H. E., Watson, T. S., Wildmon, M., Watkins, C., & Little, E. (2001). Investigating the relationship between training type and treatment integrity. School Psychology Quarterly, 16(1), 56.
This is a systematic review of the effects of coaching teachers to implement social behavior interventions.
Stormont, M., Reinke, W. M., Newcomer, L., Marchese, D., & Lewis, C. (2015). Coaching Teachers’ Use of Social Behavior Interventions to Improve Children’s Outcomes: A Review of the Literature. Journal of Positive Behavior Interventions, 17(2).
This paper describes the problem of publication bias with reference to its history in a number of fields, with special reference to the area of educational research.
Torgerson, C. J. (2006). Publication bias: The Achilles’ heel of systematic reviews? British Journal of Educational Studies, 54(1), 89-102. https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1467-8527.2006.00332.x
This experiment evaluated the effects of requiring overt answer construction in computer-based programmed instruction using an alternating treatments design.
Tudor, R. M. (1995). Isolating the effects of active responding in computer‐based instruction. Journal of Applied Behavior Analysis, 28(3), 343-344.
Using the data from the Program for the International Assessment of Adult Competencies (PIAAC), this Data Point summarizes the number of U.S. adults with low levels of English literacy and describes how they differ by nativity status and race/ethnicity.
U.S. Department of Education. (2019). Data point: Adult literacy in the United States. https://nces.ed.gov/datapoints/2019179.asp
The National Assessment of Educational Progress (NAEP) is the largest nationally representative and continuing assessment of what America's students know and can do in various subject areas.
U.S. Department of Education. (2020). National Assessment of Educational Progress (NAEP). https://nces.ed.gov/nationsreportcard/
Simple experimental studies randomize study participants into two groups: a treatment group that includes participants who receive the offer to participate in a program or intervention, and a control group that includes participants who do not receive that offer. Such studies primarily address questions about the program impacts on the average outcomes of participants.
Unlu, F., Bozzi, L., Layzer, C., Smith, A., Price, C., & Hurtig, R. (2016). Linking implementation fidelity to impacts in an RCT (pp. 108-137). Routledge.
A review of 20 experimental, shared book reading (SBR) interventions using questioning strategies with preschool children was conducted. The studies were analyzed in terms of their quality, focus, and the questioning strategies employed. Although there were few methodological concerns about the studies conducted, treatment fidelity and replicability of the reported interventions are raised as issues needing attention in future research.
Walsh, R. L., & Hodge, K. A. (2018). Are we asking the right questions? An analysis of research on the effect of teachers’ questioning on children’s language during shared book reading with young children. Journal of Early Childhood Literacy, 18(2), 264-294.
In a randomized control study, Head Start teachers were assigned to either an intervention group that received intensive, ongoing professional development (PD) or to a comparison group that received the “business as usual” PD provided by Head Start. The PD intervention provided teachers with conceptual knowledge and instructional strategies that support young children’s development of vocabulary, alphabet knowledge, and phonological sensitivity.
Wasik, B. A., & Hindman, A. H. (2011). Improving vocabulary and pre-literacy skills of at-risk preschoolers through teacher professional development. Journal of Educational Psychology, 103(2), 455.
The purpose of this study was to assess the degree to which behavioral intervention studies conducted with persons with mental retardation operationally defined the independent variables and evaluated and reported measures of treatment integrity. The study expands the previous work in this area reported by Gresham, Gansle, and Noell (1993) and Wheeler, Baggett, Fox, and Blevins (2006) by providing an evaluation of empirical investigations published in multiple journals in the fields of applied behavior analysis and mental retardation from 1996–2006. Results of the review indicated that relatively few of the studies fully reported data on treatment integrity.
Wheeler, J. J., Mayton, M. R., Carter, S. L., Chitiyo, M., Menendez, A. L., & Huang, A. (2009). An assessment of treatment integrity in behavioral intervention studies conducted with persons with mental retardation. Education and Training in Developmental Disabilities, 187-195.
The relationships among independent variables and three measures of treatment integrity were evaluated.
Wickstrom, K. F., Jones, K. M., LaFleur, L. H., & Witt, J. C. (1998). An analysis of treatment integrity in school-based behavioral consultation. School Psychology Quarterly, 13(2), 141.
Treatment integrity is essential for the implementation of interventions in schools as it determines the accuracy or consistency with which different components of a treatment are implemented. There are no current standards regarding the best practices in treatment integrity measurement; however, higher integrity is associated with enhanced student outcomes.
Wilson, E. (2017). Generalizability of multiple measures of treatment integrity: An empirical replication.
This study evaluated the effects of performance feedback on increasing the quality of implementation of interventions by teachers in a public school setting.
Witt, J. C., Noell, G. H., LaFleur, L. H., & Mortenson, B. P. (1997). Teacher use of interventions in general education settings: Measurement and analysis of the independent variable. Journal of Applied Behavior Analysis, 30(4), 693.
This study evaluated the effects of graphed feedback alone compared to the effects of graphed feedback plus verbal feedback. The combined graphed and verbal feedback resulted in slightly better performance.
Zoder-Martell, K., Dufrene, B., Sterling, H., Tingstrom, D., Blaze, J., Duncan, N., & Harpole, L. (2013). Effects of Verbal and Graphed Feedback on Treatment Integrity. Journal of Applied School Psychology, 29(4).