Creating Single-Subject Design Graphs in Microsoft Excel 2007
This article provides task analyses for constructing several commonly used types of single-subject design graphs in Microsoft Excel.
Dixon, M. R., Jackson, J. W., Small, S. L., Horner-King, M. J., Lik, N. M. K., Garcia, Y., & Rosales, R. (2009). Creating single-subject design graphs in Microsoft Excel 2007. Journal of Applied Behavior Analysis, 42(2), 277–293.
Will the “Principles of Effectiveness” Improve Prevention Practice? Early Findings from a Diffusion Study
This study examines adoption and implementation of the US Department of Education's new policy, the “Principles of Effectiveness,” from a diffusion of innovations theoretical framework. The report evaluates adoption in relation to Principle 3: the requirement to select research-based programs.
Hallfors, D., & Godette, D. (2002). Will the “principles of effectiveness” improve prevention practice? Early findings from a diffusion study. Health Education Research, 17(4), 461–470.
Treatment Integrity of School-Based Interventions with Children
This paper examines school-based experimental studies with individuals aged 0 to 18 years published between 1991 and 2005. Only 30% of the studies provided treatment integrity data, and nearly half (45%) were judged to be at high risk for treatment inaccuracies.
McIntyre, L. L., Gresham, F. M., DiGennaro, F. D., & Reed, D. D. (2007). Treatment integrity of school-based interventions with children in the Journal of Applied Behavior Analysis 1991–2005. Journal of Applied Behavior Analysis, 40(4), 659–672.
Uneducated Guesses: Using Evidence to Uncover Misguided Education Policies
This book offers concrete examples from educational testing to illustrate the importance of empirically and logically scrutinizing the evidence used to make education policy decisions. Wainer uses statistical evidence to show why some of the most widely held beliefs in education may be wrong.
Wainer, H. (2011). Uneducated guesses: Using evidence to uncover misguided education policies. Princeton University Press.
Randomized Trials and Quasi-Experiments in Education Research
This paper examines the benefits and challenges inherent in using randomized clinical trials and quasi-experimental designs in the field of education research.
Angrist, J. D. (2003). Randomized trials and quasi-experiments in education research. NBER Reporter Online, (Summer 2003), 11-14.
The Core Analytics of Randomized Experiments for Social Research
This paper examines the elements of randomized experiments for social research.
Bloom, H. S. (2006). The core analytics of randomized experiments for social research.
Randomized, Controlled Trials, Observational Studies, and the Hierarchy of Research Designs
This study compares the results of randomized controlled trials with those of observational studies addressing the same clinical questions.
Concato, J., Shah, N., & Horwitz, R. I. (2000). Randomized, controlled trials, observational studies, and the hierarchy of research designs. New England Journal of Medicine, 342(25), 1887-1892.
Can Randomized Trials Answer the Question of What Works?
This article discusses the use of randomized controlled trials as required by the Department of Education in evaluating the effectiveness of educational practices.
EDUC, A. R. O. (2005). Can randomized trials answer the question of what works?
New Federal Policy Favors Randomized Trials in Education Research
This is an article from the Chronicle of Higher Education discussing the pros and cons of randomized controlled trials in education research.
Glenn, D. (2005). New federal policy favors randomized trials in education research. The Chronicle of Higher Education. Retrieved March 25, 2005.
Implementing Randomized Field Trials in Education: Report of a Workshop
This report examines the use of randomized controlled trial (RCT) studies in education.
Hilton, M., & Towne, L. (Eds.). (2004). Implementing randomized field trials in education: Report of a workshop. National Academies Press.
The Use of Single-Subject Research to Identify Evidence-Based Practice in Special Education
The defining features of single-subject research are presented, the contributions of single-subject research for special education are reviewed, and a specific proposal is offered for using single-subject research to document evidence-based practice.
Horner, R. H., Carr, E. G., Halle, J., McGee, G., Odom, S., & Wolery, M. (2005). The use of single-subject research to identify evidence-based practice in special education. Exceptional Children, 71(2), 165-179.
Single-Case Research: Documenting Evidence-based Practice
This work makes the case that single-subject designs can and should be accepted as an alternative to randomized controlled trials for determining the efficacy of practices.
Horner, R. University of Oregon.
Why Most Published Research Findings Are False
This essay discusses issues and concerns that too many research findings may be false. The paper examines reasons a study may prove inaccurate including: the study power and bias, the number of other studies on the same question, and the ratio of true to no relationships. Finally, it considers the implications these problems create for conducting and interpreting research.
Ioannidis, J. P. (2005). Why most published research findings are false. PLoS medicine, 2(8), e124.
Single-Case Research Designs: Methods for Clinical and Applied Settings
This book describes methods for designing and evaluating single-case research in clinical and applied settings.
Kazdin, A. E. (2011). Single-case research designs: Methods for clinical and applied settings. Oxford University Press.
Single-Case Designs for Educational Research
This book provides a thorough summary of information about the use of single-subject experimental designs in educational research
Kennedy, C. H. (2005). Single-case designs for educational research. Pearson/A & B.
Evaluating Teacher Preparation Programs Using the Performance of their Graduates
This commentary addresses concerns for the use of value-added outcome measures commonly used to evaluate teachers and implications for the use of these metrics to assess the effectiveness of preparation programs.
Koedel, C., & Parsons, E. (2014). Evaluating teacher preparation programs using the performance of their graduates. Teachers College Record. Retrieved November 18, 2014, from http://www.tcrecord.org/Content.asp?ContentID=17741
Single-Case Research Design and Analysis: New Directions for Psychology and Education
This book provides a thorough summary of information about the use of single-subject experimental designs.
Kratochwill, T. R., & Levin, J. R. (1992). Single-case research design and analysis: New directions for psychology and education. Lawrence Erlbaum Associates, Inc.
Single-Case Design Technical Documentation
Single-case design has made important contributions to identifying effective educational practices, but until recently there were no standards for evaluating the quality and quantity of studies across a topic area. These standards were developed for the Institute of Education Sciences.
Kratochwill, T. R., Hitchcock, J., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M. & Shadish, W. R. (2010). Single-case designs technical documentation. Retrieved from What Works Clearinghouse website: http://ies.ed.gov/ncee/wwc/pdf/wwc_scd.pdf.
What Works Clearinghouse: Single-Case Design Technical Documentation
This paper by a What Works Clearinghouse panel provides an overview of single-case designs (SCDs), specifies the types of questions that SCDs are designed to answer, and discusses the internal validity of SCDs. The panel then proposes standards to be implemented by the WWC.
Kratochwill, T. R., Hitchcock, J., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2010). Single-case designs technical documentation. What Works Clearinghouse.
Comparing Results of Systematic Reviews: Parallel Reviews of Research on Repeated Reading.
This paper demonstrates that different well-accepted methods for reviewing research on repeated reading produce different results.
O’Keeffe, B. V., Slocum, T. A., Burlingame, C., Snyder, K., & Bundock, K. (2012). Comparing results of systematic reviews: Parallel reviews of research on repeated reading. Education and Treatment of Children, 35(2), 333–366.
Combining estimates of effect size
This chapter offers an in-depth examination of literature synthesis along with useful advice for interpreting the results of a meta-analysis.
Shadish, W. R., & Haddock, C. K. (2009). Combining estimates of effect size. The Handbook of Research Synthesis and Meta-analysis, 257-277.
The state of the science in the meta-analysis of single-case experimental designs
This is a review of the issues and methods for conducting a meta-analysis of single-case design research studies.
Shadish, W. R., Rindskopf, D. M., & Hedges, L. V. (2008). The state of the science in the meta-analysis of single-case experimental designs. Evidence-Based Communication Assessment and Intervention, 2(3), 188–196. doi:10.1080/17489530802581603
Evaluating the validity of systematic reviews to identify empirically supported treatments
Systematic reviews are a process for assessing the quality of a literature to determine whether a particular practice has met criteria for being empirically supported. As with any assessment process, there are issues of validity. The concepts and methodological tools of measurement validity can be applied to systematic reviews to identify their strengths and weaknesses.
Slocum, T. A., Detrich, R., & Spencer, T. D. (2012). Evaluating the validity of systematic reviews to identify empirically supported treatments. Education and Treatment of Children, 35(2), 201-233.
Randomized Trials Flourish in Developing Countries
This article reviews issues regarding the use of randomized trials in developing countries.
Viadero, D. (2006). Randomized trials flourish in developing countries. Education Week. Retrieved October 30, 2006.