Best Available Evidence Overview

All Research

Why Education Experts Resist Effective Practices (And What It Would Take To Make Education More Like Medicine)

The first section of this essay provides examples from reading and mathematics curricula that show experts dispensing unproven methods and flitting from one fad to another. The middle section describes how experts, for ideological reasons, have shunned some solutions that do display robust evidence of efficacy. The following sections show how public impatience has forced other professions to "grow up" and accept accountability and scientific evidence. The paper concludes with a plea to develop education into a mature profession.

Carnine, D. (2000). Why Education Experts Resist Effective Practices (And What It Would Take To Make Education More Like Medicine).

How Methodological Features Affect Effect Sizes in Education

The purpose of this article is to examine how methodological features such as types of publication, sample sizes, and research designs affect effect sizes in experiments.

Cheung, A., & Slavin, R. E. (2015). How methodological features affect effect sizes in education. Best Evidence Encyclopedia, Johns Hopkins University, Baltimore, MD.

Pitfalls of Data Analysis (or How to Avoid Lies and Damned Lies)

This paper examines things that people often overlook in their data analysis, and ways people sometimes "bend the rules" of statistics to support their viewpoint. It discusses ways you can make sure your own statistics are clear and accurate.

Helberg, C. (1995). Pitfalls of Data Analysis (or How to Avoid Lies and Damned Lies). Third International Applied Statistics in Industry Conference, Dallas, TX, June 5-7, 1995.

Some recommendations for the reporting of quantitative studies

This editorial offers recommendations and examples of elements that can help demonstrate the robustness of quantitative results. It is not a methodological guide but rather a reminder of some basic principles for reporting quantitative research.

López, X., Valenzuela, J., Nussbaum, M., & Tsai, C. C. (2015). Some recommendations for the reporting of quantitative studies. Computers & Education, 91(C), 106-110.

Uneducated Guesses: Using Evidence to Uncover Misguided Education Policies

This book offers concrete examples from educational testing to illustrate the importance of empirically and logically scrutinizing the evidence used to make education policy decisions. Wainer uses statistical evidence to show why some of the most widely held beliefs in education may be wrong.

Wainer, H. (2011). Uneducated guesses: Using evidence to uncover misguided education policies. Princeton University Press.

Randomized Trials and Quasi-Experiments in Education Research

This paper examines the benefits and challenges inherent in using randomized trials and quasi-experimental designs in the field of education research.

Angrist, J. D. (2003). Randomized trials and quasi-experiments in education research. NBER Reporter Online, (Summer 2003), 11-14.

The Right To Effective Education

The purpose of this website is to disseminate information about behavioral fluency and to connect people interested in building fluent behavior of all kinds and for all types of people.

Barrett, B. H., Beck, R., Binder, C., Cook, D. A., Engelmann, S., Greer, R. D., ... & Watkins, C. L. (1991). The right to effective education. The Behavior Analyst, 14(1), 79.

Assessing The Value-Added Effects Of Literacy Collaborative Professional Development On Student Learning

This is a 4-year longitudinal study of the effects of Literacy Collaborative (LC), a school-wide reform model that relies on coaching of teachers to improve student literacy learning.

Biancarosa, G., Bryk, A. S., & Dexter, E. R. (2010). Assessing the value-added effects of literacy collaborative professional development on student learning. The Elementary School Journal, 111(1), 7-34.

Distinguishing Science and Pseudoscience

This paper is a primer for discerning real science from the many reports of false science and pseudoscience that we are bombarded with daily, both by the media and by published works that purport to be scientific.

Coker, R. (2001). Distinguishing science and pseudoscience. Retrieved September 10, 2009.

Fallacy Files

This is a collection and examination of logical fallacies.

Curtis, G. N. (2012). Fallacy files. http://www.fallacyfiles.org

Stephen's guide to the logical fallacies

This paper examines common logical fallacies.

Downes, S. (1995). Stephen’s guide to the logical fallacies. Electronic document.

Can Randomized Trials Answer the Question of What Works?

This article discusses the use of randomized controlled trials as required by the Department of Education in evaluating the effectiveness of educational practices.

EDUC, A. R. O. (2005). Can randomized trials answer the question of what works?

Scientific Research and Evidence-Based Practice, 2003

This paper examines evidence-based practice in education. Evidence-based education (EBE) is considered in the context of evidence-based practice in medicine, on which EBE is based, and the medical model is compared with EBE with an emphasis on how to develop EBE products and services.

Hood, P. D. (2003). Scientific research and evidence-based practice. San Francisco: WestEd.

Why Most Published Research Findings Are False

This essay argues that many published research findings may be false. It examines factors that make a finding less likely to be true, including low study power, bias, the number of other studies on the same question, and the ratio of true to no relationships among the relationships probed. Finally, it considers the implications these problems create for conducting and interpreting research.

Ioannidis, J. P. (2005). Why most published research findings are false. PLoS Medicine, 2(8), e124.

The National Center for Special Education Research (NCSER) in IES

This PowerPoint presentation provides an overview of the National Center for Special Education Research (NCSER).

Kame’enui, E., & Gonzalez, P. (2006).

Single-Case Designs for Educational Research

This book examines the methods, benefits, and challenges of using single-case research designs in clinical and applied settings.

Kazdin, A. E. (2011). Single-case research designs: Methods for clinical and applied settings. Oxford University Press.

A Policymaker's Primer on Education Research

The goal of this Policymaker’s Primer on Education Research is to help policymakers and other interested individuals answer three big questions: (1) What does the research say? (2) Is the research trustworthy? (3) How can the research be used to guide policy?

Lauer, P. A. (2004, February). A Policymaker's Primer on Education Research: How to Understand, Evaluate and Use It. ECS.

What Is a Culture of Evidence? How Do You Get One? And... Should You Want One?

This paper uses a framework derived from Cultural Historical Activity Theory to describe changes in organizational practice in two teacher education programs as they began to use new sources of outcome data to make decisions about program design, curriculum and instruction.

Peck, C. A., & McDonald, M. A. What Is a Culture of Evidence? How Do You Get One? And... Should You Want One? Teachers College Record. Retrieved March 21, 2014, from http://www.tcrecord.org/Content.asp?contentid=17359

Experimental and Quasi-Experimental Designs for Generalized Causal Inference

This book is a valuable resource for graduate students and applied researchers who are interested in designing experimental studies, as well as for those needing to interpret both experimental and quasi-experimental research in the social and behavioral sciences.

Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Wadsworth Cengage Learning.

Baloney Detection Kit

In “The Demon-Haunted World,” Carl Sagan provides tools for skeptical thinking. This list is a strong tool for weeding out bad science.

Shermer, M., & Linse, P. (2001). The Baloney Detection Kit. Skeptic Society.

Pasteur’s Quadrant as the Bridge Linking Rigor with Relevance

The authors propose educational design research and communities of practice as frameworks through which to realize the promise of Pasteur's quadrant.

Smith, G. J., Schmidt, M. M., Edelen-Smith, P. J., & Cook, B. G. (2013). Pasteur's Quadrant as the Bridge Linking Rigor With Relevance. Exceptional Children, 79(2), 147-161.

Can Traditional Public Schools Replicate Successful Charter Models? A Different Take

This op-ed piece from Daniel Willingham examines the study Injecting Charter School Best Practices into Traditional Public Schools: Evidence from Field Experiments. Willingham makes a number of interesting points, including the need to disseminate the results of studies that failed to produce significant effects and the importance of understanding what in a study failed and how it failed.

Willingham, D. (2014). Can Traditional Public Schools Replicate Successful Charter Models? A Different Take. Real Clear Education.

Translating Evidence into Efficacy: Evaluating Strengths and Weaknesses of Different Study Designs

This is a critical examination of the strengths and weaknesses of different research designs.

Wong, N. (2006). Translating Evidence into Efficacy: Evaluating Strengths and Weaknesses of Different Study Designs. Retrieved July 17, 2014.