In this article, we respond to Wolery’s critique of the What Works Clearinghouse (WWC) pilot Standards, which were developed by the current authors. We do so to provide additional information and clarify some points previously summarized in this journal. We also respond to several concerns raised by Maggin, Briesch, and Chafouleas after they applied the Standards to a single-case design synthesis published in this journal. The overall purpose of this response is to clarify what the Standards are designed to accomplish and to offer our views about future revisions.
Ahn, S., Ames, A. J., & Myers, N. D. (2012). A review of meta-analyses in education: Methodological strengths and weaknesses. Review of Educational Research, 82, 436–476.
Biddle, B. J., & Berliner, D. C. (2002). Small class size and its effects. Educational Leadership, 59(5), 12–23.
Cooper, H., & Koenka, A. C. (2012). The overview of reviews: Unique challenges and opportunities when research syntheses are the principal elements of new integrative scholarship. American Psychologist, 67, 446–462.
DuPaul, G. J., Eckert, T. L., & Vilardo, B. (2012). The effects of school-based interventions for attention deficit hyperactivity disorder: A meta-analysis 1996–2010. School Psychology Review, 41, 387–412.
Ferron, J., & Jones, P. K. (2006). Tests for the visual analysis of response-guided multiple-baseline data. The Journal of Experimental Education, 75, 66–81.
Ferron, J. M., & Levin, J. R. (in press). Single-case permutation and randomization statistical tests: Present status, promising new developments. In T. R. Kratochwill & J. R. Levin (Eds.), Single-case intervention research: Methodological and data-analysis advances. Washington, DC: American Psychological Association.
Horner, R. H., Carr, E. G., Halle, J., McGee, G., Odom, S., & Wolery, M. (2005). The use of single subject research to identify evidence-based practice in special education. Exceptional Children, 71, 165–179.
Kratochwill, T. R., Hitchcock, J., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2010). Single case designs technical documentation. Retrieved from http://ies.ed.gov/ncee/wwc/pdf/wwc_scd.pdf
Kratochwill, T. R., Hitchcock, J. H., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2013). Single-case intervention research design standards. Remedial and Special Education, 34, 26–38.
Kratochwill, T. R., & Levin, J. R. (2010). Enhancing the scientific credibility of single-case intervention research: Randomization to the rescue. Psychological Methods, 15, 124–144.
Kratochwill, T. R., Stoiber, K. C., & Gutkin, T. B. (2001). Empirically supported interventions in school psychology: The role of negative results in outcome research. Psychology in the Schools, 37, 399–413.
Levin, J. R. (1994). Crafting educational intervention research that’s both credible and creditable. Educational Psychology Review, 6, 231–243.
Maggin, D. M., Briesch, A. M., & Chafouleas, S. M. (2013). An application of the What Works Clearinghouse standards for evaluating single-subject research: Synthesis of the self-management literature base. Remedial and Special Education, 34, 44–58.
Maggin, D. M., & Chafouleas, S. M. (2013). Introduction to the special series: Issues and advances of synthesizing single-case research. Remedial and Special Education, 34, 3–8.
Maggin, D. M., Chafouleas, S. M., Goddard, K. M., & Johnson, A. H. (2011). A systematic evaluation of token economies as a classroom management tool for students with challenging behavior. Journal of School Psychology, 49, 529–554.
Maibach, E. (2012). Knowing our options for setting the record straight, when doing so is particularly important. Psychological Science in the Public Interest, 13, 105.
Nye, B., Hedges, L. V., & Konstantopoulos, S. (2000). The effects of small classes on academic achievement: The results of the Tennessee class size experiment. American Educational Research Journal, 37, 123–151.
Reichow, B., Barton, E. E., Sewell, J. N., Good, L., & Wolery, M. (2010). Effects of weighted vests on the engagement of children with developmental delays and autism. Focus on Autism and Other Developmental Disabilities, 25, 3–11.
Repp, A. C., & Horner, R. H. (1999). Functional analysis of problem behavior: From effective assessment to effective support. Belmont, CA: Wadsworth.
Sanetti, L. H., & Kratochwill, T. R. (2014). Treatment integrity: Methodological and conceptual advances in research and practice. Washington, DC: American Psychological Association.
Shadish, W. R. (1995). The logic of generalization: Five principles common to experiments and ethnographies. American Journal of Community Psychology, 23, 419–428.
Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston, MA: Houghton Mifflin.
Singh, N. N., Lancioni, G. E., Singh, A. N., Winton, A. S. W., Singh, A. N., Adkins, A. D., & Singh, J. (2008). Clinical and benefit-cost outcomes of teaching a mindfulness-based procedure with adult offenders with intellectual disabilities. Behavior Modification, 32, 622–637. doi:10.1177/0145445508315854
Smith, J. D. (2012). Single-case experimental designs: A systematic review of published research and recommendations for researchers and reviewers. Psychological Methods, 17, 510–550.
Swoboda, C. M., Kratochwill, T. R., Horner, R. H., & Levin, J. R. (2012). Visual analysis training protocol: Applications with the alternating treatment, multiple baseline, and ABAB designs. Unpublished manuscript, University of Wisconsin–Madison.
Swoboda, C. M., Kratochwill, T. R., & Levin, J. R. (2010, November). Conservative dual-criterion method for single-case research: A guide for visual analysis of AB, ABAB, and multiple-baseline designs (Working Paper 2010-13). Madison: Wisconsin Center for Education Research. Retrieved from http://www.wcer.wisc.edu/publications/workingPapers/Working_Paper_No_2010_13.php
Wendt, O., & Miller, B. (2012). Quality appraisal of single-subject experimental designs: An overview and comparison of different appraisal tools. Education & Treatment of Children, 35, 235–268.
Wolery, M. (2013). A commentary: Single-case design technical document of the What Works Clearinghouse. Remedial and Special Education, 34, 39–43.