
Additional comparisons of randomization-test procedures for single-case multiple-baseline designs: Alternative effect types.

Joel R Levin, John M Ferron, Boris S Gafurov.

Abstract

A number of randomization statistical procedures have been developed to analyze the results from single-case multiple-baseline intervention investigations. In a previous simulation study, comparisons of the various procedures revealed distinct differences among them in their ability to detect immediate abrupt intervention effects of moderate size, with some procedures (typically those with randomized intervention start points) exhibiting power that was both respectable and superior to other procedures (typically those with single fixed intervention start points). In Investigation 1 of the present follow-up simulation study, we found that when the same randomization-test procedures were applied to either delayed abrupt or immediate gradual intervention effects: (1) the powers of all of the procedures were severely diminished; and (2) in contrast to the previous study's results, the single fixed intervention start-point procedures generally outperformed those with randomized intervention start points. In Investigation 2 we additionally demonstrated that if researchers are able to successfully anticipate the specific alternative effect types, it is possible for them to formulate adjusted versions of the original randomization-test procedures that can recapture substantial proportions of the lost powers.
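The randomized intervention start-point procedures discussed in the abstract can be illustrated with a minimal sketch. The idea: in a multiple-baseline design, each case's intervention start point is randomly selected from a set of admissible points; the observed phase-mean difference (summed across cases) is then compared against the distribution of that statistic over alternative start-point assignments. The sketch below is a hypothetical, simplified Monte Carlo version (an exact test would enumerate all admissible assignments); the function names, the additive test statistic, and the one-sided immediate-abrupt-effect alternative are illustrative assumptions, not the specific procedures compared in the study.

```python
import random

def phase_diff(series, start):
    """Mean(intervention phase) - mean(baseline phase) for one case."""
    baseline, treatment = series[:start], series[start:]
    return (sum(treatment) / len(treatment)) - (sum(baseline) / len(baseline))

def randomization_test(cases, actual_starts, possible_starts,
                       n_perm=5000, seed=1):
    """One-sided Monte Carlo randomization test for an immediate
    abrupt increase in a multiple-baseline design.

    cases:           list of outcome series, one per baseline tier
    actual_starts:   intervention start point actually used per case
    possible_starts: admissible start points per case (the randomization set)
    """
    observed = sum(phase_diff(s, st) for s, st in zip(cases, actual_starts))
    rng = random.Random(seed)
    count = 0
    for _ in range(n_perm):
        # Re-randomize each case's start point within its admissible set.
        perm = [rng.choice(opts) for opts in possible_starts]
        stat = sum(phase_diff(s, st) for s, st in zip(cases, perm))
        if stat >= observed:
            count += 1
    # Add-one correction so the p-value is never exactly zero.
    return (count + 1) / (n_perm + 1)
```

Note how this construction explains the study's central finding: the test statistic rewards an abrupt level shift exactly at the start point, so a delayed or gradual effect dilutes the observed statistic relative to the permutation distribution, which is one way the reported power losses can arise.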
Copyright © 2017 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

Keywords:  Alternative effect types; Multiple-baseline design; Randomization statistical tests; Single-case intervention research


Year:  2017        PMID: 28633936     DOI: 10.1016/j.jsp.2017.02.003

Source DB:  PubMed          Journal:  J Sch Psychol        ISSN: 0022-4405


Related articles: 7 in total

1.  Quantitative Techniques and Graphical Representations for Interpreting Results from Alternating Treatment Design.

Authors:  Rumen Manolov; René Tanious; Patrick Onghena
Journal:  Perspect Behav Sci       Date:  2021-05-13

2.  A Priori Justification for Effect Measures in Single-Case Experimental Designs.

Authors:  Rumen Manolov; Mariola Moeyaert; Joelle E Fingerhut
Journal:  Perspect Behav Sci       Date:  2021-03-25

3.  A Randomized Case Series Approach to Testing Efficacy of Interventions for Minimally Verbal Autistic Children.

Authors:  Jo Saul; Courtenay Norbury
Journal:  Front Psychol       Date:  2021-05-24

4.  [Review] N-of-1 Clinical Trials in Nutritional Interventions Directed at Improving Cognitive Function.

Authors:  Natalia Soldevila-Domenech; Anna Boronat; Klaus Langohr; Rafael de la Torre
Journal:  Front Nutr       Date:  2019-07-23

5.  Blended cognitive behaviour therapy for children and adolescents with mitochondrial disease targeting fatigue (PowerMe): study protocol for a multiple baseline single case experiment.

Authors:  I L Klein; K F E van de Loo; T J Hoogeboom; M C H Janssen; J A M Smeitink; E van der Veer; C M Verhaak; J A E Custers
Journal:  Trials       Date:  2021-03-01       Impact factor: 2.279

6.  A proposal for the assessment of replication of effects in single-case experimental designs.

Authors:  Rumen Manolov; René Tanious; Belén Fernández-Castilla
Journal:  J Appl Behav Anal       Date:  2022-04-25

7.  Randomized Single-Case Intervention Designs and Analyses for Health Sciences Researchers: A Versatile Clinical Trials Companion.

Authors:  Joel R Levin; Thomas R Kratochwill
Journal:  Ther Innov Regul Sci       Date:  2021-04-01       Impact factor: 1.778

