Visual analysis in single case experimental design studies: Brief review and guidelines.

Department of Special Education, The University of Georgia, Athens, GA, USA.
Neuropsychological Rehabilitation (Impact Factor: 2.07). 07/2013; DOI: 10.1080/09602011.2013.815636
Source: PubMed

ABSTRACT: Visual analysis of graphic displays of data is a cornerstone of studies using a single case experimental design (SCED). Data are graphed for each participant during a study, with trend, level, and stability of data assessed within and between conditions. Reliable interpretations of the effects of an intervention depend on researchers' understanding and use of systematic procedures. The purpose of this paper is to provide readers with a rationale for visual analysis of data when using a SCED and a step-by-step guide for conducting a visual analysis of graphed data, and to highlight considerations for persons interested in using visual analysis to evaluate an intervention, especially the importance of collecting reliability data for dependent measures and fidelity of implementation of study procedures.
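The within-condition assessment of trend, level, and stability described above can be sketched in code. This is a minimal illustration, not the paper's procedure: the function name, the hypothetical session scores, and the ±25% stability band (a common 80%-criterion heuristic in visual analysis, applied here to a single phase) are all assumptions.

```python
import statistics

def phase_summary(scores, stability_band=0.25):
    """Summarise one phase of a hypothetical SCED data set:
    level (phase median), crude trend direction, and stability
    (share of points within +/- stability_band of the median)."""
    level = statistics.median(scores)
    # Crude trend direction from first vs. last point; formal visual
    # analysis fits a trend line rather than comparing endpoints.
    if scores[-1] > scores[0]:
        trend = "increasing"
    elif scores[-1] < scores[0]:
        trend = "decreasing"
    else:
        trend = "flat"
    lo, hi = level * (1 - stability_band), level * (1 + stability_band)
    stability = sum(lo <= s <= hi for s in scores) / len(scores)
    return {"level": level, "trend": trend, "stability": stability}

# Hypothetical baseline-phase scores (one participant, six sessions).
baseline = [12, 14, 13, 15, 14, 16]
print(phase_summary(baseline))
```

The same summary would be computed for each condition, and the level, trend, and stability values compared between adjacent conditions.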

  • ABSTRACT: This paper introduces the Special Issue of Neuropsychological Rehabilitation on Single Case Experimental Design (SCED) methodology. SCED studies have a long history of use in evaluating behavioural and psychological interventions, but in recent years there has been a resurgence of interest in SCED methodology, driven in part by the development of standards for conducting and reporting SCED studies. Although there is consensus on some aspects of SCED methodology, the question of how SCED data should be analysed remains unresolved. This Special Issue includes two papers discussing aspects of conducting SCED studies, five papers illustrating use of SCED methodology in clinical practice, and nine papers that present different methods of SCED data analysis. A final Discussion paper summarises points of agreement, highlights areas where further clarity is needed, and ends with a set of resources that will assist researchers in conducting and analysing SCED studies.
    Neuropsychological Rehabilitation 04/2014; · 2.07 Impact Factor
  • ABSTRACT: In the context of the evidence-based practices movement, the emphasis on computing effect sizes and combining them via meta-analysis does not preclude the demonstration of functional relations. For the latter aim, we propose augmenting the visual analysis to add consistency to decisions about the existence of a functional relation, without losing sight of the need for a methodological evaluation of which stimuli and reinforcement or punishment are used to control the behavior. Four options for quantification are reviewed, illustrated, and tested with simulated data. These quantifications include comparing the projected baseline with the actual treatment measurements, on the basis of either parametric or nonparametric statistics. The simulated data used to test the quantifications include nine data patterns in terms of the presence and type of effect and comprise ABAB and multiple-baseline designs. Although none of the techniques is completely flawless in terms of detecting a functional relation only when it is present but not when it is absent, an option based on projecting split-middle trend and considering data variability as in exploratory data analysis proves to be the best performer for most data patterns. We suggest that the information on whether a functional relation has been demonstrated should be included in meta-analyses. It is also possible to use as a weight the inverse of the data variability measure used in the quantification for assessing the functional relation. We offer easy-to-use code for open-source software for implementing some of the quantifications.
    Behavior Modification 08/2014; 38(6):878-913. · 1.70 Impact Factor
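The split-middle projection mentioned in the abstract above can be sketched as follows. This is an illustrative implementation of the general split-middle technique (split the baseline in half, draw a line through the median point of each half, and extend it into the treatment phase), not the authors' code; the function names and AB data are hypothetical.

```python
import statistics

def split_middle_trend(baseline):
    """Fit a split-middle trend line to baseline data: split the phase
    in half, take the median session index and median score of each
    half, and pass a line through the two median points."""
    n = len(baseline)
    first, second = baseline[:n // 2], baseline[-(n // 2):]
    x1 = statistics.median(range(len(first)))           # median index, first half
    y1 = statistics.median(first)                       # median score, first half
    x2 = statistics.median(range(n - len(second), n))   # median index, second half
    y2 = statistics.median(second)                      # median score, second half
    slope = (y2 - y1) / (x2 - x1)
    intercept = y1 - slope * x1
    return slope, intercept

def project_baseline(baseline, n_treatment):
    """Extend the fitted baseline trend into the treatment phase."""
    slope, intercept = split_middle_trend(baseline)
    start = len(baseline)
    return [intercept + slope * (start + i) for i in range(n_treatment)]

# Hypothetical AB data: frequency of a target behaviour per session.
baseline = [12, 14, 13, 15, 14, 16]
treatment = [10, 8, 7, 5, 4, 3]

projected = project_baseline(baseline, len(treatment))
# Treatment points falling below the projected baseline trajectory are
# consistent with an intervention effect beyond the baseline trend.
below = sum(obs < proj for obs, proj in zip(treatment, projected))
print(f"{below}/{len(treatment)} treatment points below projected baseline")
```

The abstract's preferred option additionally accounts for baseline variability, in the spirit of exploratory data analysis, before judging whether the departure from the projected trend demonstrates a functional relation; that step is omitted here.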

