Tessa Taylor, Marc J. Lanovaz
Abstract
Behavior analysts typically rely on visual inspection of single-case experimental designs to make treatment decisions. However, visual inspection is subjective, which has led to the development of supplemental objective methods such as the conservative dual-criteria method. To replicate and extend a study conducted by Wolfe et al. (2018) on the topic, we examined agreement between the visual inspection of five raters, the conservative dual-criteria method, and a machine-learning algorithm (i.e., the support vector classifier) on 198 AB graphs extracted from clinical data. The results indicated that average agreement between the three methods was generally consistent. Mean interrater agreement was 84%, whereas raters agreed with the conservative dual-criteria method and the support vector classifier on 84% and 85% of graphs, respectively. Our results indicate that both objective methods produce results consistent with visual inspection, which may support their future use.
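For context, the conservative dual-criteria method has a standard published formulation: fit a mean line and an ordinary-least-squares trend line to the baseline phase, shift both by 0.25 baseline standard deviations in the direction of the expected change, and count how many treatment points fall beyond both lines, comparing that count with a binomial criterion (p = .5). The sketch below is an illustrative, standard-library-only implementation of that formulation; it is not the authors' code, and the function and parameter names are our own.

```python
import statistics
from math import comb

def cdc_indicates_effect(baseline, treatment, increase=True, alpha=0.05):
    """Conservative dual-criteria (CDC) check (illustrative sketch).

    Fits a mean line and an OLS trend line to the baseline, shifts both
    by 0.25 baseline SDs in the direction of the expected change, counts
    treatment points beyond *both* lines, and compares the count with a
    binomial criterion (p = .5).
    """
    n = len(baseline)
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)          # requires >= 2 baseline points
    xbar = (n - 1) / 2
    ss = sum((x - xbar) ** 2 for x in range(n))
    slope = sum((x - xbar) * (y - mean) for x, y in enumerate(baseline)) / ss
    intercept = mean - slope * xbar
    shift = 0.25 * sd if increase else -0.25 * sd

    hits = 0
    for i, y in enumerate(treatment, start=n):
        mean_crit = mean + shift
        trend_crit = intercept + slope * i + shift
        beyond = (y > mean_crit and y > trend_crit) if increase \
            else (y < mean_crit and y < trend_crit)
        hits += beyond

    # Smallest count k whose binomial tail P(X >= k | m, .5) falls below alpha.
    m = len(treatment)
    criterion = next(k for k in range(m + 2)
                     if sum(comb(m, j) for j in range(k, m + 1)) / 2 ** m < alpha)
    return hits >= criterion
```

For example, a flat, variable baseline followed by a clearly elevated treatment phase would meet the criterion, whereas a treatment phase indistinguishable from baseline would not.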
Keywords: artificial intelligence; conservative dual criteria; interrater agreement; machine learning; visual inspection
Year: 2022 PMID: 35478098 PMCID: PMC9323513 DOI: 10.1002/jaba.921
Source DB: PubMed Journal: J Appl Behav Anal ISSN: 0021-8855
Figure 1. Example of a Dataset Separated by a Support Vector Classifier
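For readers unfamiliar with the classifier named in the caption, the sketch below is a minimal linear support-vector machine trained with Pegasos-style subgradient descent on the hinge loss, which learns a line separating two classes of 2-D points. It is purely illustrative, not the model fit in the study (which would typically use standard tooling), and all names are our own.

```python
def train_linear_svc(points, labels, lam=0.01, epochs=500):
    """Pegasos-style subgradient training of a linear SVM (illustrative).

    `points` are 2-D tuples, `labels` are +1/-1. Returns (weights, bias).
    """
    w = [0.0, 0.0]
    b = 0.0
    t = 0
    for _ in range(epochs):
        for (x1, x2), y in zip(points, labels):
            t += 1
            eta = 1.0 / (lam * t)                      # decreasing step size
            if y * (w[0] * x1 + w[1] * x2 + b) < 1:    # margin violated
                w[0] = (1 - eta * lam) * w[0] + eta * y * x1
                w[1] = (1 - eta * lam) * w[1] + eta * y * x2
                b += eta * y
            else:                                      # only shrink weights
                w[0] *= (1 - eta * lam)
                w[1] *= (1 - eta * lam)
    return w, b

def svc_predict(w, b, point):
    """Which side of the separating line the point falls on: +1 or -1."""
    return 1 if w[0] * point[0] + w[1] * point[1] + b >= 0 else -1
```

On a small linearly separable toy set, the trained line classifies all training points correctly, which is the kind of separation the figure depicts.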
Proportion of Correspondence and Kappa Agreement Between the Different Methods (Binary Outcomes)
| | Expert A | Expert B | Expert C | Expert D | Expert E | CDC Method |
|---|---|---|---|---|---|---|
| Expert A | | | | | | |
| Expert B | .79 / .57 | | | | | |
| Expert C | .80 / .59 | .90 / .77 | | | | |
| Expert D | .72 / .43 | .88 / .71 | .86 / .67 | | | |
| Expert E | .83 / .65 | .91 / .79 | .89 / .75 | .85 / .65 | | |
| CDC Method | .79 / .58 | .84 / .64 | .87 / .72 | .83 / .59 | .85 / .67 | |
| SVC | .78 / .55 | .89 / .75 | .87 / .71 | .84 / .63 | .87 / .71 | .81 / .59 |
Note. For each pair, the proportion of correspondence is on the left of the slash and the kappa value on the right. CDC: conservative dual‐criteria, SVC: support vector classifier.
Correlation Between the Different Methods (Continuous Outcomes)
| | Expert A | Expert B | Expert C | Expert D | Expert E |
|---|---|---|---|---|---|
| Expert A | |||||
| Expert B | .66 | ||||
| Expert C | .66 | .86 | |||
| Expert D | .66 | .82 | .84 | ||
| Expert E | .72 | .90 | .87 | .82 | |
| SVC | .60 | .79 | .76 | .78 | .76 |
Note. SVC: support vector classifier.
Figure 2. Average Agreement of Each Analysis When the Conservative Dual‐Criteria (CDC) and Support Vector Classifier (SVC) Indicated an Effect or No Effect
Proportion of Graphs with Given Phase Lengths in the Presence and Absence of Agreement
| | Number of Data Points in the Phase | | | |
|---|---|---|---|---|
| | 3 | 4 or 5 | 6 to 9 | 10 or more |
| Phase A | | | | |
| Visual inspection – Agreement | .531 | .091 | .143 | .234 |
| Visual inspection – Disagreement | .130 | .087 | .217 | .565 |
| CDC – Disagreement | .133 | .267 | .067 | .533 |
| SVC – Disagreement | .615 | .000 | .231 | .154 |
| Phase B | | | | |
| Visual inspection – Agreement | .354 | .086 | .211 | .349 |
| Visual inspection – Disagreement | .174 | .130 | .217 | .478 |
| CDC – Disagreement | .067 | .267 | .400 | .267 |
| SVC – Disagreement | .308 | .077 | .154 | .462 |
Note. The disagreements for the CDC and SVC are relative to exemplars on which visual raters mostly agreed (see text for details). CDC: conservative dual‐criteria method, SVC: support vector classifier.