Alexander MacIntosh1,2,3, Eric Desailly4, Nicolas Vignais3,5, Vincent Vigneron6, Elaine Biddiss1,2. 1. Bloorview Research Institute, Holland Bloorview Kids Rehabilitation Hospital, Toronto, Canada. 2. Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Canada. 3. Complexité, Innovation, Activités Motrices et Sportives, Sciences du Sport, de la Motricité et du Mouvement Humain, Université Paris-Saclay, Orsay, France. 4. Recherche et innovation, Fondation Ellen Poidatz, Saint Fargeau-Ponthierry, France. 5. Complexité, Innovation, Activités Motrices et Sportives, Université d'Orléans, Orléans, France. 6. Informatique, Bio-informatique et Systèmes Complexes, l'Université d'Evry Val-d'Essonne, Evry, France.
Abstract
IMPORTANCE/BACKGROUND: Movement-controlled video games have potential to promote home-based practice of therapy activities. The success of therapy gaming interventions depends on the quality of the technology used and the presence of effective support structures. AIM: This study assesses the feasibility of a novel intervention that combines a co-created gaming technology integrating evidence-based biofeedback with solution-focused coaching (SFC) strategies to support therapy engagement and efficacy at home. METHODS: Following feasibility and single-case reporting standards (CONSORT and SCRIBE), this was a non-blinded, randomized, multiple-baseline AB design. Nineteen (19) young people with cerebral palsy (8–18 years old) completed the 4-week home-based intervention in France and Canada. Participant motivations, personalized practice goals, and the relevance of the intervention to daily activities were discussed in SFC-style conversations pre- and post-intervention and during weekly check-ins. Participants controlled a video game by completing therapeutic gestures (wrist extension, pinching) detected via electromyography and inertial sensors on the forearm (Myo Armband and custom software). Process feasibility success criteria for recruitment response, completion and adherence rates, and frequency of technical issues were established a priori. Scientific feasibility, effect size estimates and variance were determined for Body Function outcome measures (active wrist extension, grip strength and the Box and Blocks Test) and for Activities and Participation measures (Assisting Hand Assessment (AHA), Canadian Occupational Performance Measure (COPM) and Self-Reported Experiences of Activity Settings (SEAS)). RESULTS: Recruitment response (31%) and assessment completion (84%) rates were good, and 74% of participants reached self-identified practice goals.
As 17% of technical issues required external support to resolve, the intervention was graded as feasible with modifications. No adverse events were reported. Moderate effects were observed in Body Function measures (active wrist extension: SMD = 1.82, 95% CI 0.85 to 2.78; grip strength: SMD = 0.63, 95% CI 0.65 to 1.91; Box and Blocks: Hedge's g = 0.58, 95% CI -0.11 to 1.27) and small-to-moderate effects in Activities and Participation measures (AHA: Hedge's g = 0.29, 95% CI -0.39 to 0.97; COPM: r = 0.60, 95% CI 0.13 to 0.82; SEAS: r = 0.24, 95% CI -0.25 to 0.61). CONCLUSION: A definitive RCT to investigate the effectiveness of this novel intervention is warranted. Combining SFC-style coaching with high-quality biofeedback may positively engage youth in home rehabilitation to complement traditional therapy. TRIAL REGISTRATION: ClinicalTrials.gov, U.S. National Library of Medicine: NCT03677193.
Interactive computer play (ICP) is “any kind of computer game or virtual reality technology where the individual can interact and play with virtual objects in a computer-generated environment” [1]. It is an attractive way to augment traditional therapy and aligns with children’s interests: eight of 10 young people with cerebral palsy (CP) enjoy playing video games recreationally [2]. CP is a neuromuscular disability affecting approximately 2.11 per 1,000 live births in high-resource settings [3]. It presents with positive and negative motor signs including spasticity, weakness, impaired selective motor control and sensory deficits [4]. ICP has been used to improve balance [5], gait symmetry [6], upper limb strength [7] and other functional abilities in people with CP. Perhaps the largest study to date in this field is the ‘Move it to improve it’ (MITII) randomized controlled trial, which used web-based therapy at home to improve occupational performance and visual perception in children with unilateral CP [8]. Such studies have shown moderate evidence towards improving balance and overall motor skill but weak evidence towards improving upper extremity skills, joint control, gait and strength [9]. In part, the success of ICP-based therapies in the home has been thwarted by the challenge of (i) delivering high-quality feedback that informs individuals with CP of their progress towards therapy goals as a therapist would in a clinic setting [10]; and (ii) sustaining engagement in the intervention over an extended period. Children with CP and their families have expressed a desire for more accurate feedback within ICP [8] and reported the challenge of sustaining intrinsic interest in ICP therapies in the home.
High quality feedback
Improving feedback quality in ICP can increase practice engagement and efficiency at home [11]. Biofeedback, where a person receives information about their body state (e.g. heart rate, foot speed, muscle activity) [1], can help increase awareness and control by informing them how their body is functioning [11]. Echoing families’ remarks, a recent systematic review found that most interventions using biofeedback do so in a way that positions the person in a passive role in their practice and builds dependency, hindering motor learning progress [10]. Working with young people with CP and clinicians, we recently co-created an evidence-based biofeedback strategy (S1 File). The implementation improves biofeedback use (efficiency, effectiveness, engagement) by offering autonomy, varying feedback presentation (e.g. visual, audio), and being proportionate to the person’s ability [10]. These mechanisms serve to improve the effectiveness of ICP therapies. They help direct the person to higher-quality movements at home, where practice is completed without therapist supervision, and provide a stronger cognitive link between game-focused (e.g. scoring points) and therapy-focused (e.g. decreased compensatory movements) goals (S1 File).
Sustaining engagement
Low adherence is a primary concern in home-based interventions, historically ranging from 34–67% [12,13]. A recent review consistently found engagement and adherence difficult to maintain [14]. To improve adherence and engagement, an intervention must closely align with participants’ values and sustain their intrinsic motivation [14]. Intrinsic motivation is influenced by both the ICP technology (feedback, activity personalization) and the intervention design (therapist interaction, social support). Solution-focused coaching in pediatric rehabilitation (SFC-Peds) is a model of coaching recommended for youth with disabilities [15]. SFC-Peds builds intrinsic motivation to generate personal interest in health behaviour changes [16,17]. In SFC-Peds, coaches collaborate with children to help them envision their “preferred future” [15]. Through this process, children develop therapy goals and a supporting plan that aligns with their priorities.
Aim
The success of an ICP intervention is influenced both by the technology used and the supports provided. In this project, we investigate the combination of a novel ICP technology integrating evidence-based biofeedback and Solution-Focused Coaching strategies to promote home-based practice of hand/arm exercises. The ICP is a video game in which participants complete therapeutic hand gestures to control game actions on-screen. The approach aims to provide a motivational, goal-based environment to address muscle weakness and selective motor control. This paper addresses intervention feasibility as articulated by Thabane et al (2010) [18]. This framework highlights that the aim of feasibility testing can relate to one or more of four classifications: process, resources, management and scientific. The objective here is to support development towards future randomized controlled trials, not to establish the statistical or clinical effectiveness of the intervention [18]. In this study, we concentrate on two of the four feasibility classifications. First, we assess the process feasibility of the biofeedback-enhanced therapy video game intervention protocol for young people with CP. The objective is to determine the ability to enroll participants, enable home-based practice, and retain their activity during a 1-month intervention. To this purpose, a priori success criteria were established for recruitment and response rates, adherence, and the frequency of technical difficulties impeding home practice [18]. Second, we assess the scientific feasibility of the intervention by estimating the effect size and variance for six person-centred outcome measures for the hand and wrist. The measures align with the Body Functions and Activities and Participation chapters of the International Classification of Functioning, Disability and Health (ICF) [19,20].
Methods
Design
A randomized, multiple-baseline, single-case experimental design (SCED) with two phases was applied. SCED designs can provide strong evidence wherein participants serve as their own controls for the purpose of within-subject comparison [21]. SCED research typically involves collecting a representative baseline phase through repeated measurements of an outcome of interest (Phase 1) that is then compared with the intervention phase (Phase 2). In a randomized SCED design, the time lapse between Phase 1 and Phase 2 is randomly allocated for each participant to further mitigate threats to internal validity [22]. SCEDs are increasingly used in clinical intervention research; particularly when sample size is limited, they provide a rigorous approach to generating high-quality evidence [23]. In this feasibility study, the Single-Case Reporting Guideline in Behavioural Interventions (SCRIBE) [21] and the Consolidated Standards of Reporting Trials (CONSORT) methodologies were used (see S2 File and S3 File) [24]. Participants, researchers and assessing therapists were not blinded to treatment phase. One methodological change occurred during the first week of enrollment: inclusion criteria were expanded to include those with mixed tone and mild dystonia, since it was found that they could control the game effectively and the potential therapeutic value was confirmed by clinicians. Procedural fidelity was maintained by following a standard operating protocol outlining the activities and resources for each phase (S4 File). Ethical approval was obtained from Holland Bloorview’s Research Ethics Board (approved July 27, 2018; amended November 23, 2018, to include participants with mixed tone and mild dystonia) and the French national Comité de protection des personnes (CPP) (approved July 6, 2018). The authors confirm that all ongoing and related trials for this intervention are registered.
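The per-participant randomization of the Phase 1 to Phase 2 lag can be sketched as follows. This is an illustrative Python sketch, not the study's actual allocation code; the function name and the dictionary return type are assumptions, while the 1–10 day window is taken from the protocol description later in the paper.

```python
import random

def allocate_start_delays(participant_ids, min_delay=1, max_delay=10, seed=None):
    """Randomly assign each participant the number of days between the end
    of baseline (Phase 1) and the start of intervention (Phase 2), as in a
    randomized multiple-baseline SCED."""
    rng = random.Random(seed)  # seeded generator so an allocation is reproducible
    return {pid: rng.randint(min_delay, max_delay) for pid in participant_ids}

# example: five hypothetical participants
delays = allocate_start_delays(["A", "B", "C", "D", "E"], seed=7)
```

Because each participant starts the intervention at a different, randomly chosen time, any coincidental external event is unlikely to align with every phase change, which is the internal-validity argument for randomizing the lag.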
Participants
From September to October 2018, ten participants were recruited from a regional rehabilitation hospital in a metropolitan city in Canada. From November 2018 to January 2019, ten participants were recruited from a rural regional rehabilitation hospital in France.

Inclusion criteria:
- Cerebral palsy diagnosis.
- 8–18 years old.
- Manual Abilities Classification System (MACS) levels I–III [25].
- Having a goal relating to improving hand/wrist function.
- Dominantly spastic presentation. This original criterion was expanded to include mixed tone and mild dystonia.
- Normal or corrected-to-normal vision and hearing.
- Able to co-operate, understand, and follow simple instructions for game play.
- Having passive wrist extension at least 10° greater than active wrist extension.

Exclusion criteria:
- Receiving active therapy of the hand/wrist.
- History of unmanaged epilepsy.
- Having received a Botulinum Toxin treatment within 3 months, or constraint-based movement therapy within 6 months, before study enrollment.
- Visual, cognitive or auditory disability that would interfere with play.
- Unable to commit an estimated minimum of 5 hours to the training plan over four weeks.
Sample size rationale
The sample size of twenty was determined based on recommended sample sizes for a future definitive RCT detecting a small (0.2) to medium (0.5) effect with 90% power and two-sided 5% significance. Based on estimates from previous interventions targeting active wrist extension in the literature [26-29], a pre/post change of 10° with an expected standard deviation of 4–12° would require between 8 and 22 participants [30].
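As a rough check of how the required sample grows with the assumed variability, a normal-approximation formula for a paired pre/post design can be sketched. This is an illustrative sketch only: the published 8–22 range was derived with the method in [30] and likely rests on different design assumptions, so the exact numbers here will differ.

```python
from math import ceil

Z_ALPHA = 1.96    # critical value for two-sided 5% significance
Z_BETA = 1.2816   # critical value for 90% power

def paired_n(delta, sd):
    """Normal-approximation sample size for detecting a mean pre/post
    change `delta` when the within-subject change has standard
    deviation `sd`: n = ((z_alpha + z_beta) * sd / delta) ** 2."""
    return ceil(((Z_ALPHA + Z_BETA) * sd / delta) ** 2)

# expected 10 degree improvement in active wrist extension,
# standard deviation of the change between 4 and 12 degrees
required = {sd: paired_n(10, sd) for sd in (4, 8, 12)}
```

The point the sketch makes is the quadratic sensitivity to the assumed standard deviation: tripling the SD (4° to 12°) multiplies the required sample by nine, which is why the paper reports a range rather than a single number.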
Recruitment
Occupational therapists and developmental pediatricians identified participants who met inclusion criteria from their existing or previous caseload. In Canada, eligible participants were also identified through the hospital’s centralized recruitment database, connect2research. In Canada, the researcher telephoned potential participants after sending an invitation letter by mail, then screened interested participants and obtained written informed consent. Recruitment in Canada took place from September to November 2018. In France, the developmental pediatricians invited eligible individuals to participate and obtained written informed consent from interested participants. Recruitment in France took place from November 2018 to January 2019. Caregivers gave consent and were consulted to ensure the child could provide consent or assent. If capable, the child gave consent/assent [31].
Protocol
The description follows the Template for Intervention Description and Replication (TIDieR) guidelines; see S5 File [32] and S6 File.
Baseline (phase A)
Participants met once with the researcher and occupational therapist (60 minutes) in clinic. In a Solution-Focused Coaching-style [15,17] conversation, they discussed motivations for participating, personalized scheduling and practice goals, and how the intervention connects to daily activities. By the end of this conversation, participants established Canadian Occupational Performance Measure (COPM) goals [33]. Caregivers were present if desired. The dialogue was intended to improve cognitive engagement and, consequently, home-play adherence [16]. Therapists and researchers guiding these conversations received one day of formal training and practiced scenarios with a certified SFC coach. Fidelity to the coaching style was maintained by following a conversation checklist developed with the certified SFC coach (S4 File). Following the coaching conversation, a therapist assessed bimanual performance (Assisting Hand Assessment (AHA)) and gross manual dexterity (Box and Blocks (B&B)). The researcher visited each participant’s home for multiple-baseline testing of wrist extension and grip strength (3–6 visits, 30-minute sessions). The number of baseline sessions was data-driven, continuing until stability was established in the primary measure of effectiveness: active wrist extension–open fingers (AWEO). Stability was defined as 80% of phase 1 data falling within the interquartile range [34]. After baseline, participants waited a computer-generated randomized number of days (between 1 and 10) to begin the intervention.
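The stability rule for ending baseline can be expressed as a small check. This is an illustrative Python sketch, not the study's software; the linear-interpolation quartile estimate is an assumption, since the source does not specify a quartile method.

```python
def is_stable(baseline, threshold=0.8):
    """Return True when at least `threshold` (here 80%) of baseline
    measurements fall within the interquartile range (Q1..Q3)."""
    xs = sorted(baseline)
    n = len(xs)

    def quantile(q):
        # linear interpolation between order statistics (an assumption:
        # the paper does not prescribe a quartile estimation method)
        pos = q * (n - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        return xs[lo] + (pos - lo) * (xs[hi] - xs[lo])

    q1, q3 = quantile(0.25), quantile(0.75)
    within = sum(q1 <= x <= q3 for x in baseline)
    return within / n >= threshold
```

In use, the researcher would keep adding baseline sessions until `is_stable` over the AWEO series returns True (or the 3–6 visit cap is reached); a series with one outlying measurement, for example, fails the 80% criterion and triggers another visit.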
Activity description
During 1–2 baseline sessions the researcher habituated the participant to the ICP activity system controls. Participants learned to control the video game using one of the following therapeutic gestures: wrist extension–open fingers, wrist extension–closed fingers, finger–thumb pinch, or supination. In the SFC-Peds-style conversation, therapists helped participants identify which gesture to practice based on the daily activities established to be important to them. All but two participants practiced wrist extension. In the game, participants are rewarded for making the gesture at the correct time with high movement quality (i.e. high forearm extensor activity and isolated hand movement).
Intervention (phase B)
After the randomized waiting period, the researcher gave participants the system to practice with at home. The system includes hardware (laptop, electromyography (EMG) and inertial sensor (Myo Armband)) and software (an adapted commercial video game (Dashy Square) and custom software to interpret movements and control the game (MATLAB 2017b)). Participants practiced at home alone for 4 weeks according to the self-defined practice schedule established during the initial conversation. Once per week, the researcher visited each participant. During the 60-minute visit they:
- Recorded gameplay with a video camera and electro-goniometer
- Measured wrist extension and grip strength
- Had a ‘check-in’ conversation to re-evaluate the self-defined practice goals
The check-in followed the SFC conversation guideline EARS (elicit, amplify, reinforce, start again) [35] and served to gauge satisfaction with progress and motivation, and to modify practice goals and game difficulty if necessary. At the first and last weekly visits, participants completed the Self-Reported Experiences of Activity Settings (SEAS) questionnaire to measure interaction, engagement, and sense of control while playing at home [36].
Post-intervention
Within 2 weeks of the final visit, participants returned to clinic for a 60-minute assessment with the researcher and occupational therapist. The therapist re-evaluated bimanual performance (AHA), gross manual dexterity (B&B), and COPM goals. The researcher conducted a semi-structured interview to gain participants’ subjective evaluations of the intervention [37] (S4 File). Finally, a member of the research team other than the one who completed the home visits made a 5–10 minute telephone call to each participant’s caregiver, asking questions and noting responses related to system use and integration into home life (S4 File).
Outcomes
To address process feasibility, a priori success criteria were compared to observed outcomes. Success criteria were based on previous recruitment, adherence and technical performance results of a home-based exercise intervention in a similar population [38,39].

Process feasibility, a priori success criteria:
- ≥10% response rate from all eligible participants [38].
- ≥80% of participants successfully complete the study (i.e. completed at least 3 repeated measures during phases A and B, and completed assessments at baseline and post-intervention).
- Participants meet their self-identified practice goal (within ≥66% of the identified frequency and duration).
- Participants are not prohibited from practicing due to technical constraints (e.g. after instruction, participants could start and play the game, technical challenges were overcome with the provided aid, and participants were not forced to cancel a practice session due to technical limitations). Participants may report multiple issues during the 1-month intervention.

The study is then given one of the following recommendations, according to criteria set by [18]: Not feasible; Feasible with minor modifications; Feasible with close monitoring; Feasible as is.

Towards assessing scientific feasibility, the size and variance of the effect of the intervention on participant-centred outcome measures are evaluated. As no single measure covers all aspects of function and experience during home-based interventions, complementary measures are used to capture changes across two ICF chapters [19,20].

Measures for ICF chapter Body Function, changes in:
- active wrist extension–open fingers (AWEO)
- grip strength (GRIP)
- gross manual dexterity (B&B)

First, to evaluate the capacity with which participants can open their hand, a manual goniometric measurement of active wrist extension [40] with open and closed fingers was made. Participants start with the elbow in 90 degrees of flexion, the forearm pronated to the extent possible and the upper arm alongside the trunk.
With the forearm fixed by the assessor, the child performs three wrist extensions per side [41]. Positive values indicate extension above the neutral wrist position, and values are recorded to the nearest five degrees, the minimal detectable difference [42]. In the relevant population, passive movement tests show very good test–retest reliability (intraclass correlation coefficients (ICC): 0.81–0.94) and moderate inter-rater reliability (correlation coefficients 0.48–0.73) [43]. Second, grip strength was measured using a modified sphygmomanometer to evaluate relative changes in grip capacity [44]. The child sits with the arm adducted, the elbow flexed at 90 degrees and the forearm and wrist in neutral position (if possible). Participants maximally squeeze the device three times per side. The test is completed on both sides and the strength of the non-dominant hand relative to the dominant is reported. While normative data for children’s grip strength using the modified sphygmomanometer are not available, [44] found excellent test–retest reliability (Pearson correlation coefficient of 0.97) and [45] reported high intra-rater reliability (ICC = 0.92). Third, the Box and Blocks Test measured gross manual dexterity. The number of blocks a participant can move over a center divider in one minute is counted for both hands [46]. The B&B test shows high inter-rater reliability (ICCs >0.95) and test–retest reliability (ICCs >0.95) in children 6–19 years [47] and in adults with hemiplegia [48].

Measures for ICF chapter Activities and Participation, changes in:
- functional bimanual performance (AHA)
- perceived functional performance in a self-identified goal (COPM)
- perception of meaningful participation experiences (SEAS)

Three measures address the ICF chapter Activities and Participation. First, the AHA quantifies spontaneous functional bimanual performance. Progressing through a board game guided by a trained occupational therapist, participants complete bimanual tasks such as opening a box or shuffling cards [49].
There are twenty tasks, scored on a 4-point scale. The smallest detectable change for the AHA is 5 logit units (scaled from 0–100). In adolescents with unilateral CP up to age 18, the AHA shows good construct validity and excellent inter-rater (ICCs 0.94–0.98) and test–retest reliability (ICCs 0.98–0.99) [49,50]. Second, the COPM evaluates perceived changes in the satisfaction and performance of self-identified goal areas [33]. In conversation with a trained therapist, participants rated a primary goal area, and a secondary one if desired, from 1–10 in terms of importance, performance and satisfaction. Goal areas were re-evaluated post-intervention [33]. The primary goal’s perceived change in performance and satisfaction is evaluated here [51]. A change of 2 points is considered the minimal clinically meaningful change [52]. The COPM has been reported as valid, reliable, and responsive [53]. Third, the SEAS evaluates experience with an activity in four domains: personal growth, psychological engagement, meaningful interactions, and choice and control [36]. It is a 22-item questionnaire completed independently or with parental/researcher assistance. Scores are interpreted on a 7-point Likert scale (from +3 to -3). The questionnaire has good internal consistency and test–retest reliability (Cronbach’s alpha from 0.71 to 0.88, mean scale ICC = 0.68) [54].
Analyses
For process feasibility, participant recruitment and demographic characteristics are presented as per CONSORT recommendations [24]. Success criteria are reported descriptively and narratively. Technical issues, and the resources required for their resolution, were documented and reported. Recommendations for the design of a future clinical trial are based on the number of, and extent to which, success criteria were met.

For scientific feasibility related to Body Function measures, the size and variance of the intervention effect on active wrist extension and grip strength are calculated for the baseline and intervention phases. SCRIBE recommends a combined visual and statistical approach for SCED data [55]. The statistical approaches improve the estimate of the effect and can help account for serial dependence, variability and trends in the time-series data [23,56]. As the objective of the study is to inform a definitive RCT, the analysis focuses on estimating treatment effect size and variance; statistical significance tests are not reported, as they are underpowered in this design [21,57,58]. Level- and slope-change differences between phases, percentage of all non-overlapping data (PAND), and standardized mean difference (SMD, d-statistic with 95% confidence interval (CI)) show the effect size and variance [23,34,56]. Changes in Box and Blocks performance are described at individual and group levels with Hedge’s g effect size and 95% CI.

For scientific feasibility related to Activities and Participation measures, the effect size (Hedge’s g) and 95% CI are reported for changes in functional bimanual performance (AHA, Assisting Hand Assessment). Effect sizes for non-parametric data (COPM and SEAS) are reported using the matched-pairs rank-biserial correlation (r) with 95% CI by bootstrapping [59]. These analyses were completed in R (v. 3.7) employing the packages SingleCaseES [60] and RcmdrPlugin [23].
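For illustration, two of the named between-phase statistics can be sketched as follows. This is a Python sketch of the general techniques, not the authors' code (the study used the R packages SingleCaseES and RcmdrPlugin); it assumes larger values indicate improvement, and the simple PAND search over observed thresholds is one of several equivalent formulations.

```python
from statistics import mean, stdev

def smd(phase_a, phase_b):
    """Standardized mean difference (d-statistic) between baseline (A)
    and intervention (B) phases, using a pooled standard deviation."""
    na, nb = len(phase_a), len(phase_b)
    pooled = (((na - 1) * stdev(phase_a) ** 2 +
               (nb - 1) * stdev(phase_b) ** 2) / (na + nb - 2)) ** 0.5
    return (mean(phase_b) - mean(phase_a)) / pooled

def pand(phase_a, phase_b):
    """Percentage of all non-overlapping data (PAND): the share of all
    data points remaining after removing the fewest points needed so
    every intervention value exceeds every baseline value."""
    n = len(phase_a) + len(phase_b)
    # candidate separation thresholds: each observed value, plus one above all
    candidates = sorted(set(phase_a) | set(phase_b))
    candidates.append(candidates[-1] + 1)
    fewest = min(sum(a >= t for a in phase_a) + sum(b < t for b in phase_b)
                 for t in candidates)
    return (n - fewest) / n
```

With fully separated phases, `pand` returns 1.0 (no overlap), while heavily overlapping phases push it towards 0.5; `smd` carries the sign of the change, so a positive value indicates improvement from baseline under the assumption above.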
Results
Participant and recruitment characteristics
Fig 1 shows the CONSORT recruitment flow chart. Participant enrolment started in September 2018 and was completed in January 2019. The target number of participants was reached. Table 1 shows baseline demographic and clinical characteristics. System usage varied across individuals but averaged 4±1 days/week (8–24 days total), 17±9 minutes/day (37–333 minutes total), and 163±59 gesture repetitions/day (997–5698 total). No adverse events were reported. Two participants reported mild forearm muscle soreness during a weekly check-in. The soreness lasted for one day and resolved naturally without intervention.
Fig 1
CONSORT recruitment flow chart.
Standardized recruitment flowchart depicting the number of screened, enrolled, allocated and assessed participants. Clinical assessments were completed before and after the intervention with an occupational therapist: Box and Blocks, Assisting Hand Assessment and COPM goals. As in a single-case design, all participants were allocated to the same intervention.
Table 1
Baseline demographic and clinical characteristics.
| ID | Age | Sex | MACS | Affected Side | CP Type | Notes |
|----|------|-----|------|---------------|---------|-------|
| A | 14.0 | M | I | L | SH | Control gesture: pinch |
| B | 13.0 | F | I | R | SH | Physiotherapy once weekly (lower limb) |
| C | 9.8 | M | I | R | SH | Learning disability |
| D | 9.8 | F | I | R | SH | - |
| E | 16.2 | F | II | R | SH | - |
| F | 10.5 | F | I | R | SH | Physiotherapy once weekly (upper/lower limb), control gesture: pinch |
| G | 10.5 | F | I | R | SH | - |
| H | 9.9 | F | II | R | MT | - |
| I | 13.8 | M | I | L | SH | - |
| J | 9.7 | M | II | R | MT | Learning disability |
| K | 10.4 | F | I | L | SH | Features of ADHD, intellectual disability, speech and language delays |
| L | 10.9 | M | II | R | MT | ADHD, seizure disorder, learning disability |
| M | 8.4 | M | I | L | MD | ADHD |
| N | 12.9 | M | II | R | MT | ASD, epilepsy |
| O | 8.4 | F | I | R | MT | - |
| P | 14.5 | M | II | R | MT | ADHD |
| Q | 17.4 | F | I | R | MT | - |
| R | 11.8 | F | II | R | SH | - |
| S | 9.8 | M | I | L | SH | Learning disability |

Notes describe secondary diagnoses and comments reported by therapists. All other control gestures were wrist extension–open fingers or wrist extension–closed fingers. Abbreviations: Mild Dystonia-MD, Spastic Hemiplegia-SH, Mixed tone-MT, attention deficit hyperactivity disorder-ADHD, Manual Abilities Classification System-MACS, Right-R, Left-L.
Process feasibility success criteria
Table 2 summarizes the a priori success criteria evaluation. As most criteria (i.e. recruitment rate, completion) but not all (i.e. frequency of technical issues) were met, the recommendation is ‘feasible with minor modifications’.
Table 2
Feasibility success criteria evaluation.
| Criteria | Percent achieved | Evaluation description | Criteria met |
|----------|------------------|------------------------|--------------|
| ≥10% response rate | 31% | 19/62 of eligible participants were recruited | Yes |
| ≥80% complete study | 84% | 16/19 participants completed all assessments | Yes |
| ≥66% of the self-identified practice goals met | 74% | 14/19 participants met goal criteria* | Partial |
| 0 practice restrictions from technical issues | 17% | 6/36 reported technical issues were not resolved immediately and restricted practice | No |
* Partial completion as some but not all participants (74%) reached ≥66% of the self-identified practice goals.
Scientific feasibility
Table 3 shows individual system utilization (dose) and scores for the six person-centred outcome measures.
Table 3
Individual system usage and outcomes.
Session dose (median, IQR): Reps. = average daily repetitions, Minutes Active, Minutes in System. Response: starting (S) and finishing (F) scores for Body Function (AWEO, Grip, B&B) and Activities and Participation (AHA, COPM-P, COPM-S, SEAS).

ID | Days | Reps. | Minutes Active | Minutes in System | AWEO (°) S/F | Grip (/1) S/F | B&B (blocks) S/F | AHA (logit) S/F | COPM-P (/10) S/F | COPM-S (/10) S/F | SEAS (+3 to -3) S/F
A | 12 | 114 (161) | 7.1 (11) | 10.5 (12) | 36/50 | 0.5/0.48 | 30/39 | 62/57 | 3/2 | 2/3 | 3/2
B | 21 | 238 (322) | 15.6 (17.7) | 29.7 (21.6) | 26/45 | 0.38/0.53 | 16/24 | 55/54 | 4/7 | 4/5 | 2/2
C | 18 | 74 (75) | 4.3 (5) | 12.5 (12) | 36/50 | 0.5/0.46 | 16/17 | 43/43 | 1/3 | 1/5 | 2/2
D | 18 | 90 (149) | 4.1 (6.6) | 16.5 (19.3) | 22/35 | 0.47/0.58 | 16/23 | 55/55 | 4/4 | 4/2 | 0/0
E | 24 | 186 (169) | 6.1 (5.8) | 15.3 (17.7) | -23/-5 | 0.19/0.19 | 11/8 | 43/46 | 4/3 | 2/8 | 1/2
F | 21 | 68 (119) | 7.2 (14.7) | 25.2 (24.3) | 9/25 | 0.24/0.45 | 26/29 | 54/55 | 7/8 | 8/7 | 2/2
G | 20 | 103 (123) | 7.6 (15.9) | 23.8 (15.5) | -8/20 | 0.51/0.64 | 19/18 | 64/70 | 2/5 | 3/10 | -2.5/-2
H | 14 | 186 (182) | 6.9 (7.1) | 12.7 (17.7) | -14/20 | 0.42/0.47 | 5/3 | 52/54 | 3/7 | 3/10 | 3/3
I | 11 | 65 (50) | 3 (2.5) | 7.2 (8.7) | -5/- | 0.20/- | 17/- | 48/- | 3/- | 3/- | 3/2
J | 14 | 174 (192) | 6.2 (4.6) | 9.8 (7.4) | -20/0 | 0.44/0.44 | 16/16 | 57/48 | 3/5 | 0/0 | 2/1
K | 21 | 143 (82) | 8.3 (9.8) | 13.6 (14.6) | 30/35 | 0.61/0.76 | 22/25 | 55/63 | 5/8 | 3/8 | 3/3
L | 12 | 210 (212) | 6.4 (6.7) | 23.5 (13.9) | 11/30 | 0.51/0.57 | 22/26 | 48/50 | 5/6 | 5/6 | 2/2
M | 16 | 117 (192) | 5.2 (4.2) | 9.3 (11.3) | 61/55 | 0.73/0.78 | 31/31 | 76/77 | 5/10 | 5/10 | 2.5/2
N | 14 | 50 (82) | 4.9 (6) | 10 (8.1) | -/- | -/- | 10/13 | 32/40 | 5/6 | 8/6 | 3/2
O | 12 | 196 (258) | 6.3 (7.5) | 10.8 (22.8) | 35/45 | 0.53/0.58 | 32/41 | 77/79 | 4/5 | 5/5 | 3/3
P | 14 | 112 (90) | 5.1 (3.6) | 7 (5.3) | 13/10 | 0.39/0.49 | 13/15 | 50/54 | 4/4 | 4/2 | 1/2
Q | 14 | 79 (39) | 4.5 (2.9) | 10.2 (10.8) | 41/40 | 0.42/0.55 | 20/17 | 59/57 | 2/4 | 1/- | 2/2
R | 17 | 83 (141) | 3.9 (9.2) | 11.7 (10.7) | -16/5 | 0.43/0.32 | 24/33 | 48/50 | 4/4 | 4/2 | 2/1
S | 8 | 205 (128) | 8.2 (6.4) | 13.3 (11.9) | 23/25 | 0.42/0.48 | 20/17 | 62/62 | 5/3 | 5/1 | 3/3
Participants’ system usage is shown as total days played (Days), average daily repetitions (Reps.), average time spent actively playing in the system (Minutes Active), and average time using the system (Minutes in System). Body Function and Activities and Participation measures are also presented. Starting (S) scores are: the median values at baseline for AWEO (Active wrist extension–open fingers; positive values indicate extension above neutral) and Grip (non-dominant grip strength relative to dominant); therapist-assessed baseline values for B&B (Box and Blocks Test) and AHA (Assisting Hand Assessment, logit score/100), COPM-P (primary goal’s performance score) and COPM-S (primary goal’s satisfaction score); and SEAS (overall score on a 7-point Likert scale) assessed in the first week of play. Finishing (F) scores are: the median score during the final week of the intervention for AWEO and Grip; therapist assessment within 2 weeks of the end of the intervention for B&B, AHA and COPM; and SEAS after the final day of practice. Insufficient or uncollected data are denoted by -.
Body function
Active wrist extension (AWEO) increased 12±12°, a moderate to large effect (SMD = 1.82, 95%CI = 0.85–2.78). Grip strength also increased (17±18%), a small to moderate effect (SMD = 0.63, 95%CI = -0.65–1.91). Fig 2 shows the number of participants with small, moderate and large effects through slope and level change, and PAND analysis. An increase in at least one Body Function measure was seen in 14/19 participants (see S7 File for slope and level changes for each participant).
Fig 2
Visual analyses summary for slope, level and non-overlapping data.
Number of participants showing changes between baseline and intervention phases (total N = 17). Active wrist extension–open fingers (AWEO, dark) and grip strength (light). (a) Slope changes, increasing indicates intervention phase slope is greater than baseline slope. (b) Level changes determined by split-middle method, small—Intervention phase < 5° (AWEO) or <5% (Grip Strength) from Baseline, moderate—Intervention phase 5–15° (AWEO) or 5–15% (Grip Strength) from Baseline, and large—Intervention phase >15° (AWEO) or >15% (Grip Strength) from Baseline. (c) Percent of all non-overlapping data, 50–80% of all non-overlapping data between phases indicates moderate separation between baseline and intervention and >80% of all non-overlapping data between phases indicates large separation between baseline and intervention.
Change scores for non-dominant Box & Blocks performance showed a moderate effect (Hedge’s g = 0.58, 95%CI = -0.11–1.27). Box & Blocks scores were between 5–32 at pre-test and 3–41 at post-test, with a median change of +2.5 blocks (Table 3).
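For readers unfamiliar with these single-case metrics, the level-change thresholds and percentage of all non-overlapping data (PAND) described in the Fig 2 caption can be sketched in a few lines of code. This is a minimal illustration, not the study's analysis code: the function names are hypothetical, and the PAND routine uses one common formulation (the fewest points whose removal eliminates phase overlap), assuming improvement corresponds to higher values.

```python
# Minimal sketch of two visual-analysis metrics for single-case AB designs.
# Hypothetical helper names; inputs are per-session values for each phase.
from statistics import median

def level_change_aweo(baseline, intervention):
    """Classify level change for active wrist extension (degrees) using the
    thresholds from the figure caption: <5 small, 5-15 moderate, >15 large."""
    diff = abs(median(intervention) - median(baseline))
    if diff < 5:
        return "small"
    if diff <= 15:
        return "moderate"
    return "large"

def pand(baseline, intervention):
    """Percentage of all non-overlapping data: share of points remaining after
    removing the fewest points needed to eliminate overlap between phases
    (assumes improvement = higher values)."""
    values = baseline + intervention
    n = len(values)
    # candidate cut points: below all data, plus every observed value
    cuts = [min(values) - 1] + sorted(set(values))
    # fewest removals so that every baseline value <= cut < every intervention value
    removals = min(
        sum(b > c for b in baseline) + sum(i <= c for i in intervention)
        for c in cuts
    )
    return 100 * (n - removals) / n
```

With fully separated phases, `pand` returns 100; values between 50 and 80 would indicate moderate separation under the caption's interpretation.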
Activities and participation
Fig 3 summarizes pre-post changes, ordered by practice time (minutes in system). Post-hoc analyses showed small to no relationship between practice time and functional change scores (B&B, r = 0.19; AHA, r = 0.20; COPM, r = 0.30; SEAS, r = 0.08). Four weeks of the intervention showed a small effect on AHA score (Hedge’s g = 0.29, 95%CI = -0.39–0.97). AHA scores ranged between 32–77 at pre-test and 40–79 at post-test, with a median change of +1.5 logit units. There was a moderate effect for COPM Performance scores (r = 0.60, 95%CI = 0.13–0.82). Median COPM change was +1 post-intervention, ranging from -2 to +5. See S1 Table for COPM goals. The SEAS questionnaire showed participants felt positively about the activity (median = +2, IQR = 1.25) at the beginning and end of the intervention. SEAS score did not change (median change = 0, range = -1 to +1, IQR = 0.75, small effect r = 0.24, 95%CI = -0.25–0.61). See S2 Table for SEAS subscale scores.
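The post-hoc association between practice time and change score can be computed with a standard correlation coefficient. The study does not publish its analysis scripts, so the sketch below is only an assumption-laden illustration using Pearson's r; the function name and example arrays are hypothetical, not study data.

```python
# Minimal Pearson correlation, pure stdlib; illustrative only.
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# e.g. pearson_r(minutes_in_system, change_scores) -> value near 0 would
# mirror the small relationships (r = 0.08-0.30) reported above.
```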
Fig 3
Pre- post-intervention change scores by participant.
Differences in score before and after the intervention for each measure; positive values indicate a higher post-intervention score. (a) Assisting Hand Assessment (AHA) logit units, (b) Box and Blocks (blocks/minute), (c) Canadian Occupational Performance Measure (COPM), P-performance, S-satisfaction (1–10 scale), and (d) Self-Reported Experiences of Activity Settings (SEAS) overall score (+3 to -3, 7-point Likert scale). Solid red horizontal lines indicate clinically meaningful difference where available. Participants are arranged left to right by total practice time (minutes), indicated by the dotted line, showing little correlation (r = 0.08–0.3) between practice time and change score for each measure.
Discussion
This study assessed the feasibility of a novel intervention combining Solution Focused Coaching strategies with biofeedback-enhanced movement-controlled gaming. Most, but not all, a priori success criteria were met; as such, the intervention approach is feasible with modifications, most notably refinement of the technology to mitigate technical issues. Clinical outcomes of the intervention were promising, with moderate effects in Body Function measures and small to moderate effects in Activities and Participation measures.
Recruitment and adherence
This study showed recruitment and retention rates comparable to similar interventions. When reported, previous home-based ICP interventions have shown 37–46% recruitment of eligible participants and 66–90% retention through the intervention [38,61-63]. Adherence was relatively high compared to other studies, in which participants completed 53–78% of practice goals during intervention periods 4–20 weeks long [38,61-63]. Few studies transparently report relevant metrics of the practice (timing, duration, intensity), which makes it difficult to quantify and compare the therapy dose between studies. Standardized reporting in this area would facilitate the meta-analyses needed to strengthen the evidence for home-based gaming interventions. At the observed effect size in bimanual performance (AHA change score, 0.29), 127 participants would be required for the definitive RCT, assuming a significance level of 0.05 and a desired power of 0.9. The AHA change score was chosen to estimate the sample size instead of active wrist extension capacity since it measures functional bimanual performance and has a more direct relationship with the ability to perform activities of daily living.
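As a rough check, the reported requirement of 127 participants is consistent with a paired (pre-post, one-sample) t-test framing at d = 0.29, alpha = 0.05 and power = 0.9. That framing is an assumption on our part, not a statement of the authors' method. The sketch below uses the normal approximation, which gives approximately 125; the exact noncentral-t calculation raises this to about 127.

```python
# Back-of-envelope sample-size check for a paired pre-post design,
# normal approximation (the exact t-based value is slightly larger).
from math import ceil
from statistics import NormalDist

def paired_sample_size(d, alpha=0.05, power=0.9):
    """n for a two-sided one-sample/paired test at effect size d."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)  # two-sided critical value
    z_beta = z(power)
    return ceil(((z_alpha + z_beta) / d) ** 2)

print(paired_sample_size(0.29))  # -> 125
```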
Implementation
It is important to note that this is the first investigation of this novel home-based gaming intervention, and it accordingly focused on process and scientific feasibility. Specifically, the study focused on determining the extent to which the intervention could be used at home and on estimating the effect size and variance it might have. Before a complete RCT is conducted, more studies and development would be needed to ascertain real-world feasibility, addressing resources and management. This includes a proper health economics analysis and an evaluation of the logistical organization with respect to an institute’s existing therapeutic practices. It should also be noted that participants in Canada and France were off treatment blocks but visited their care provider for regularly scheduled check-ups, either annually or quarterly. Two participants, B and F, attended additional weekly therapy sessions outside of the rehabilitation centre, and one participant, J, attended the school at the rehabilitation centre and would see the occupational therapist as needed.
Motivation
We expect that the coaching and biofeedback strategies helped maintain participants’ engagement in the intervention. Biofeedback was linked to a variety of short- and long-term goals in the game, and participants’ motivations were in turn linked to that feedback. Participants regularly chose to review their progress and adapted their movements in response to game feedback (S1 File). The Solution Focused Coaching strategy helped to maintain cognitive engagement in the intervention, reiterating how the game addressed functional goals. The positive, self-directed rhetoric was more effective with participants who believed that the game addressed their functional goals. Five of 19 participants did not reach their practice goals. These participants showed a novelty effect, with little interest after 1–2 weeks with the new game, and commented that they felt the activity did not align with their functional goals (e.g. “spread thumb easier to use joystick when playing video games”), indicating a lack of cognitive engagement. Participants who were less convinced of the relevance of the task found it more difficult to participate in the coaching conversations and to verbalize how their success in the game could translate to daily activities. In such cases, we explored alternative motivation strategies (e.g. parent-identified rewards for adhering to practice goals, or leaderboards appealing to a competitive nature). The SFC strategy required flexibility in the conversation and training structure; we would recommend it as a tool to engage participants but would not rely solely on it. Overall, our learning in this study underscores the importance of ensuring that home-based ICP therapy activities align with individual motivations and goals to support cognitive, affective and behavioural engagement in the intervention [14].
Body function
Active wrist extension is moderately to highly related to manual abilities [43,64]. In the current study, 12 of 17 participants increased active wrist extension by at least 5 degrees. These findings are consistent with other home-based supplemental therapy activities. Comparatively, [28] found 6±3 degrees of improvement in wrist extension across 30 young children with CP after 24 weeks at 3 × 60 minutes/week, and [27] saw 18±12 degrees of change in four participants after 5 × 30-minute sessions of EMG-based neurotherapy. Note, the wide range in practice time and the different nature of the interventions may contribute to the inconsistency between studies.

Wrist extension capacity is also directly related to grip strength [65]. Accordingly, we saw an 18% average improvement in non-dominant grip strength relative to the dominant side across participants. However, normative data for children’s grip strength, as compared to the dominant side using the modified sphygmomanometer, are not available. For comparison, [66] reported an average 15% grip strength improvement after 12 weeks of hand function training in 15 children with CP, and [67] observed a median 25% change from pre-test values in 5 participants after 8 weeks of single-joint resistance training combined with Botulinum toxin A injections.

Post-hoc analyses showed differences in Body Function effect based on CP severity. Participants with more severe involvement (MACS level II) had greater gains in active wrist extension (8±10°) and smaller gains in grip strength (-8±7%) compared to participants at MACS level I. Statistical confirmation is not advisable given the sample size and variance. More severely affected participants (MACS level II) had a maximum wrist extension below neutral.
Therefore, there was more opportunity for amplitude improvement but, without being in an extended posture, it is difficult to optimize grip strength. Through path analysis of records from 136 children with CP, [68] established that grip strength indirectly contributes to manual ability (Abilhand-Kids) [69] via its influence on gross manual dexterity (B&B). Consequently, active wrist extension and grip strength may indicate changes in manual ability relevant to daily activities. Nevertheless, we recognize the field’s increasing focus on Activities and Participation measures [70]; accordingly, these measures were included in this early-stage feasibility study.
Activities and participation
There were small effects in Activities and Participation related measures (AHA, COPM). A minority of participants met or exceeded clinically meaningful thresholds (3/19 for AHA and 8/19 for COPM), while the majority showed positive changes below clinical significance. The small effect is most likely due to low dosage and the nature of the activity. Considering this relatively small dose across all participants, it is not surprising to see small correlations between practice time and functional change scores. The biofeedback video game practices specific functional movements, but it is not an activity-based intervention (e.g. Constraint Induced Movement Therapy) [41]. Since practical manual ability is not the simple summation of skill and structure, it would be unreasonable to expect gross transfer to daily tasks. However, considering the need for diverse and engaging rehabilitation strategies and the relatively low risk of harm of this ICP intervention, it may be a useful supplement to activity-based interventions. It may help accelerate Body Function changes (e.g. active wrist extension, grip strength) to facilitate manual ability improvements. This question could be addressed in future clinical trials.

The SEAS questionnaire showed that participants’ experience was consistent during the 4 weeks. This corresponds with observations during the weekly check-in conversations. Only in the five participants who experienced a novelty effect, as described above, did we observe a decrease in the Psychological Engagement sub-scale score (S2 Table).
Limitations
Due to resource limitations, a single game was built, which may not have optimally appealed to the wide range of ages (8–18 years) and interests of the participants. General interest in video games, or in this game, was not an inclusion criterion. Considering the impact personal motivation has on adherence and the vast differences in personal preferences, it is essential to match participants to activities that interest them. Greater choice and game variety, while challenging to implement in rehabilitation protocols, would help maintain novelty and interest in the activity. Collaboration with independent game developers can improve feasibility by offering content that can be modified relatively quickly and flexibly, as was the case in the current study with the adapted commercial video game (Dashy Square) [71].

Methodologically, there was risk of bias in assessment scoring as clinicians and researchers were not blinded to the participant phase. Bias in goal setting is also possible as parents, clinicians and researchers were present when the participants set their practice and COPM goals. Further, COPM responses are subjective and can be influenced by mood and environment. For instance, one participant successfully completed her goal for the first time at the post-intervention assessment but scored the performance lower than at baseline. For these reasons we used multiple measures to capture Activities and Participation experiences. Next, level-change groupings (small, moderate, and large) for active wrist extension were based on a minimal detectable difference of five degrees [42]. However, similar level-change groupings in grip strength could not be determined since minimal detectable differences for grip strength, as compared to the dominant hand, do not yet exist for young people with CP. Scores are reported as a percent of the dominant side since improved capacity for bimanual activities is a primary goal for many young people with CP.
For context, raw score increases in the affected hand's grip strength averaged 17 mmHg, and all but two participants saw an improvement of greater than 10 mmHg. A minimal detectable difference of seven mmHg and a within-subject Standard Error of Measurement of three mmHg have recently been reported in individuals with Parkinson's Disease using a similar modified sphygmomanometer test [72]. The visual analysis summaries are not provided as a definitive evaluation but to aid the reader in interpreting the effect size in this single-case design intervention [23]. Further, while AB designs are useful for evaluating feasibility, return-to-baseline or withdrawal designs would improve the strength of evidence of treatment effects [23]. The SEAS questionnaire was a practical tool to implement in the home to gauge self-reported experience; for a comprehensive evaluation, future work should consider qualitative interviews and content analysis [73]. The Solution Focused Coaching approach is designed to encourage collaborative development, led by the participant, but does acknowledge the potential for external influence [16]. Here we maintained fidelity of the SFC approach by referring to a checklist, but this could be improved by video review and completing a fidelity questionnaire [16].

Logistically, the protocol may benefit from increased clinician involvement. Occupational therapists remarked that they could have helped guide home-based training by observing participants playing at weekly visits or by video. Finally, this study used the AHA as an outcome measure focusing on the non-dominant hand’s involvement in bimanual activities. Other measures of manual performance (i.e. the Melbourne Assessment of Unilateral Upper Limb Function) [46] have been proposed and used more widely; changing this metric could facilitate cross-study comparison.
Biofeedback technology development.
(PDF)
The Single-Case Reporting Guideline In BEhavioural interventions (SCRIBE) 2016 checklist.
(PDF)
The Consolidated Standards of Reporting Trials (CONSORT) checklist.
(PDF)
Procedural fidelity resource.
(PDF)
Template for Intervention Description and Replication (TIDieR) checklist.
(PDF)
Intervention protocol.
(PDF)
Individual visual analysis.
(PDF)
COPM goals.
(DOCX)
SEAS subscale scores.
(DOCX)

3 Dec 2019

PONE-D-19-22575

A biofeedback-enhanced therapeutic exercise video game intervention for young people with cerebral palsy: a randomized single-case experimental design feasibility study.

PLOS ONE

Dear Alexander MacIntosh,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. The coherence between the study's declared goal and the actual presentation of results has been questioned, as well as the way in which the results are presented and the significance evaluated. Please address these concerns properly.

We would appreciate receiving your revised manuscript by January 15, 2020. When you are ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

To enhance the reproducibility of your results, we recommend that, if applicable, you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

Please include the following items when submitting your revised manuscript:

- A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). This letter should be uploaded as a separate file and labeled 'Response to Reviewers'.
- A marked-up copy of your manuscript that highlights changes made to the original version.
This file should be uploaded as a separate file and labeled 'Revised Manuscript with Track Changes'.
- An unmarked version of your revised paper without tracked changes. This file should be uploaded as a separate file and labeled 'Manuscript'.

Please note, while forming your response, that if your article is accepted you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.

We look forward to receiving your revised manuscript.

Kind regards,
Andrea Martinuzzi
Academic Editor
PLOS ONE

Journal Requirements:

1. When submitting your revision, we need you to address these additional requirements. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at http://www.journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and http://www.journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Thank you for submitting your clinical trial to PLOS ONE and for providing the name of the registry and the registration number. The information in the registry entry suggests that your trial was registered after patient recruitment began.
PLOS ONE strongly encourages authors to register all trials before recruiting the first participant in a study. As per the journal’s editorial policy, please include in the Methods section of your paper: 1) your reasons for your delay in registering this study (after enrolment of participants started); 2) confirmation that all related trials are registered by stating: “The authors confirm that all ongoing and related trials for this drug/intervention are registered”.

Please also ensure you report the date at which the ethics committee approved the study as well as the complete date range for patient recruitment and follow-up in the Methods section of your manuscript, and clarify why the protocol is dated 23 November 2018.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Partly

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: No

3. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available.
If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes

4. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: No

5. Review Comments to the Author. Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters.)

Reviewer #1: The manuscript describes a technically sound piece of scientific research. The statistical analysis has been conducted rigorously. The data support the conclusions, and I totally agree with the resource limitation well described at paragraph 430.

Reviewer #2: General comment: The authors present the results of a study to assess the feasibility of an at-home video game intervention. However, the authors seem to mix a bit the purposes of this study and the paper is difficult to follow. The main aim is to determine the feasibility (adherence, complications), but then the authors also assess the motor functions of the participants before and after the intervention. Then they used unusual methods for presenting the results. The article would become more readable if the authors only present the results of the feasibility part and shorten the presentation of the pre-post results since it’s not the aim of this particular study.
A table with the results of the different tests before and after the intervention with results of the statistics would be more suited and easy to interpret; the other analyses can be moved to supplementary materials.

Specific comments

Introduction
Line 45: The authors should precise the inconsistency in the existing studies and what is currently missing to strengthen their study.
Line 68: patient instead of client

Methods
Intervention: Was the total amount of training (defined during the self-defined practice) significantly different between the participants? The total duration of training should be presented in the results.

Results
Table 1: only 2 patients have physiotherapy sessions? It seems low compared to the recommendation. Should be discussed.
Table 2: 19 patients but for the technical issues 6/39? Please clarify.
Figures 2–3: Forest plots are a very unusual way of presenting individuals' data, making the interpretation confusing! How did you compute CI for the SMD based only on 1 individual's data? What’s the point of using a random effect model since it’s a controlled study? Participants should all have the same weight. A simple t-test (or Wilcoxon if the data are not normally distributed, which is not precised) is more adapted (eventually with line plots if you want to present individuals' results).
Figure 4: again visual analyses; the authors state that they did not compute the significance but still present a lot of statistics, this is a bit confusing.
Was there any difference between the experiments in France and in Canada?

Discussion
Body function: Again it is a bit confusing to read “In the current study, all but two participants increased active wrist extension” since no proper statistics were done. “Accordingly, we saw 18% average improvement with +5% gains in 12/19 participants.” Idem, what about the 7 other participants?
Since it is a feasibility study the authors should discuss the feasibility of implementing such kinds of interventions in “real life” conditions.
Do they think it is possible to have the pre-session and the different control sessions organized in practice (and who is going to pay for that)? This is particularly important since apparently only 2 of the 19 participants have physiotherapy sessions (so basically no intervention).

6. PLOS authors have the option to publish the peer review history of their article. If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review?

Reviewer #1: No
Reviewer #2: Yes: Bruno Bonnechère
3 Jan 2020

The following response can also be found in the 'Response to Reviewers.docx'.

Editor's comments

1. The coherence between the study's declared goal and the actual presentation of results has been questioned, as well as the way in which the results are presented and the significance evaluated. Please address these concerns properly.

Thank you for your comments. In our responses to Reviewer 2 you will see that their concerns regarding the presentation of the results and the evaluation of significance have been addressed. In summary, regarding the main aim of this study, we assess feasibility according to recommendations from Thabane et al. 2010 and the CONSORT extension for reporting feasibility studies (Eldridge et al. 2016). This framework highlights that the aim of feasibility testing can be towards assessing any of four classifications: process, resources, management and scientific. “Scientific” here refers to treatment safety and estimation of treatment effect and its variance. Motor function assessments were completed for this reason.

It is also worth noting that statistical analyses were intentionally not conducted. In fact, the feasibility reporting guide we follow specifically discourages emphasizing statistical significance in studies of this type, mainly due to low-powered samples. We therefore follow the Single-Case Reporting Guideline In BEhavioural Interventions (SCRIBE), which recommends the presentation of a mixed ‘visual and statistical’ approach (Manolov et al. 2014). The methods are considered complementary rather than mutually exclusive. We acknowledge that there is no universal gold standard for single-case design studies, and it is for this reason we provide a mixed analysis approach.

Separating primary and secondary aims seems to have caused this confusion.
To clarify, we have explicitly stated which aspects of feasibility criteria we address and have:
- Added context around feasibility to the introduction
- Re-organized the methods to show that the current investigation focuses on process and scientific feasibility
- Added a summary table of results as Reviewer 2 requested

You can find the specific changes made to the manuscript in the response to Reviewer 2, general comments #1 and #3.

2. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at http://www.journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and http://www.journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

We have reviewed the style guidelines and confirm that the manuscript meets the requirements.

3. Thank you for submitting your clinical trial to PLOS ONE and for providing the name of the registry and the registration number. The information in the registry entry suggests that your trial was registered after patient recruitment began. PLOS ONE strongly encourages authors to register all trials before recruiting the first participant in a study.

Thank you for reviewing this. We apologize for the confusion here. We think this may be because the 'Last Update' date was posted June 18, 2019. This update was related to an annual renewal of the study protocol on the ClinicalTrials.gov registry. The trial, registered at https://clinicaltrials.gov/ct2/show/NCT03677193, indicates that this study was first posted September 19, 2018. Further, the initial release date according to the ClinicalTrials.gov Protocol Registration and Results System was July 31, 2018. Note, this is before the start of study enrollment. Please see the screenshots at the bottom of this document to confirm (Appendix A).
If, for some reason, the study still appears to be registered after recruitment began, we will contact the registry directly to correct the error. Thank you for bringing this to our attention. The trial registration number may have also been incorrect. We have verified that the proper registration number is in the manuscript.

4. As per the journal's editorial policy, please include in the Methods section of your paper:
1) your reasons for your delay in registering this study (after enrolment of participants started);
2) confirmation that all related trials are registered by stating: "The authors confirm that all ongoing and related trials for this drug/intervention are registered".

Regarding item 1), as described in response #3 above, the study was registered July 31, 2018 on ClinicalTrials.gov. The first participant was enrolled on September 29, 2018.

Regarding item 2), the required statement was added to the section: Methods, Design.

5. Please also ensure you report the date at which the ethics committee approved the study as well as the complete date range for patient recruitment and follow-up in the Methods section of your manuscript, and clarify why the protocol is dated 23 November 2018.

Methods have been adjusted accordingly and now read as follows:

"Ethical approval was obtained by Holland Bloorview's Research Ethics Board (approved July 27, 2018, amended November 23, 2018, to include participants with mixed tone and mild dystonia) and the French national Comité de protection des personnes (CPP) (approved July 6, 2018)." – Section: Methods, Design

"Recruitment in Canada took place from September-November 2018.
… Recruitment in France took place from November 2018 - January 2019." – Section: Methods, Participants, Recruitment

Reviewer #1 comments:

1. The manuscript describes a technically sound piece of scientific research. The statistical analysis has been conducted rigorously. The data support the conclusions, and I totally agree with the resource limitation well described at paragraph 430.

Thank you for your kind remarks.

Reviewer #2

General comments

1. The authors present the results of a study to assess the feasibility of an at-home video game intervention. However, the authors seem to mix a bit the purposes of this study and the paper is difficult to follow. The main aim is to determine the feasibility (adherence, complications) but then the authors also assess the motor functions of the participants before and after the intervention.

We would first like to thank Reviewer 2 for their detailed comments. Regarding the main aim of this study, we assess feasibility according to recommendations from Thabane et al 2010 and the CONSORT extension for reporting feasibility studies (Eldridge et al 2016). This framework highlights that the aim of feasibility testing can relate to any of four broad classifications: process, resources, management and scientific. Scientific here refers to treatment safety and estimation of treatment effect and its variance. Motor function assessments were completed for this reason. We decided to separate the scientific (effect size and variance) estimates into the secondary aim because we used six outcome measures (since no single measure addresses all aspects of capacity and participation). However, we apologize, as this seems to have caused confusion. To clarify, all the reported outcomes, including those related to motor function, are made towards assessing feasibility of the intervention.

It is also worth noting that, for this reason, statistical analyses were intentionally not conducted.
In fact, the feasibility reporting guide we follow discourages emphasizing statistical significance in studies of this type (Thabane et al 2010, Eldridge et al 2016). The goal here is to inform future randomized controlled trial (RCT) research and not to verify the significance of the response, as the study is underpowered to do so.

We have made changes in the main text to clarify our use of the feasibility framework:

"This paper addresses intervention feasibility as articulated by Thabane et al (2010) [18]. This framework highlights that the aim of feasibility testing can be related to one or more of the following four classifications: process, resources, management and scientific. The objective here is to support development towards future randomized controlled trials and not to define statistical or clinical effectiveness of the intervention [18]. In this study, we concentrate on two of the four feasibility classifications:

First, we assess process feasibility of the biofeedback-enhanced therapy video game intervention protocol for young people with CP. The objective here is to determine the ability to enroll participants, enable home-based practice, and retain their activity during a 1-month intervention. To this purpose, a priori success criteria were established for the recruitment and response rates, adherence, and frequency of technical difficulties impeding home practice [18].

Second, we assess the scientific feasibility of the intervention by estimating the effect size and variance for six participant-centred outcome measures for the hand and wrist. The measures are aligned to the Body Functions and Activities and Participation chapters of the ICF (International Classification of Functioning, Disability and Health) [19,20]." – Section: Introduction, Aim

2. Then they used unusual methods for presenting the results.
The article would become more readable if the authors only present the results of the feasibility part and shorten the presentation of the pre-post results, since it's not the aim of this particular study.

As indicated in the comment above, we assess feasibility according to recommendations from Thabane et al 2010 and the CONSORT extension for reporting feasibility studies (Eldridge et al 2016). This framework highlights that the aim of feasibility testing can relate to any of four broad classifications: process, resources, management and scientific. Scientific here refers to treatment safety and estimation of treatment effect and its variance. Motor function assessments were completed for this reason. All the reported outcomes, including those related to motor function (the pre-post results), are made towards assessing feasibility of the intervention.

To clarify this, we have:
- Added context around feasibility in the aim (please see the quoted text in the comment above).
- Re-organized the methods to not distinguish the work in terms of primary and secondary aims, but in reference to process feasibility and scientific feasibility. You will see this change throughout the aim, methods and results of the manuscript.

3. A table with the results of the different tests before and after the intervention with results of the statistics would be more suited and easy to interpret; the other analysis can be moved to supplementary materials.

For clarity, we have provided a table of the results of the different tests. Please see Table 3 under the section: Results, Scientific Feasibility. Please note, as we followed the CONSORT extension for reporting feasibility studies (Eldridge et al 2016) and the Single-Case Reporting Guideline In BEhavioural Interventions (SCRIBE), we have intentionally not reported significance tests.
We do, however, report the combined statistical and visual analysis along with estimates of effect size and their variance in the form of 95% confidence intervals where appropriate. We hope the reviewer can understand the choice to use this methodology and not to focus on statistical significance. The goal here is to inform future RCT research and not to verify the significance of the response, as the study is underpowered to do so. To this point we have added the following statement:

"At the observed effect size, 0.29, in bimanual performance (AHA change score), 127 participants would be required for the definitive RCT assuming alpha significance set to 0.05 and a desired power = 0.9. AHA change score was chosen to estimate the sample size instead of active wrist extension capacity since it measures functional bimanual performance and has a more direct relationship with the ability to perform activities of daily living." – Section: Discussion, Recruitment and adherence

Specific comments

Introduction

1. Line 45: The authors should specify the inconsistency in the existing studies and what is currently missing to strengthen their study.

The statement has been changed to:

"These types of studies have shown moderate evidence towards improving balance and overall motor skill but weak evidence towards improving upper extremity skills, joint control, gait and strength [9]" – Introduction

2. Line 68: patient instead of client

We have removed the terms "patient" and "client" throughout the manuscript. Our institution's convention is to use client instead of patient; however, we realize this is not true everywhere. To be as inclusive as possible we have used the term participant, person or individual where possible.

Methods

3. Intervention: Was the total amount of training (defined during the self-defined practice) significantly different between the participants? The total duration of training should be presented in the results.

Individual practice did vary across participants.
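As an aside for readers wanting to check the sample-size figure quoted in the response above (effect size d = 0.29, two-sided alpha = 0.05, power = 0.9, n = 127): it is consistent with the standard normal-approximation formula for a one-sample or paired design. The sketch below is illustrative only and is not the authors' actual calculation.

```python
# Normal-approximation sample size for a one-sample / paired t-test:
#   n ~= ((z_{1-alpha/2} + z_{power}) / d)^2
# Illustrative only; not the authors' actual procedure.
from statistics import NormalDist
import math

z = NormalDist().inv_cdf  # standard normal quantile function
d, alpha, power = 0.29, 0.05, 0.9

n_normal = ((z(1 - alpha / 2) + z(power)) / d) ** 2
print(math.ceil(n_normal))  # -> 125
```

The normal approximation gives 125; an exact noncentral-t calculation adds roughly two participants, which is consistent with the 127 the authors report.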
While the average amount of training was 16±4 days over the 1-month intervention, 7/19 participants were outside 1 standard deviation from this mean. Similarly, while participants practiced an average of 17±9 minutes/day, 10/19 participants were outside 1 standard deviation from this mean. Individual total amounts of training can now be seen in Table 3 under section: Results, Scientific feasibility. We have also added the following statement:

"System usage varied across individuals but averaged 4±1 days/week (8-24 days total), 17±9 minutes/day (37-333 minutes total), and 163±59 gesture repetitions/day (997-5698 total)." – Section: Results, Participant and recruitment characteristics

Results

4. Table 1: only 2 patients have physiotherapy sessions? It seems low compared to the recommendation. Should be discussed.

We have added the following to the discussion:

"Participants in Canada and France were off treatment blocks but visited their care provider for regularly scheduled check-ups, either annually or quarterly. Two participants, B and F, took additional weekly therapy sessions outside of the rehabilitation centre, and one participant, J, attended the school at the rehabilitation centre and would see the occupational therapist as needed." – Section: Discussion, Implementation

Both Canada and France have publicly funded healthcare, but services for individuals in our recruitment demographic (older age group and mild to moderate impairment) are not as frequently utilized. As such, it is not uncommon to see older children and teenagers with CP only occasionally be on active blocks of manual therapy training.

5. Table 2: 19 patients but for the technical issues 6/39? Please clarify.

Technical issues could be reported multiple times during the 1-month intervention. For instance, a participant might report a technical issue once in the first week and twice more in the third week.
As such, it is possible to have more technical issues than participants enrolled. To clarify this metric, we have added the following text to Methods, Outcomes #4:

"Participants may report multiple issues during the 1-month intervention."

6. Figure 2 – 3: Forest plots are a very unusual way of presenting individuals' data, making the interpretation confusing! How did you compute CI for the SMD based only on 1 individual's data? What's the point of using a random effect model since it's a controlled study? Participants should all have the same weight. A simple t-test (or Wilcoxon if the data are not normally distributed, which is not specified) is more adapted (eventually with line plots if you want to present individuals' results).

Thank you for pointing this out. As described in the responses to this reviewer's general comments numbers 1 and 3, we are not presenting statistical tests such as t-tests or Wilcoxon, to conform with the CONSORT feasibility testing and SCRIBE single-case design reporting frameworks. We do, however, acknowledge that this is an unusual presentation of individual results. Given the individual outcomes are now reported in Table 3, we have removed Fig 2 and Fig 3 and reported group-level SMD and confidence intervals descriptively (Section: Methods, Scientific feasibility).

7. Figure 4: again visual analyses; the authors state that they did not compute the significance but still present a lot of statistics, this is a bit confusing.

As described in the responses to this reviewer's general comments numbers 1 and 3, significance tests comparing pre and post outcomes were not conducted, in accordance with the CONSORT feasibility testing and SCRIBE single-case design reporting frameworks. However, statistical analyses were conducted to establish effect sizes and variances as recommended by CONSORT feasibility testing and SCRIBE single-case design reporting. SCRIBE recommends the presentation of a mixed 'visual and statistical' approach (Manolov et al 2014).
The methods are considered complementary rather than mutually exclusive. We acknowledge that there is no universal gold standard for single-case experimental design (SCED) studies, and it is for this reason we provide a mixed analysis approach.

8. Was there any difference between the experiments in France and in Canada?

No, the experiments in France and Canada were completed the same way. The same researcher trained the participants and completed all in-home assessments and measures of range of motion and grip strength. The only difference was in recruitment, where:

"In Canada, eligible participants were also identified through the hospital's centralized recruitment database, connect2research. In Canada the researcher telephoned potential participants after sending an invitation letter by mail. Then, the researcher screened interested participants and obtained written informed consent. In France, the developmental pediatricians invited eligible individuals to participate. They obtained written informed consent from interested participants."

This is currently expressed in the section: Methods, Participants, Recruitment.

Discussion

9. Body function: Again, it is a bit confusing to read "In the current study, all but two participants increased active wrist extension" since no proper statistics were done.

This statement has been further specified for clarity and now reads:

"In the current study, 12 of 17 participants increased active wrist extension by at least 5 degrees." – Section: Discussion, Body Function

Additionally, we have added the following to the methods:

"...values are recorded to the nearest five degrees, the minimal detectable difference [42]." – Section: Methods, Outcomes

10. "Accordingly, we saw 18% average improvement with +5% gains in 12/19 participants." Idem, what about the 7 other participants?

5 participants had a grip strength change of <5%, and 2 participants did not complete enough assessments to calculate grip strength or range of motion during the intervention (see Table 3 under the
section: Results, Scientific feasibility).

To provide additional context, the discussion has been changed to the following:

"Accordingly, we saw 18% average improvement in non-dominant grip strength relative to the dominant side across participants. However, normative data for children's grip strength, as compared to the dominant side using the modified sphygmomanometer, are not available." – Section: Discussion, Body Function

Further, we have added the following limitation:

"Next, level-change groupings (small, moderate, and large) for active wrist extension were based on a minimal detectible difference of five degrees [42]. However, similar level-change groupings in grip strength were undeterminable since minimal detectible differences for grip strength, as compared to the dominant hand, do not yet exist for young people with CP. Scores are reported as a percent of the dominant side since improved capacity for bimanual activities is a primary goal for many young people with CP. For context, raw score increases in the affected hand's grip strength averaged 17 mmHg and all but two participants saw an improvement of greater than 10 mmHg. A minimal detectible difference of seven mmHg and a within-subject Standard Error of Measurement of three mmHg has recently been reported in individuals with Parkinson's Disease using a similar modified sphygmomanometer test [71]. These visual analysis summaries are not provided as a definitive evaluation but to aid the reader in their interpretation of the effect size in this single-case design intervention [23]." – Section: Discussion, Limitations

11. Since it is a feasibility study, the authors should discuss the feasibility of implementing such kinds of interventions in "real life" conditions. Do they think it is possible to have the pre-session and the different control sessions organized in practice (and who is going to pay for that)?
This is particularly important since apparently only 2 of the 19 participants have physiotherapy sessions (so basically no intervention).

The reviewer raises an important concern. We have added the following text to address the reviewer's question surrounding 'real life' implementation:

"It is important to note that this is the first investigation of the novel home-based gaming intervention, which accordingly focused on process and scientific feasibility. Specifically, the study focuses on determining the extent to which the intervention could be used at home and estimating the effect size and variance it might have. Towards conducting a complete RCT, more studies and development would be needed to ascertain real-world feasibility, addressing resources and management. This includes a proper health economics analysis and evaluation of the logistic organization with respect to an institute's existing therapeutic practices." – Section: Discussion, Implementation

Main points related to process feasibility are in the discussion under the sections: Recruitment and adherence, Motivation and Limitations. To further comment on scientific feasibility, as described by Thabane et al 2010, we discuss Body Functions and Activity and Participation outcomes in the discussion as well.

References

Thabane L, Ma J, Chu R, Cheng J, Ismaila A, Rios LP, et al. A tutorial on pilot studies: the what, why and how. BMC Med Res Methodol. 2010;10: 1.

Eldridge SM, Chan CL, Campbell MJ, Bond CM, Hopewell S, Thabane L, et al. CONSORT 2010 statement: extension to randomised pilot and feasibility trials. Pilot Feasibility Stud. 2016;2: 64. doi:10.1186/s40814-016-0105-8

Manolov R, Gast DL, Perdices M, Evans JJ. Single-case experimental designs: Reflections on conduct and analysis. Neuropsychol Rehabil. 2014;24: 634–660.
doi:10.1080/09602011.2014.903199

Please see Appendix A (Response to Reviewers.docx) for screenshots of ClinicalTrials.gov showing the trial registration verification date before recruitment began.

Submitted filename: Response to Reviewers.docx

15 May 2020

PONE-D-19-22575R1

A biofeedback-enhanced therapeutic exercise video game intervention for young people with cerebral palsy: A randomized single-case experimental design feasibility study.

PLOS ONE

Dear Mr. MacIntosh,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE's publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. Please address the minor corrections requested by reviewer 3 and check syntax and grammar throughout the text.

We would appreciate receiving your revised manuscript by May 31st. When you are ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

To enhance the reproducibility of your results, we recommend that if applicable you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

Please include the following items when submitting your revised manuscript:

A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). This letter should be uploaded as a separate file and labeled 'Response to Reviewers'.

A marked-up copy of your manuscript that highlights changes made to the original version.
This file should be uploaded as a separate file and labeled 'Revised Manuscript with Track Changes'.

An unmarked version of your revised paper without tracked changes. This file should be uploaded as a separate file and labeled 'Manuscript'.

Please note, while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.

We look forward to receiving your revised manuscript.

Kind regards,

Andrea Martinuzzi
Academic Editor
PLOS ONE

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the "Comments to the Author" section, enter your conflict of interest statement in the "Confidential to Editor" section, and submit your "Accept" recommendation.

Reviewer #3: (No Response)

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #3: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #3: N/A

**********

4.
Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data (e.g. participant privacy or use of data from a third party) those must be specified.

Reviewer #3: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #3: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #3: This is a very nicely done, and well-reported study, on an important topic. I only have a few comments; overall, I enjoyed reading the paper and commend the authors on their clear presentation of the results.

l.72 "participants'" not "participant's" The plural is implied later in the sentence with the pronoun "their." Throughout the paper, the authors should check pronoun agreement between subject and verb. In a number of places, a singular subject is followed by a plural pronoun; e.g., he...their (sorry, ex-grammar teacher).

l. 145.
Why is "Normal or corrected-to-normal vision and hearing" an EXCLUSION criteria? Is that correct?

In several places, the authors give an abbreviation, followed by the full name that is being abbreviated in parentheses (ll.100-01, ll.164-65, ll.170-71). This order should be reversed (as it is elsewhere in the paper).

Table 3 summarizes Reps, Minutes Active, and Minutes in System using means and standard deviations. Judging by the size of the standard deviations relative to the means, these do not appear to be normally distributed, and would be better summarized with medians and interquartile ranges to give a more accurate picture of the data distributions.

25% of the participants did not reach their goals due to a loss of interest. This seems fairly high. Did the authors factor this into their assessment of feasibility? I know they talk about the need to vary programs for different ages and over time to avoid this, but is THAT feasible? I.e., how much money and time go into designing and implementing a single game?

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose "no", your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #3: No
17 May 2020

1. This is a very nicely done, and well-reported study, on an important topic. I only have a few comments; overall, I enjoyed reading the paper and commend the authors on their clear presentation of the results.

Thank you for your comments; we appreciate your suggestions and the time spent reviewing.

2. l.72 "participants'" not "participant's" The plural is implied later in the sentence with the pronoun "their." Throughout the paper, the authors should check pronoun agreement between subject and verb. In a number of places, a singular subject is followed by a plural pronoun; e.g., he...their (sorry, ex-grammar teacher).

Thank you for this comment, since English language grammar is not a particular strength of our group. The manuscript has been reviewed for pronoun agreement and the corrections have been made as needed, including editing l.72 from "participant's" to "participants'".

3. l. 145. Why is "Normal or corrected-to-normal vision and hearing" an EXCLUSION criteria? Is that correct?

This is indeed a typo. That item has been moved from the exclusions to the inclusions, as it should be. Thank you for noticing the mistake.

4. In several places, the authors give an abbreviation, followed by the full name that is being abbreviated in parentheses (ll.100-01, ll.164-65, ll.170-71).
This order should be reversed (as it is elsewhere in the paper).

The location of the abbreviations has been corrected to appear in brackets after the full name at each of the lines identified by the reviewer.

5. Table 3 summarizes Reps, Minutes Active, and Minutes in System using means and standard deviations. Judging by the size of the standard deviations relative to the means, these do not appear to be normally distributed, and would be better summarized with medians and interquartile ranges to give a more accurate picture of the data distributions.

Table 3 has been updated to include the median and interquartile range, as the reviewer suggests, giving a more accurate picture of the data distributions.

6. 25% of the participants did not reach their goals due to a loss of interest. This seems fairly high. Did the authors factor this into their assessment of feasibility? I know they talk about the need to vary programs for different ages and over time to avoid this, but is THAT feasible? I.e., how much money and time go into designing and implementing a single game?

The reviewer addresses an important aspect here. Adherence is a central issue. In the current study we found that adherence was relatively high compared to other studies, which have shown participants complete 53-78% of practice goals, as compared to 75% in the current work (l.381-383). From a feasibility perspective, the reviewer also raises a salient issue. The cost of activity development relative to the expected use and benefit will dictate quantity and quality. We mitigated some of this cost and improved feasibility by finding a suitable game and subsequently collaborating directly with the developer. We have expanded the limitations section to include this point and it now reads as follows:

"…greater choice and game variety, while challenging to implement in rehabilitation protocols, would help maintain novelty and interest in the activity.
Collaboration with independent game developers can improve feasibility by offering content with relatively quick and flexible modification abilities, as was the case in the current study with the adapted commercial video game (Dashy Square) [71]." – l.472

Submitted filename: Response to Reviewers.docx

3 Jun 2020

A biofeedback-enhanced therapeutic exercise video game intervention for young people with cerebral palsy: A randomized single-case experimental design feasibility study.

PONE-D-19-22575R2

Dear Dr. MacIntosh,

We are pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it complies with all outstanding technical requirements.

Within one week, you will receive an e-mail containing information on the amendments required prior to publication. When all required modifications have been addressed, you will receive a formal acceptance letter and your manuscript will proceed to our production department and be scheduled for publication.

Shortly after the formal acceptance letter is sent, an invoice for payment will follow. To ensure an efficient production and billing process, please log into Editorial Manager at https://www.editorialmanager.com/pone/, click the "Update My Information" link at the top of the page, and update your user information. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to enable them to help maximize its impact. If they will be preparing press materials for this manuscript, you must inform our press team as soon as possible and no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication.
For more information, please contact onepress@plos.org.

With kind regards,
Andrea Martinuzzi
Academic Editor
PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the "Comments to the Author" section, enter your conflict of interest statement in the "Confidential to Editor" section, and submit your "Accept" recommendation.

Reviewer #3: All comments have been addressed

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #3: Yes

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #3: Yes

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data (e.g., participant privacy or use of data from a third party), those must be specified.

Reviewer #3: Yes

5.
Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #3: Yes

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #3: (No Response)

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose "no", your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #3: No

10 Jun 2020

PONE-D-19-22575R2

A biofeedback-enhanced therapeutic exercise video game intervention for young people with cerebral palsy: A randomized single-case experimental design feasibility study.

Dear Dr. MacIntosh:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication.
For more information, please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,
PLOS ONE Editorial Office Staff
on behalf of
Dr. Andrea Martinuzzi
Academic Editor
PLOS ONE
Eliasson A-C, Krumlinde-Sundholm L, Rösblad B, Beckung E, Arner M, Ohrvall A-M, Rosenbaum P. Dev Med Child Neurol. 2006 Jul.
Thabane L, Ma J, Chu R, Cheng J, Ismaila A, Rios LP, Robson R, Thabane M, Giangregorio L, Goldsmith CH. BMC Med Res Methodol. 2010 Jan 6.
Eldridge SM, Chan CL, Campbell MJ, Bond CM, Hopewell S, Thabane L, Lancaster GA. Pilot Feasibility Stud. 2016 Oct 21.