Literature DB >> 29945858

eMental Healthcare Technologies for Anxiety and Depression in Childhood and Adolescence: Systematic Review of Studies Reporting Implementation Outcomes.

Lori Wozney1, Patrick J McGrath2, Kathryn Bennett3, Anna Huguet1,4, Lisa Hartling5, Michele P Dyson5, Nicole D Gehring5, Amir Soleimani5, Amanda S Newton5.   

Abstract

BACKGROUND: Anxiety disorders and depression are frequent conditions in childhood and adolescence. eMental healthcare technologies may improve access to services, but their uptake within health systems is limited.
OBJECTIVE: The objective of this review was to examine and describe how the implementation of eMental healthcare technologies for anxiety disorders and depression in children and adolescents has been studied.
METHODS: We conducted a search of 5 electronic databases and gray literature. Eligible studies were those that assessed an eMental healthcare technology for treating or preventing anxiety or depression, included children or adolescents (<18 years) or their parents or healthcare providers, and reported findings on technology implementation. The methodological quality of studies was evaluated using the Mixed Methods Appraisal Tool. Outcomes of interest were based on 8 implementation outcomes: acceptability (satisfaction with a technology), adoption (technology uptake and utilization), appropriateness ("fitness for purpose"), cost (financial impact of technology implementation), feasibility (extent to which a technology was successfully used), fidelity (implementation as intended), penetration ("spread" or "reach" of the technology), and sustainability (maintenance or integration of a technology within a healthcare service). For extracted implementation outcome data, we coded favorable ratings on measurement scales as "positive results" and unfavorable ratings as "negative results." Studies that reported both positive and negative findings were coded as having "mixed results."
RESULTS: A total of 46 studies met the inclusion criteria, the majority of which were rated as very good to excellent in methodological quality. These studies investigated eMental healthcare technologies for anxiety (n=23), depression (n=18), or both anxiety and depression (n=5). Among studies of technologies for anxiety, those evaluating acceptability (78%) reported high levels of satisfaction, those evaluating adoption (43%) commonly reported positive results, and those evaluating feasibility (43%) reported mixed results. Among studies of technologies for depression, those evaluating appropriateness (56%) reported moderate helpfulness, and those evaluating acceptability (50%) described a mix of positive and negative findings. Studies of technologies designed to aid both anxiety and depression commonly reported mixed experiences with acceptability and adoption and positive findings for the appropriateness of the technologies for treatment. Across all studies, cost, fidelity, penetration, and sustainability were the least measured implementation outcomes.
CONCLUSIONS: Acceptability of eMental healthcare technology is high among users and is the most commonly investigated implementation outcome. Perceptions of the appropriateness and adoption of eMental healthcare technology were varied. Implementation research that identifies, evaluates, and reports on costs, sustainability, and fidelity to clinical guidelines is crucial for making high-quality eMental healthcare available to children and adolescents. ©Lori Wozney, Patrick J McGrath, Nicole D Gehring, Kathryn Bennett, Anna Huguet, Lisa Hartling, Michele P Dyson, Amir Soleimani, Amanda S Newton. Originally published in JMIR Mental Health (http://mental.jmir.org), 26.06.2018.

Keywords:  decision-making; eHealth; healthcare organizations; healthcare planning; implementation science; mental health; organizational innovation

Year:  2018        PMID: 29945858      PMCID: PMC6039769          DOI: 10.2196/mental.9655

Source DB:  PubMed          Journal:  JMIR Ment Health        ISSN: 2368-7959


Introduction

Worldwide, at least 6.5% and 2.6% of children and adolescents meet the criteria for anxiety and depressive disorders, respectively [1]. The burden associated with these disorders rises sharply in childhood and peaks in adolescence and young adulthood (ages 15-24 years) [2]. The long-term impact of anxiety and depression on children and adolescents includes significant interference with relationships, academic performance, school attendance, and daily functioning, making early intervention vital [3-8]. Underdiagnosis and undertreatment of childhood and adolescent depression and anxiety are well-documented concerns [9,10]. The current distribution, demand, structure, and costs that underpin services for these young people make them relatively unavailable to many who need them [11]. Electronic mental (eMental) healthcare technologies, which include internet-, mobile-, and computer-based programs as well as mobile phone apps, are intended to improve mental healthcare access and availability [12-17]. In the past 5 years, a number of literature reviews have highlighted the increase in research and development activities for eMental healthcare technologies for children and adolescents [18-22]. While conclusions regarding the efficacy and effectiveness of technologies vary depending on the review and the methodology employed, reviews are unified in their assessment that eMental healthcare technologies have potential utility in healthcare systems. However, despite increased emphasis on their potential value for improving health outcomes for children and adolescents, eMental health technologies are not widely adopted within health systems [23-26]. Distinguishing implementation effectiveness from treatment effectiveness is critical for integrating eMental healthcare technologies.
When uptake efforts fail, it is important to know whether the failure occurred because the intervention was ineffective in the new setting (eg, lacked cultural relevance) or because an effective intervention was deployed ineffectively (eg, clinicians failed to send reminder emails as the protocol indicated). Current research on eMental healthcare technologies lacks implementation frameworks [27], and the implementation literature has traditionally focused on broadly defined eHealth [28,29], lacking a specific focus on mental healthcare. Conceptualizing and assessing implementation outcomes (ie, how implementation of a program works in specific contexts) can advance the understanding of implementation processes (eg, cost, required in-service training, required infrastructure), enable studies of the comparative effectiveness of implementation strategies, and enhance efficiency in translating research into practice. The aim of this systematic review was to examine how the implementation of eMental healthcare technologies for children and adolescents with anxiety or depression has been studied (ie, the research questions asked, the populations studied, and the rigor of the methodology used) and to describe implementation findings with respect to implementation processes and outcomes.

Methods

Design

A protocol for the review was developed and registered with PROSPERO (registration #CRD42016049884). Reporting of the review adheres to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement checklist [30]. Funding for the review was provided by the Canadian Institutes of Health Research (201404KRS). This organization had no involvement in any aspect of the conduct, analysis, or manuscript preparation of this review. This systematic review did not require ethics approval, nor does it contain any individual person's data in any form.

Search Strategy

A research librarian developed the search strategies for 5 databases (MEDLINE, EMBASE, PsycINFO, CINAHL, and the Cochrane Database of Systematic Reviews) with date restrictions (2000-2016). No restriction was placed on study design or language, to capture a broad range of evidence. The strategy was peer reviewed prior to implementation. The searches included literature published until December 5, 2016. Gray literature was searched using Google Scholar and ProQuest Dissertations & Theses Global. Clinical trials were searched using clinicaltrials.gov. Conference proceedings (2014-2016) of the International Society for Research on Internet Interventions were searched as well, as were the reference lists of included studies. Multimedia Appendix 1 provides the search terms developed for the MEDLINE database.

Criteria for Considering Studies in the Review

Studies were included if they met the following criteria: (1) they assessed an eMental healthcare technology for treating or preventing anxiety or depression; (2) the technology under investigation involved children or adolescents (<18 years), or their parents or healthcare providers (studies that included both adolescents <18 years and young adults were included if the mean age of the study sample was ≤19 years, to ensure that the results largely reflected implementation with children and adolescents); (3) the technology was an internet-, computer-, tablet-, or mobile-based program or mobile app; (4) the technology was used within the primary or secondary healthcare system (as opposed to the school system); and (5) they reported on an implementation outcome as a primary or secondary measure. The 8 outcomes of interest were drawn from Proctor and colleagues’ implementation framework [31]. These constructs were defined as follows: acceptability (ie, a measure of satisfaction with a technology, including attitudes, functionality, preferences, and user experience); adoption (ie, the intention, initial decision, or action to take up or utilize a technology); appropriateness (ie, the perceived fit, relevance, usefulness/helpfulness, or compatibility of a technology for a given practice setting or problem); cost (ie, the financial impact of an implementation effort); feasibility (ie, the extent to which a technology had utility and compatibility within the practice setting); fidelity (ie, the degree to which a technology was implemented as it was intended); penetration (ie, the spread and reach of a technology within a service setting and its subsystems); and sustainability (ie, the extent to which a technology was maintained within standard operations) [31]. We excluded protocols, editorials, and studies assessing telehealth interventions, including telepsychiatry and videoconferencing.

Screening for Eligibility

References were organized and screened using EndNote X7.2.1. Three reviewers (AS, NDG, and MO) independently screened the titles and abstracts in the EndNote library and calculated the interrater agreement with the kappa statistic for every 100 articles screened [32]. Once a sufficiently high kappa was reached (≥0.80), the remaining references in the library were divided into 3 equally sized groups. Each reviewer was given 2 of the 3 groups, allowing each article to be assessed by 2 reviewers, and each reviewer screened the studies using the title and abstract. Three reviewers (AS, NDG, MO) independently reviewed the full-text of studies that were identified as potentially eligible using the review’s inclusion and exclusion criteria. Any discrepancies were discussed among the reviewers and taken to a third party (ASN) if no agreement could be reached.
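The interrater agreement check described above can be illustrated with a generic Cohen's kappa calculation for two screeners' binary include/exclude decisions. This is a sketch only; the review does not report how kappa was computed, and the function and variable names here are ours.

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' binary (1=include, 0=exclude) decisions."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("ratings must be non-empty and of equal length")
    n = len(rater_a)
    # Observed agreement: proportion of articles both raters coded the same way.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement expected from each rater's marginal inclusion rate.
    p_a = sum(rater_a) / n
    p_b = sum(rater_b) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    if expected == 1:
        # Both raters unanimous in the same direction; kappa is undefined,
        # so treat as full agreement (our convention for this sketch).
        return 1.0
    return (observed - expected) / (1 - expected)

# Screening would continue in batches of 100 articles until kappa >= 0.80.
```

With perfect agreement on a mix of include and exclude decisions, kappa is 1.0; agreement no better than chance yields 0.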

Data Extraction

Data were extracted by one reviewer (AS, NDG, or MO) and reviewed for accuracy and completeness by another. All extracted data were verified, and discrepancies were resolved by discussion or adjudication by a third party (ASN). Extracted data included information on study characteristics (eg, authors, date of publication, country, and design), implementation objectives, characteristics of the technology, study population, study setting, and implementation results. We coded statistically significant favorable ratings on measurement scales as “positive results” (eg, healthcare providers rating an intervention as highly acceptable) and statistically significant unfavorable ratings as “negative results” (eg, parents did not think the activities in the program were acceptable for their child’s age). Studies that reported both positive and negative findings were coded as having “mixed results” (eg, children and parents did not show the same level of satisfaction with the intervention).
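The coding scheme described above reduces each study's findings for a given outcome to one of three labels. A minimal sketch of that decision rule follows; the function name and the "not reported" fallback are ours, not from the review.

```python
def code_results(has_positive, has_negative):
    """Code a study's findings for one implementation outcome,
    following the review's positive/negative/mixed scheme."""
    if has_positive and has_negative:
        return "mixed results"
    if has_positive:
        return "positive results"
    if has_negative:
        return "negative results"
    # Fallback for outcomes a study did not measure (our assumption;
    # the review simply leaves such outcome categories empty).
    return "not reported"
```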

Quality Assessment

Methodological quality was assessed independently by 2 of the 3 assessors (AS, NDG, and MO). Disagreements were resolved through discussion; ASN participated when consensus could not be reached. The quality of studies was assessed using the Mixed Methods Appraisal Tool (MMAT) [33]. The scoring scale ranges from 0 (low quality) to 100 (high quality) and has been pilot tested for reliability [34]. The MMAT consists of 2 screening questions applicable to all study designs and 3-4 questions applicable to specific study designs. The questions relevant to each study design were scored by summing the number of ‘yes’ answers, dividing by the total number of questions, and multiplying by 100 to give a final percentage score. Qualitative studies were appraised for the relevance of data sources, processes used for data analyses, consideration of study context, and the researchers’ potential influences. Randomized controlled trials (RCTs) were appraised for sequence generation, allocation concealment, the completeness of outcome data, and study attrition. All other quantitative studies were appraised for recruitment strategies and sample representativeness, outcome measurements, the completeness of outcome data and study response rates, and the comparability of comparison groups (when applicable). Mixed methods studies were assessed for the relevance of the design, integration of methods, and limitations to integration. We did not exclude any studies on the basis of low-quality assessment scores.
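The MMAT percentage-score arithmetic described above amounts to the following (a minimal sketch; the function name and input format are ours):

```python
def mmat_score(answers):
    """MMAT-style percentage score from 'yes'/'no' answers to the
    appraisal questions relevant to a study's design."""
    if not answers:
        raise ValueError("at least one appraisal question is required")
    yes_count = sum(1 for answer in answers if answer == "yes")
    return round(100 * yes_count / len(answers))

# An RCT rated 'yes' on 3 of 4 design-specific questions scores 75,
# the review's "very good quality" band.
print(mmat_score(["yes", "yes", "yes", "no"]))  # → 75
```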

Data Analysis

A codebook approach [35] was used to organize data extraction according to the 8 implementation outcome categories [31]. When no implementation data were available for a particular outcome in the included paper, the category remained empty. Four team members (NDG, MO, AS, and ASN) reviewed the assignments of the study outcome data to the implementation categories, and assignments were finalized after all team members were confident that the data were categorized accurately. Descriptive statistics (counts, frequencies) were used to summarize patterns across studies.

Results

Literature Search and Selection

The search strategy identified 6269 citations after removal of duplicates. Of these, 727 studies were considered potentially relevant based on their title and abstract (Figure 1). After full-text review, 46 studies (plus one erratum) met the inclusion criteria.
Figure 1

Literature search flow diagram.

Description of Included Studies

Table 1 outlines the format and delivery characteristics of the technologies assessed in the included studies. The implementation of eMental healthcare technologies for anxiety and depressive disorders in childhood or adolescence was assessed in 23 and 18 studies, respectively. Five studies assessed a technology that targeted both anxiety and depression. The location of studies was restricted to economically developed countries, with the United States (20 studies) and Australia (13 studies) being the most common locations. A total of 32 studies examined internet-based technologies, 11 examined computer-based technologies, and 3 examined smartphone-based technologies (app or short message service [SMS] text message) as part of treatment.
Table 1

Reported format and delivery characteristics of eMental Health technologies for adolescents with anxiety and depression.

Technology/Program Name | Target age (years) | Parent involvement | Features (sessions) | Healthcare provider contact before program | Healthcare provider contact during program

Anxiety Programs

Cool Little Kids Online [36] | 3-6 | Yes | Internet-based (8 modules) | None | Phone
Camp-Cope-A-Lot [37-41] | 7-13 | Yes | Computer-based (12 sessions) | In-person | In-person^a
DARE Program [42,79] | 8-12 | Yes | Internet-based (11 modules) | None | Phone, within program^b
BiP OCD [78,80] | 12-17 | Yes | Internet-based (12 chapters) | None | Within program
BRAVE-ONLINE^c [46-50] | 7-18^d | Yes | Internet-based (10 sessions) | None | Email, within program
Cognitive bias modification [43] | 10-15 | Yes | Internet-based (8 sessions) | In-person^e | None
Ricky and the Spider [51] | 6-12 | Yes | Internet-based (8 levels) | In-person | In-person
Cool Teens [44,52,53] | 14-18 | No | Computer-based (8 modules) | None | Phone
Self-help manual and treatment [54] | 15-21 | No | Internet-based (9 modules) | None | Within program
SmartCAT App [45] | 9-14 | No | Mobile-based app (ad hoc; includes 5 main components) | None | In-person, within program
Virtual School Environment [55] | 8-12 | Yes | Computer-based (12 sessions) | In-person | In-person

Depression Programs

Decision aid tool [56] | 12-25 | No | Internet-based (9-component webpage used during an appointment or in the waiting room) | None | In-person
Monitoring tool [57,58] | 15-24 | No | Internet/tablet-based (depression assessments) | None | In-person^f
Rebound (Australia) [59] | 15-25 | No | Internet-based (user can select from 56 sessions) | None | Within program
MAYA (Chile) [81] | 12-18 | No | Internet-based (1 session) | In-person | In-person
iDOVE (United States) [60] | 13-17 | No | Mobile-based (8 weeks of 2-way SMS^g text messaging) | In-person | SMS text message
Technology-enhanced CBT^h intervention (United States) [61] | 12-17 | No | Mobile/tablet-based (SMS text messaging) | None | In-person, SMS text message
Behavioral Activation (United States) [62] | 12-17 | No | Internet-based (ad hoc) | None | None
CATCH-IT (United States) [63-70] | 14-21 | Yes | Internet-based (11-14 modules) | In-person | Phone
SPARX (Australia) [71] | 12-19 | No | Computer-based (7 modules) | None^i | Phone
Depression Experience Journal (United States) [72] | 8-19 | Yes | Internet-based (ad hoc) | In-person | None^e

Anxiety + Depression Programs

Multi-family group therapy (Canada) [73] | 6-12 | Yes | Internet-based (3 sessions) | None | Email
Treasure Hunt (Switzerland) [74] | 9-13 | No | Internet-based (6 levels) | None | In-person
SPARX (New Zealand) [75] | 16-18 | No | Computer-based (7 modules) | None | In-person
Problem-solving therapy (Netherlands) [76] | 12-21 | No | Internet-based (5 lessons) | None | Within program
RU-OK (United Kingdom) [77] | 13-15 | No | Internet-based (ad hoc) | None | None^e

^a Sessions 1-6 were self-led but conducted in the presence of a healthcare provider; sessions 7-12 were primarily led by a healthcare provider.

^b "Within program" refers to communication self-contained within the program (internal email). The user would have to log in to see the communication; it would not be delivered to their external email.

^c The intervention has been modified for different age groups under slightly different names.

^d BRAVE for children-ONLINE targets participants aged 7-14 years; BRAVE for teenagers-ONLINE targets participants aged 12-18 years.

^e The intervention did not contain healthcare provider contact, but participants were referred by healthcare providers or were engaged with the healthcare system.

^f Participants did not use the intervention for healthcare provider interaction; providers received data or email updates that were used in in-person sessions.

^g SMS: short message service.

^h CBT: cognitive behavioral therapy.

^i SPARX was tested in different implementation contexts, some with no in-person contact and some with in-person contact.

Study Quality

Details on the quality of the studies are provided in Multimedia Appendix 2. In total, 11 studies on eMental healthcare technologies for anxiety were of excellent quality with a score of 100 [37,46-50,52,55,78-80], 6 were of very good quality with a score of 75 [36,38,39,44,51,53], 4 were of moderate quality with a score of 50 [42,43,45,54], and 2 were of poor quality with a MMAT score of 25 [40,41]. Studies on technologies for depression also varied in quality: 10 studies were of excellent quality [56,57,59,63-67,71,81], 2 were of very good quality [60,68], 4 were of moderate quality [58,61,69,72], and 2 were of low quality and received a score of 25 [62] and 0 [70]. Studies evaluating technologies applicable to both anxiety and depression were of excellent [75,77], very good [74], and moderate [73,76] quality. The most common factors impacting the quality scores for quantitative studies were the lack of description of how randomization sequences were generated and if/how allocation was concealed (ie, see MMAT items 2.1 and 2.2 in Multimedia Appendix 2). The common factor impacting quality scores for mixed-methods studies was the lack of consideration of data triangulation (ie, see MMAT item 5.3 in Multimedia Appendix 2).

Trends in the Study of Implementation Among eMental Healthcare Technologies

Figure 2 displays the frequency with which implementation outcomes were studied for eMental healthcare technologies. Studies on eMental healthcare technologies for anxiety most commonly evaluated acceptability (78%), adoption (43%), and feasibility (43%) of the technologies, while studies on technologies for depression evaluated appropriateness (56%) and acceptability (50%). Studies testing technologies relevant to both anxiety and depression tended to evaluate acceptability (100%), adoption (40%), and appropriateness (40%). Across all studies, cost, fidelity, and penetration were the least measured implementation outcomes, and none of the studies evaluated technology sustainability in the healthcare service/system in which the technology was employed. While positive findings were reported 60% of the time or more in relation to measures of acceptability and costs across all included studies (Figure 3), mixed findings were reported more than 50% of the time in studies that measured adoption, feasibility, and fidelity outcomes.
Figure 2

Implementation outcomes measured according to the mental health condition targeted.

Figure 3

Conclusions reported by the authors for implementation outcomes.

Implementation Findings for eMental Healthcare Technologies for Anxiety

Table 2 outlines the implementation findings among eMental healthcare technologies for anxiety. Both positive (61%) [36,38,39,41,43,45,50,54,55,78,80] and mixed (39%) [37,40,42,46,48,49,79] findings were reported across 18 studies on technology acceptability. Positive results included high satisfaction and positive technology recommendations, with acceptability reported by parents [36,39,41,43,50], children [38,39,41,43,45,50,54,55,78,80], and healthcare providers [55]. Technology adoption was examined by 10 studies, with studies reporting positive (60%) [42-44,47,50,53] and mixed (40%) [45,46,55,79] findings for technology compliance and adherence. Of the 6 studies that examined appropriateness, 4 described positive results (67%) [39,50,51,78], such as positive attitudes and perceived helpfulness of the technology among healthcare providers [39,51], while 2 studies [53,55] reported mixed results (33%), including moderate usefulness and helpfulness of the program for the youth [53]. Of the 23 studies on anxiety-directed technologies, only one examined cost, reporting initial implementation challenges such as startup costs, the need for designated computers and clinic space, and technical assistance requirements [39]. Studies that examined the feasibility of anxiety technologies described more mixed (70%) [38-40,44,52,53,55] than positive (30%) [36,45,80] results, including barriers to participation such as finding time to complete tasks and ease of use. Only one study investigated technology penetration, reporting positive penetration with the technology purchased by 56 child psychiatric institutions or practitioners within 1 year [51]. Studies examining eMental healthcare technologies for anxiety did not investigate or report on fidelity or sustainability.
Table 2

Implementation findings among eMental healthcare technologies for anxiety.

Program and study | Participants (n) | Implementation outcome (measure^a); findings^b
Cool Little Kids Online
Morgan et al [36]

Parents of children aged 3-6 years with anxiety problems (n=51)

Acceptability (self-developed questionnaire); P: +

Feasibility (self-developed questionnaire); P: +

Camp-Cope-A-Lot
Salloum et al [40]

Parents of children aged 7-13 years with an anxiety disorder (n=100)

Acceptability (published instrument); P: +/–

Feasibility (published instrument); P: +/–

Storch et al [41]

Children aged 7-13 years with an anxiety disorder (n=49)

Acceptability (published instrument); C: +

Salloum et al [39]

Children aged 7-13 years with an anxiety disorder (n=3) and their parents (n=7)

Healthcare providers (n=3)

Project coordinators (n=3)

Administrators (n=3)

Acceptability (published instrument); P, C: +

Appropriateness (self-developed interview); HCP: +

Cost (self-developed interview); HCP, A: –

Feasibility (published instrument & self-developed interview); HCP, A, PC: +/–

Crawford et al [37]

Children aged 7-13 years with an anxiety disorder (n=17)

Acceptability (published instrument); C: +/–

Khanna and Kendall [38]

Children aged 7-13 years with an anxiety disorder (n=16)

Acceptability (published instrument); C: +

Feasibility (self-developed questionnaire); C: +/–

DARE Program
Vigerland et al [42]

Children (n=46) aged 8-12 years with an anxiety disorder and their parents (n=46)

Acceptability (published instrument); P, C: +/–

Adoption (program utilization); P, C: +

Vigerland et al [79]

Children aged 8-12 years with social phobia (n=30) and their parents (n=57)

Acceptability (published instrument); C: +

Adoption (program utilization); C: +/–

BiP OCD^c
Lenhard et al [80]

Adolescents aged 12-13 years with OCD (n=8)

Acceptability (self-developed interview); C: +

Feasibility (self-developed interview); C: +

Lenhard et al [78]

Adolescents aged 12-17 years with OCD (n=21)

Acceptability (self-developed questionnaire); C: +

Appropriateness (self-developed questionnaire); C: +

BRAVE ONLINE
Donovan and March [46]

Children aged 3-6 years with an anxiety disorder (n=23)

Acceptability (self-developed questionnaire); C: +/–

Adoption (program utilization); C: +/–

Anderson et al [47]

Children and adolescents aged 7-18 years with an anxiety disorder (n=132) and their parents (n=NR^d)

Adoption (program utilization); P, C: +

Spence et al [48]

Adolescents aged 12-18 years with clinical levels of anxiety (n=44)

Acceptability (adapted questionnaire); C: +/–

March et al [49]

Children aged 7-12 years with an anxiety disorder (n=40) and their parents (n=NR)

Acceptability (self-developed questionnaire); P, C: +/–

Spence et al [50]

Children and adolescents aged 7-14 years with clinical levels of anxiety (n=27) and their parents (n=NR)

Acceptability (self-developed questionnaire); P, C: +

Adoption (program utilization); P, C: +

Appropriateness (self-developed questionnaire); P: +

Cognitive bias modification
Reuland and Teachman [43]

Children and adolescents aged 10-15 years with social anxiety and their mothers (n=18 mother-child dyads)

Acceptability (self-developed interview); P, C: +

Adoption (program utilization); P, C: +

Ricky and the Spider
Brezinka [51]

Children and adolescents aged 6-13 years with OCD (n=18)

Healthcare providers (n=13)

Appropriateness (self-developed questionnaire); HCP: +

Penetration (uptake by practices); HCP: +

Cool Teens
Wuthrich et al [53]

Adolescents aged 14-17 years with an anxiety disorder (n=24)

Adoption (program utilization); C: +

Appropriateness (self-developed questionnaire); C: +/–

Feasibility (self-developed questionnaire); C: +/–

Cunningham et al [52]

Adolescents aged 14-18 years with an anxiety disorder (n=22)

Nonclinical adolescents (n=13)

Feasibility (self-developed questionnaire); C: +/–

Cunningham and Wuthrich [44]

Adolescents aged 14-16 years with an anxiety disorder (n=5)

Adoption (program utilization); C: +

Feasibility (self-developed questionnaire); C: +/–

Virtual School Environment
Sarver et al [55]

Children aged 8-12 years with a principal diagnosis of social anxiety disorder (n=17)

Healthcare providers (n=NR)

Acceptability (self-developed questionnaire); C, HCP: +

Adoption (program utilization); C: +/–

Appropriateness (self-developed questionnaire); C: +/–

Feasibility (successful use & technical difficulties); C, HCP: +/–

SmartCAT App
Pramana et al [45]

Children and adolescents aged 9-14 years with a diagnosis of GAD^e, social or specific phobia, attention deficit hyperactivity disorder, oppositional defiant disorder, or social anxiety disorder (n=9)

Acceptability (self-developed questionnaire); C: +

Adoption (program utilization); C: +/–

Feasibility (published instrument); C: +

Self-help
Tillfors et al [54]

Adolescents aged 15-21 years with social anxiety disorder (n=10)

Acceptability (self-developed questionnaire); C: +

^a Self-developed questionnaire/interview: bespoke questions or survey items created by the researchers; published instrument: validated tool with citation in text; program utilization/physician adherence: metrics of usage.

^b C: child/adolescent/young adult report; HCP: healthcare provider report; P: parent report; A: administrator report; PC: project coordinator report; +: high/positive findings; –: negative findings; +/–: mixed findings.

^c OCD: obsessive-compulsive disorder.

^d NR: not reported.

^e GAD: generalized anxiety disorder.


Implementation Findings for eMental Healthcare Technologies for Depression

Table 3 displays the implementation findings among eMental healthcare technologies for depression. Most studies reported the technologies as acceptable (67%) with high satisfaction [56,61,63,72], recommendations for use [71], and acceptability and ease of use among children, parents, and healthcare providers [58]. The remainder (33%) reported mixed acceptability [60,70,81]. Of the 6 studies that examined adoption, one study (17%) described high usage [59], while the remaining studies (83%) described moderate or mixed adherence [56,63-65] and usage [60]. Appropriateness was the most commonly measured outcome among eMental healthcare technologies for depression, although results varied. Four studies (40%) reported high helpfulness [57,60-62], while 6 studies (60%) reported mixed outcomes [58,63,68-70,72]. Two studies examined cost outcomes [66,67] and described intervention implementation as economically viable. Of the 6 studies that investigated feasibility, 3 (50%) reported positive or high outcomes [60,62,69], while the other 3 (50%) described mixed ease of use [70,81] and attitudes [57]. Four studies examining fidelity reported mixed results [62-65], particularly for healthcare provider adherence to the program. The CATCH-IT program was the only intervention examined for penetration [66]. Although penetration was successful, with the technology implemented in 12 practices, several barriers to implementation were described, such as low levels of interest from healthcare providers and a lack of established procedures and guidelines [66]. Studies examining eMental healthcare technologies for depression did not investigate or report on sustainability.
Table 3

Implementation findings among eMental healthcare technologies for depression.

Program and study | Participants (n) | Implementation outcome (measurea); findingsb
SPARX
Merry et al [71]

Adolescents aged 12-19 years with depressive symptoms (n=94)

Acceptability (self-developed questionnaire); C: +

Depression Experience Journal
Demaso et al [72]

Primary caregivers (n=38) of hospitalized adolescents aged 8-19 years

Acceptability (self-developed interview); P: +

Appropriateness (self-developed interview); P: +/–

Behavioral activation intervention
Davidson et al [62]

Adolescents aged 12-17 years with clinical and subclinical depression (n=24)

Appropriateness (self-developed questionnaire); C: +

Feasibility (voiced opinions); C: +

Fidelity (voiced opinions); C: +/–

CBTc
Kobak et al [61]

Adolescents aged 12-17 years with clinical and subclinical depression (n=24)

Acceptability (published instrument); C, HCP: +

Appropriateness (self-developed questionnaire); HCP: +

Decision aid
Simmons et al [56]

Adolescents and young adults aged 12-25 years with mild to moderate-severe depression (n=66)

Acceptability (published instrument); C: +

Adoption (program utilization); C: +/–

Monitoring tool
Hetrick et al [58]

Adolescents and young adults aged 14-25 years diagnosed with depressive symptoms or a depressive disorder (n=101)

Healthcare providers (n=33)

Acceptability (self-developed questionnaire); C, HCP: +

Appropriateness (self-developed questionnaire); C, HCP: +/–

Hetrick et al [57]

Adolescents and young adults aged 15-25 years diagnosed with major depressive disorder (n=15)

Healthcare providers (n=7)

Appropriateness (self-developed questionnaire & interview); C, HCP: +

Feasibility (self-developed interview); C, HCP: +/–

Rebound
Rice et al [59]

Adolescents and young adults aged 15-24 years in partial or full remission of major depressive disorder (n=42)

Adoption (program utilization); C: +

MAYA
Carrasco [81]

Female adolescents aged 12-18 years with symptoms of depression (n=15)

Healthcare providers (n=5)

Acceptability (self-developed questionnaire); C: +/–

Feasibility (self-developed questionnaire); C: +/–

iDOVE
Ranney et al [60]

Adolescents aged 13-17 years at high risk for depression and with a past-year history of physical peer violence (n=16)

Acceptability (adapted published instrument); C: +/–

Adoption (program utilization); C: +/–

Appropriateness (self-developed interview); C: +

Feasibility (adapted published instrument); C: +

CATCH-IT
Gladstone et al [69]

Adolescents and young adults aged 14-21 years with subthreshold depression (n=83)

Appropriateness (adapted questionnaire); C: +/–

Feasibility (adapted questionnaire); C: +

Ruby et al [67]

Adolescents and young adults aged 14-21 years with subthreshold depression (n=83)

Cost (economic analysis); C: +

Eisen et al [70]

Adolescents and young adults aged 14-21 years with subthreshold depression (n=83)

Healthcare providers (n=63)

Acceptability (self-developed questionnaire); HCP: +/–

Appropriateness (self-developed questionnaire); HCP: +/–

Feasibility (self-developed questionnaire); HCP: +/–

Iloabachie et al [68]

Adolescents and young adults aged 14-21 years with subthreshold depression (n=83)

Appropriateness (adapted questionnaire); C: +/–

Van Voorhees et al [66]

Adolescents with subthreshold depression (n=83)

Primary healthcare providers (n=12)

Healthcare settings (n=5)

Cost (marketing strategy success/cost reporting); HCP: +

Penetration (uptake by practices); HCP: +

Van Voorhees et al [63]

Adolescents and young adults aged 14-21 years with subthreshold depression (n=83)

Acceptability (self-developed questionnaire); C: +

Adoption (program utilization); C: +/–

Appropriateness (self-developed questionnaire); C: +/–

Fidelity (physician adherence); HCP: +/–

Van Voorhees et al [64,65]

Adolescents and young adults aged 14-21 years with subthreshold depression (n=83)

Adoption (program utilization); C: +/–

Fidelity (physician adherence); HCP: +/–

aSelf-developed questionnaire/interview: bespoke questions or survey items created by the researcher; published instrument: validated tool with citation in text; program utilization/physician adherence: metrics of usage.

bC: child/adolescent/young adult report; HCP: healthcare provider report; P: parent report; +: high/positive findings; –: negative findings; +/–: mixed findings.

cCBT: cognitive behavioral therapy.

Implementation Findings for eMental Healthcare Technologies for Anxiety and Depression

Table 4 shows the implementation findings among eMental healthcare technologies for both anxiety and depression. All 5 studies examined the acceptability of technologies aimed at treating anxiety and depression. Of these, 3 (60%) reported high satisfaction [73-75], with children and parents describing that they would not change any aspects of the program, and 2 studies (40%) reported moderate satisfaction [76,77]. Two studies examined adoption and reported low adherence to program sessions [75] and high website usage rates [77]. Of the 2 studies that examined appropriateness, both found positive attitudes and perceived helpfulness of the intervention from children and parents [73] and healthcare providers [74]. One study examined penetration of technology, reporting successful integration of the program into a practice of 2000 healthcare providers [74]. None of the 5 studies examined cost, feasibility, fidelity, or sustainability.
Table 4

Implementation findings among eMental healthcare technologies for both anxiety and depression.

Program and study | Participants (n) | Implementation outcome (measurea); findingsb
Group therapy
Sapru et al [73]

Children aged 6-12 years referred with a mood or anxiety disorder (n=16) and their parents (n=NRe)

Acceptability (self-developed questionnaire); C, P: +

Appropriateness (open-ended feedback); C, P: +

Treasure Hunt
Brezinka [74]

Children and adolescents aged 6-19 years with anxiety, depression, ODDc, or ADHDd (n=218)

Healthcare providers (n=124)

Acceptability (self-developed questionnaire); C: +

Appropriateness (self-developed questionnaire); HCP: +

Penetration (uptake by practices); HCP: +

SPARX
Bobier et al [75]

Adolescents aged 16-18 years with severe psychiatric disorders (namely mood and anxiety disorders; n=20)

Acceptability (self-developed questionnaire); C: +

Adoption (program utilization); C: –

Problem-solving therapy
Hoek et al [76]

Adolescents and young adults aged 12-21 years with self-reported or parent-reported mild to moderate depressive or anxiety symptoms (n=22)

Acceptability (published instrument); C: +/–

RU-OK
Ercan et al [77]

Adolescents aged 13-15 years attending a hospital school for depression and anxiety (n=105)

Acceptability (self-developed questionnaire); C: +/–

Adoption (program utilization); C: +

aSelf-developed questionnaire/interview: bespoke questions or survey items created by the researcher; published instrument: validated tool with citation in text; program utilization/physician adherence: metrics of usage.

bC: child/adolescent/young adult report; HCP: healthcare provider report; P: parent report; +: high/positive findings; –: negative findings; +/–: mixed findings.

cODD: oppositional defiant disorder.

dADHD: attention deficit hyperactivity disorder.

eNR: not reported.


Discussion

Principal Findings

Complementary to recent reviews [12,82], this systematic review reports on how the implementation of eMental healthcare technologies for children and adolescents with anxiety or depression has been studied and reported. The majority of studies included in the review were RCTs, and the methodological quality of studies was scored as moderate to high in all but a few cases. Broadly synthesized using Proctor’s [31] 8 dimensions of implementation, our review suggests that measures of acceptability, adoption, and appropriateness are more frequently reported than indicators of cost, fidelity, and sustainability. Further, the review highlights the lack of measurement precision around implementation constructs and the need to elucidate the relationship between implementation and effectiveness. Below, we highlight 5 key implications of our findings for advancing this emerging literature. Results derived from new lines of research can have significant practical value for decision-makers and administrators by informing the design of training, helping promote provider engagement, assisting in troubleshooting the obstacles that adolescents and parents encounter, and guiding projects that scale up interventions in new contexts.

Improving the Validity of Acceptability Measures

The vast majority of studies included in the review examined some dimension of acceptability, signifying that this construct is important as an indicator of effective implementation, but its measurement varied. Satisfaction, a frequent acceptability metric, was reported as high among participants (generally >70%), but was largely derived from self-reports of parents and adolescents taken at a single time-point (typically posttreatment). This means we still know little about satisfaction/dissatisfaction among those who fail to complete the treatment, or how early perceptions of satisfaction might impact effort and adherence during the later stages of treatment. More than half of the studies used nonvalidated measures of acceptability, which are problematic for assessing reliability and psychometric sensitivity. Given that almost all validated psychiatric patient satisfaction measures are validated for adults (not children and adolescents) and that developmental age affects perceptions of satisfaction with healthcare [83], our findings raise the possibility of overestimated satisfaction ratings within this literature. Low actual adherence rates reported in many studies, particularly those treating depression [84], suggest that we need to know more about the relationship among satisfaction, adherence, and clinical improvement. Most importantly, differences between those who do and do not respond to inquiries about service satisfaction (ie, bias in nonresponse [85]) and the impact of novelty (ie, bias resulting from perceived “new” or “innovative” technology [86]) imply that satisfaction is a potentially tendentious implementation metric. Without psychometrically strong and developmentally appropriate measures of satisfaction and acceptability for eMental health, stakeholders run the risk of focusing on the wrong “pragmatic” attributes when determining if a given adolescent-focused intervention is worth long-term investment. 
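The inflation that nonresponse bias can introduce into posttreatment satisfaction ratings is straightforward to illustrate. The sketch below is a minimal simulation with synthetic data; the 1-5 satisfaction scale and the response probabilities are purely hypothetical assumptions, not values drawn from any included study.

```python
import random

random.seed(1)

# Synthetic cohort: "true" satisfaction scores on a 1-5 scale.
cohort = [random.choice([1, 2, 3, 4, 5]) for _ in range(10_000)]

def responds(score):
    # Hypothetical assumption: the probability of completing the
    # posttreatment survey rises with satisfaction (illustrative values).
    return random.random() < {1: 0.2, 2: 0.3, 3: 0.5, 4: 0.8, 5: 0.9}[score]

responders = [s for s in cohort if responds(s)]

true_mean = sum(cohort) / len(cohort)
observed_mean = sum(responders) / len(responders)

print(f"true mean satisfaction:     {true_mean:.2f}")
print(f"observed mean (responders): {observed_mean:.2f}")
```

Even with a uniform true distribution, the responder-only mean drifts upward because low scorers are underrepresented in the posttreatment sample, which is one mechanism by which single-time-point satisfaction surveys can overestimate acceptability.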
As a metric frequently used to inform decision-making around service delivery, a more systematic approach to instrumentation around the acceptability construct is vital. Future research can use pragmatic trial designs and hybrid effectiveness-implementation designs that aim to elucidate mechanisms of action between acceptability and effectiveness.

Reframing Adoption as Process, Not Product

While reporting on adoption (namely adherence) was fairly common in the studies we reviewed, authors reported mixed findings. Moreover, none of the studies in our review formatively examined adolescent, parent, or clinician adoption in terms of readiness for eMental health, intent-to-use, or ongoing decision-making. All of these factors play a central role in behavior change associated with effective mental health treatments [87-89]. As adoption continues to be conceptualized in the literature primarily as a posthoc measure of “adherence,” our review suggests process-related measures of adoption could be a valuable new line of research. Most of the studies in our review reported on interventions involving multiple sessions (ie, anxiety interventions had a minimum of 8 sessions), requiring the youth to sustain and repeat interactions over time. Research from other fields, like adolescent online learning and gaming, could provide important insights here. For example, research has shown that young people’s internet self-efficacy, self-regulatory skills, and perceived quality of online learner-instructor interaction are important predictors of online engagement [90]. Rather than viewing adoption as a relatively stable end-product of individual effort, emphasis should be placed on understanding the situated, mutually constitutive relationship between a young person and the eMental healthcare environment.
For example, use of modeling and path analysis techniques to identify the direct and indirect effects of provider (eg, therapeutic alliance, communication style) and technical (eg, persuasive system design components) or therapeutic (eg, comorbidity, treatment credibility) factors on adoption may provide valuable and practical insights. Improved knowledge of these processes could help administrators design training, promote provider engagement, and pre-emptively address obstacles for youth and their families.
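The path-analytic logic behind separating direct and indirect effects on adoption can be sketched with a simple product-of-coefficients decomposition. In the sketch below, the constructs (alliance, credibility, adoption), the effect sizes, and the data are all hypothetical, chosen only to show how the two kinds of effect are recovered from plain least-squares fits.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2_000

# Hypothetical standardized constructs (synthetic data):
# alliance -> credibility -> adoption, plus a direct alliance -> adoption path.
alliance = rng.normal(size=n)
credibility = 0.5 * alliance + rng.normal(size=n)                   # a-path = 0.5
adoption = 0.3 * alliance + 0.4 * credibility + rng.normal(size=n)  # c' = 0.3, b = 0.4

def ols(y, *xs):
    """Least-squares slopes (intercept dropped) for y regressed on xs."""
    X = np.column_stack([np.ones_like(y), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

(a,) = ols(credibility, alliance)            # mediator model
c_prime, b = ols(adoption, alliance, credibility)  # outcome model

print(f"direct effect   (c'):  {c_prime:.2f}")
print(f"indirect effect (a*b): {a * b:.2f}")
print(f"total effect:          {c_prime + a * b:.2f}")
```

In practice, dedicated structural equation modeling software with appropriate uncertainty estimates (eg, bootstrapped confidence intervals for the indirect effect) would replace these bare least-squares fits, but the decomposition itself is what makes provider-level and technical factors separable influences on adoption.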

Perceived Suitability of eMental Healthcare for Adolescents With Anxiety and Depression

A little less than half of the studies testing depression interventions and a third of those focused on anxiety measured some dimension of “appropriateness,” with many reporting overall mixed results. Perceptions about the suitability of a given healthcare service in a particular setting, for a particular purpose, with a particular provider and clientele can be a function of organizational culture and climate, as well as public opinion. In practice, eMental healthcare is still considered outside standard practice by most youth mental health service providers [91]; yet, investments in eMental healthcare are rarely withdrawn because of purported safety risks or over concerns about the quality of care. This suggests that administrators, providers, and the general public feel that eMental healthcare is an appropriate treatment modality, but still continue to prioritize its use in some contexts over others. It may be because the treatment ideologies (ie, beliefs about the etiology of illness, the roles of the provider/patient, and the efficacy of various treatments) [92] held by clinicians, parents, and children/youth lead them to greater skepticism about whether eMental healthcare can deliver the same quality of care [93] as face-to-face services for children and adolescents. In particular, public opinion and clinician beliefs about depression-associated risks (eg, suicide, self-harm [94]), privacy [95], and the changes in provider-patient interaction via eHealthcare delivery [96] could impact perceived appropriateness. This could be one explanation for the higher acceptability rates of anxiety-focused interventions than of depression-focused ones.
Research on appropriateness would benefit from an exploration of how eMental healthcare treatment ideologies develop for different clinical contexts (ie, diagnosis, severity) and technological modalities (ie, teleconsultation, mobile apps, SMS text messaging) and assess their subsequent influence on other implementation factors. These lines of research could eventually assist providers in selecting eMental healthcare technologies to match the intensity of treatment with the complexity of the condition (ie, stepped care).

Disruption of Established Professional Roles, Responsibilities, and Working Styles

Findings from this review also make an important contribution to expanding our understanding of feasibility. The feasibility results observed in our review were most frequently related to provider-level concerns (eg, issues of training, need for technical support). This suggests that the workflow impacts of eMental health services are a vital area for future implementation research. Given that most of the eMental healthcare technologies in our review included some form of healthcare provider interaction before or during the treatment, the provider role should not be underestimated. Many of the studies described atypical interactions for providers trained in traditional psychotherapy, including SMS text messages, frequent short emails, bidirectional electronic exchanges, and technical support. Our review echoes recent calls to move beyond simplistic analyses of barriers and facilitators to models of feasibility that allow researchers to test how eHealth modalities disrupt established professional roles, responsibilities, and working styles [97]. We recommend increased emphasis on underdeveloped implementation outcomes like feasibility, where few comprehensive and validated instruments exist [98]. Knowledge generated from this research could inform strategic targeting of resources and the tailoring of implementation strategies at an early stage to maximize opportunities for normalization of new eMental health workflows. Studies in our review were limited by small sample sizes and were mostly focused on measuring clinician attitudes, with a lesser focus on quantifying actual clinician behaviors. Policy-focused research involving clinical practice models for eMental health [99], effective training practices for eMental health, and guidelines for selecting safe and effective eMental health tools are needed to shape behaviors that will make eMental health feasible in routine care settings.

Toward Sustainable, Cost-Effective, Scalable eMental Health for Anxiety and Depression

Finally, this review highlights persistent gaps in the measurement of fidelity, penetration, and sustainability constructs. These implementation facets are important macro-level determinants of policies and strategies for technology integration [100,101]. However, because these factors often require longer-term follow-up to adequately assess, they pose unique methodological challenges for researchers. For example, sustainability and penetration constructs typically require very large sample sizes that are hard to obtain [102], and there is concern that current methodological approaches for eMental healthcare technology have a long lag time from initiation to the publication of outcome or implementation findings. While the promise of scalable, more cost-effective treatments is widely argued in eMental health planning, there are knowledge gaps pertaining to how these services are costed, billed, and supported in the long term. As implementation research matures in this area, it will be critical to apply research methodologies that optimize the ecological validity of constructs and address these practical, real-world implications [103]. The use of structured, theory-driven implementation methodologies would provide flexibility to allow interventions to be adapted for use in routine care settings [99,104].

Limitations

Although this review was rigorous, carefully executed, and employed a robust methodological approach, it is not without limitations. Technologies deployed in healthcare systems without scientific investigation or reported implementation data were not available for our review. We did not search databases such as the NIH RePORTER, which may have yielded additional eMental healthcare technology studies. While some of the studies indexed in the NIH RePORTER may have been additionally registered in the ClinicalTrials.gov registry after our search, some may not have been. Given that eMental healthcare technologies are constantly appearing and disappearing from the behavioral health service landscape, published accounts of the state of this field will likely always be slightly outdated. This is true for all eHealth-related research syntheses, and it only underscores the need to promote an “evergreen” mentality for research that acknowledges that the evidence base is always evolving. The inclusion of multiple study designs created a challenge for summarizing study features and generalizing study findings. Nonetheless, this approach allowed for the comparison of the different kinds of evidence that shape real-world policy and service delivery. By not limiting our search based on study design, but rather reporting on quality via a validated appraisal tool, we established a starting point for broad critical appraisal. Finally, the inconsistent use of eHealth terminology [105-107] across the literature required us to make judgment calls regarding how to group implementation data across the 8 outcome categories. This could have resulted in the misclassification of some factors into the wrong outcome category [31].

Conclusions

Acceptability of eMental healthcare technology appears to be high among users, and it is the most commonly investigated implementation outcome. Perceptions of the appropriateness of eMental healthcare technology for use in healthcare varied, as did the adoption of technologies in healthcare practice. These findings suggest that the implementation science of eMental health for adolescent anxiety and depression needs to mature. Validated implementation measures as well as research designs and analytic techniques that model complex interactions and implementation contexts should be pursued in earnest. Future studies should help bridge gaps in knowledge about the fidelity of eMental health interventions over time and how eMental health technologies spread through the healthcare system, direct and indirect costs, and sustainability models. Closing these knowledge gaps has the potential to make treatments more accessible and reduce the burden of anxiety and depression on affected children and adolescents.
References (88 in total; first 10 shown)

1. E-mental health: a new era in delivery of mental health services. Helen Christensen; Ian B Hickie. Med J Aust, 2010.

2. Factors that promote or inhibit the implementation of e-health systems: an explanatory systematic review. Frances S Mair; Carl May; Catherine O'Donnell; Tracy Finch; Frank Sullivan; Elizabeth Murray. Bull World Health Organ, 2012.

3. Global burden of disease attributable to mental and substance use disorders: findings from the Global Burden of Disease Study 2010. Harvey A Whiteford; Louisa Degenhardt; Jürgen Rehm; Amanda J Baxter; Alize J Ferrari; Holly E Erskine; Fiona J Charlson; Rosana E Norman; Abraham D Flaxman; Nicole Johns; Roy Burstein; Christopher J L Murray; Theo Vos. Lancet, 2013.

4. Determinants of successful telemedicine implementations: a literature study. Tom H F Broens; Rianne M H A Huis in't Veld; Miriam M R Vollenbroek-Hutten; Hermie J Hermens; Aart T van Halteren; Lambert J M Nieuwenhuis. J Telemed Telecare, 2007.

5. A randomized controlled trial of the Cool Teens CD-ROM computerized program for adolescent anxiety. Viviana M Wuthrich; Ronald M Rapee; Michael J Cunningham; Heidi J Lyneham; Jennifer L Hudson; Carolyn A Schniering. J Am Acad Child Adolesc Psychiatry, 2012.

6. Interpretation bias modification for youth and their parents: a novel treatment for early adolescent social anxiety. Meg M Reuland; Bethany A Teachman. J Anxiety Disord, 2014.

7. Integrative internet-based depression prevention for adolescents: a randomized clinical trial in primary care for vulnerability and protective factors. Benjamin W Van Voorhees; Karen Vanderplough-Booth; Joshua Fogel; Tracy Gladstone; Carl Bell; Scott Stuart; Jackie Gollan; Nathan Bradford; Rocco Domanico; Blake Fagan; Ruth Ross; Jon Larson; Natalie Watson; Dave Paunesku; Stephanie Melkonian; Sachiko Kuwabara; Tim Holper; Nicholas Shank; Donald Saner; Amy Butler; Amy Chandler; Tina Louie; Cynthia Weinstein; Shannon Collins; Melinda Baldwin; Abigail Wassel; Mark A Reinecke. J Can Acad Child Adolesc Psychiatry, 2008.

8. Right choice, right time: Evaluation of an online decision aid for youth depression. Magenta B Simmons; Aurora Elmes; Joanne E McKenzie; Lyndal Trevena; Sarah E Hetrick. Health Expect, 2016.

9. Health Care Provider Adoption of eHealth: Systematic Literature Review. Junhua Li; Amir Talaei-Khoei; Holly Seale; Pradeep Ray; C Raina Macintyre. Interact J Med Res, 2013.

10. Outcomes for implementation science: an enhanced systematic review of instruments using evidence-based rating criteria. Cara C Lewis; Sarah Fischer; Bryan J Weiner; Cameo Stanick; Mimi Kim; Ruben G Martinez. Implement Sci, 2015.
