
The eye of the beholder: youths and parents differ on what matters in mental health services.

Gregory A Aarons, Jill Covert, Laura C Skriner, Amy Green, Donna Marto, Ann F Garland, John Landsverk.

Abstract

The goal of this study was to examine the degree to which youths and caregivers attend to different factors in evaluating their experiences with mental health programs. Youth (n = 251) receiving mental health services at community agencies and their caregivers (n = 275) were asked open-ended questions regarding the positive and negative aspects of the services. Qualitative analyses revealed some agreement but also divergence between youth and caregivers regarding the criteria by which services were evaluated and aspects of services that were valued most highly. Youths' positive comments primarily focused on treatment outcomes while caregivers focused more on characteristics of the program and provider. Youths' negative comments reflected dissatisfaction with the program, provider, and types of services offered while caregivers expressed dissatisfaction mainly with program characteristics. Results support the importance of assessing both youth and caregivers in attempts to understand the factors used by consumers to evaluate youth mental health services.


Year:  2010        PMID: 20140489      PMCID: PMC2977056          DOI: 10.1007/s10488-010-0276-1

Source DB:  PubMed          Journal:  Adm Policy Ment Health        ISSN: 0894-587X


“Beauty is in the eye of the beholder” (Margaret Wolfe Hungerford, née Hamilton, 1878)

Consumer and family member perspectives about mental health services are little understood and often poorly described. With an increasing focus on a balanced approach to quality of care, consumer perspectives are coming to the forefront. For example, there is growing consensus that quality services are best conceptualized as a combination of research evidence, clinical judgment, consumer choice, preference, and cultural fit (APA Presidential Task Force on Evidence-Based Practice 2006; Institute of Medicine 2001; Sackett et al. 2007). However, both community mental health and managed behavioral health care organizations frequently focus on global measures of consumer satisfaction rather than delving deeper into what consumers might consider to be important priorities of mental health care. This occurs in spite of the considerable ambiguity about the meaning of consumer satisfaction and the mixed evidence regarding the association of consumer satisfaction with outcomes of mental health services (Garland et al. 2000, 2003). In child and adolescent mental health services, youths and their caregivers are the primary stakeholders and are likely to have different perspectives, priorities, expectations, and concerns regarding treatment—each of which must be taken into account for successful treatment. A number of studies suggest that patient satisfaction ratings are not strongly related to clinical improvement (Garland et al. 2003; Kaplan et al. 2001; Lambert et al. 1998; Pekarik and Wolff 1996; Williams et al. 1998) whereas others indicate that clinical outcomes may be important determinants of patient satisfaction with mental health services (Garland et al. 2007; Liao and Sukumar 2005). 
These mixed results may be partly explained by research indicating that consumer satisfaction is influenced by a number of co-determinants including demographic variables, expectations for treatment, prior experiences, severity of patient pathology (Garland et al. 2000; Godley et al. 1998; Linder-Pelz 1982), duration of treatment (Garland et al. 2007; Godley et al. 1998; Lebow 1983; Rey et al. 1999), and youth motivation to enter treatment (Garland et al. 2000). However, studies involving these and other potential co-determinants of satisfaction also have produced mixed results (Garland et al. 2007; Martin et al. 2003), generating further confusion as to what satisfaction ratings indicate. There also is concern regarding the largely positive evaluations of services often reported in satisfaction surveys (Clark et al. 1999; Sarvela and McClendon 1987; Young et al. 1995). With little understanding of the meaning of satisfaction and limited variability in ratings, interpretation is difficult, which limits our ability to discern relevant information regarding what is and is not valued relative to mental health services. Further, it is possible that reported high levels of satisfaction reflect social desirability responses rather than true satisfaction (Young et al. 1995), again inhibiting rating interpretability. The inconsistencies and limitations associated with satisfaction measures suggest that a more thorough understanding of consumers’ reactions, expectations, and criteria used to evaluate services is needed (Garland et al. 2003; Nock and Kazdin 2001; Rey et al. 1999). Most studies of consumer satisfaction with mental health services use brief surveys to measure satisfaction. While surveys are probably the most efficient and least invasive method of investigating consumer perspectives, there are limitations to this method. 
Researchers have noted that the survey approach offers little in the way of explaining findings, especially regarding reasons for dissatisfaction (Godley et al. 1998; Martin et al. 2003; Perreault et al. 1993). Avis et al. (1997) argue that asking consumers to respond to survey questions predetermined by researchers or behavioral health care organizations limits the expression of consumers’ experiences. Indeed, consumers’ criteria are likely to differ from those of organization management and service providers. Despite these limitations, consumer satisfaction remains a widely used measure and quality assurance metric (Young et al. 1995). Thus it is important to develop procedures to enhance the usefulness of consumer input. In an attempt to address this issue, Williams et al. (1998) used in-depth interviews and discussions based on a satisfaction questionnaire to investigate whether and how service users evaluate mental health services. They found that patients described their experiences in positive or negative terms but that despite patients’ descriptions of negative experiences with services, satisfaction ratings did not reflect such experiences. Thus, quantitative measures may not capture the full range of consumers’ concerns. The use of a qualitative approach, on the other hand, may be a more effective method for gaining insight into the criteria by which consumers evaluate services. In fact, the use of open-ended qualitative measures is often considered essential to elicit a full range of values and experiences that might otherwise elude capture (Avis et al. 1997; Godley et al. 1998; Meehan et al. 2002). Such an approach may be more likely than satisfaction ratings to identify specific aspects of services that could be addressed through quality improvement efforts (Godley et al. 1998; Perreault et al. 1993). While research on youths’ perspectives regarding therapy is in its infancy (Young et al. 
1995), existing research suggests that youths and their caregivers typically differ on ratings of youth behaviors and behavior problems, psychological symptoms, and satisfaction ratings, and tend to show low levels of agreement regarding multiple other aspects of treatment (Garland et al. 2001; Godley et al. 1998; Lambert et al. 1998). Yeh and Weisz (2001) reported that only 37% of parent–child dyads agreed when asked about the presenting problems of youths referred to outpatient treatment. Garland et al. (2004) assessed agreement on desired treatment outcomes among adolescent-parent-therapist triads and found that while the different stakeholders report some similar desired outcomes at an aggregate level, 62 percent failed to agree on any desired outcomes. Given such low levels of agreement it is likely that youths and caregivers attend to different factors when evaluating quality and outcomes of services and may place different importance on both positive and negative experiences with programs and providers. Youths who enter treatment often do not perceive their own behaviors as problematic, have not voluntarily sought help (Schwab and Stone 1983), and once in treatment likely perceive services differently than their caregivers due to unique vantage points, experiences, and cognitions. For instance, Rouse et al. (1995) found that while children’s perception of treatment outcome may be influenced by the relationship with their therapist a parent might place more importance on their own perceptions of services. A study by Hawley and Weisz (2005) examining youth and parent alliance with therapists in community-based outpatient mental health clinics echoes these findings. 
The authors report that parent (but not youth) alliance was related to parent satisfaction with services while youth (but not parent) alliance was related to youth satisfaction, suggesting that youth and parents focus on those factors relevant to their own role in treatment when evaluating their satisfaction with services. Taken together, these findings demonstrate that youth perceptions should not be inferred from their caregivers’ views (Copeland et al. 2004), and that youth and caregiver perspectives on experiences regarding mental health services should be measured separately. Few studies, however, have focused on consumers’ actual experiences with services. We contend that consumers are likely to hold perspectives about mental health services that are more detailed than satisfaction ratings and can provide a more meaningful sense of what is important to youth and parent consumers in assessing their service experiences. Indeed, it has previously been noted that information about the specific aspects considered important when rating satisfaction will be of substantial benefit when identifying problems or needs and in implementing change (Pekarik and Wolff 1996; Rey et al. 1999; Riley et al. 2009). Youth and caregiver perceptions of services can affect participation and retention in services, compliance with treatment regimens and/or recommendations, and future engagement in mental health services (Brinkmeyer et al. 2004; Kazdin et al. 1997; McKay and Bannon 2004; Meyer et al. 2002; Morrissey-Kane and Prinz 1999), factors that have been shown to affect treatment outcomes. For example, retention is a significant predictor of positive outcomes for adolescents in mental health and substance abuse services (Henggeler et al. 1996). In summary, satisfaction is not necessarily a good indicator of consumer experiences with mental health services and is not strongly related to patient outcomes. 
Qualitative approaches may be more effective for gaining detailed insight into consumer perspectives. It is essential to consider both parent and youth viewpoints because of their distinct and unique perceptions. Finally, beyond providing significant insight into what is important to patients, a deeper understanding of youth and caregiver evaluative criteria could influence how services are carried out and how consumers are treated. The purpose of the present study was to identify and contrast the criteria by which youth and parents/caregivers evaluate services and the relative importance of these criteria. We used open-ended questions and qualitative analyses in order to identify: (a) positive and negative perspectives on received mental health services, and (b) differences between youth and parent perspectives on mental health services provided for youths.

Methods

Study Context

This study was conducted in one of the communities funded in Phase 2 of the Federal Comprehensive Community Mental Health Services for Children and Their Families (CCMHS) Program (see Holden et al. 2001). Structured interviews were conducted with youths who had received mental health services (e.g., intensive case management, outpatient counseling, wraparound services, mentoring, and family services) and their caregivers. At the close of each interview, youths and caregivers were independently asked to talk about their experience with services. Two prompts were given, one asking about positive aspects of services and the other asking about problems experienced with services (if any). The study was approved by the appropriate institutional review boards.

Participants

As part of a child and adolescent mental health services outcome study, 306 youths and their caregivers were interviewed. Of the 306 families interviewed at baseline, 22 were interviewed prior to the addition of the two qualitative questions that are the focus of the present study. From the remaining 284 families who had the opportunity to answer the qualitative questions, 251 youth and 275 caregivers were interviewed. The discrepancy in the number of youth and caregiver responders occurred for a number of reasons, such as the youth not being available for the interview, being incarcerated, refusing, or being out of state at the time of the interview. Youth participant ages ranged from 11 to 17 years (M = 15.5, SD = 1.9) and 67.7% were male. Of the 251 participating youth, 37.1% self-identified as Caucasian, 19.1% Hispanic, 19.1% African American, 2.4% Asian/Pacific Islander, and 4.8% Native American or Other. Youth were involved in an average of six different types of services over the study timeframe. Primary diagnoses of youths included an array of disorders. Thirty-seven percent were diagnosed with disruptive disorders, 20% depression, 10% anxiety, 13% attention deficit hyperactivity disorder, 3% bipolar disorder, 3% adjustment disorder, 3% substance disorder, and 9% classified as “other”. Diagnoses were drawn from administrative reports, largely from clinician-reported data. Most families were involved in multiple services, with outpatient therapy being the most frequently used service (73.4%). Of the 275 participating caregivers, 72% were biological mothers, 6.2% biological fathers, 3.3% foster parents, 4.7% grandparents, 3.6% aunts or uncles, 1.1% siblings, 2.2% other relatives, 4.4% staff, and 2.5% other (see Table 1).
Table 1

Descriptive statistics for caregiver sociodemographic factors

Variable                      N (%)
Education level^a
 Grade school                 84 (28.3)
 High school diploma          79 (26.6)
 Some college                 85 (28.6)
 College degree               14 (4.7)
 Graduate degree              18 (6.1)
Yearly income^b
 <$5,000                      18 (7.2)
 $5,000–$9,999                38 (15.1)
 $10,000–$14,999              53 (21.1)
 $15,000–$19,999              27 (10.8)
 $20,000–$24,999              24 (9.6)
 $25,000–$34,999              21 (8.4)
 $35,000–$49,999              19 (7.6)
 $50,000–$74,999              18 (7.2)
 $75,000–$99,999               4 (1.6)
 >$100,000                     5 (2.0)
Past year service use
 Outpatient                  201 (73.4)
 Residential/inpatient       120 (43.6)
 Substance abuse              91 (33.3)
 School-based                179 (65.1)
 Day treatment                86 (31.4)

Note. Total N = 251

^a There were eight missing data points; nine answered “not known” or “not applicable”

^b There were six missing data points, six refusals, and twelve responses of “not known”


Measures and Procedures

Attempts were made to contact and recruit all families who entered services with one of the four agencies funded under the CCMHS between April 1999 and September 2001 into the longitudinal outcome study. Informed consent was obtained from all caregivers and legal guardians, and assent was obtained from all youths who participated in the longitudinal outcome study. In cases where youths did not have a legal guardian to provide consent, ex parte orders were obtained from the appropriate courts. Participants were informed that the information they shared would remain confidential and would only be reported to agencies in aggregate form. Computer-assisted interviews were conducted with youth and caregivers in the home, residential treatment center, or detention center. Project staff were bilingual and bicultural interviewers trained and employed by a senior member of the research and evaluation team, and were independent from the agencies and services being evaluated. Staff underwent intensive instruction and training in interviewing techniques as well as the administration of each measure and qualitative interviewing. In the early stages of the project, interviewers reported that participants expressed a desire to talk about their individual experiences with the programs being evaluated. This issue was brought to the attention of the county mental health authority, and in consultation with program leaders and the county liaison, it was decided that the best way to address this need was to ask open-ended questions about the positive and negative aspects of participant experiences with the program at which they received services. The questions were intended to provide youths and caregivers the opportunity to candidly express their feelings, positive and negative, about the services being received. 
The two questions were: “What have you found to be most helpful in working with the program?” and, “What are some of the problems you’ve experienced, if any?” Interviewers were instructed to ask each question to both youths and caregivers individually, in private, and to record their responses verbatim. The Project Manager then sorted the qualitative responses by referring agency and reports were compiled after the removal of all identifying information. Program Managers and Contract Monitors from each agency were provided anonymous quarterly reports of responses from their families.

Qualitative Analyses

Qualitative analyses were conducted using the NVivo program to facilitate the identification and classification of responses into themes and categories predominant in the data. Main themes are represented by a primary node. Categories within each node are represented by subnodes. Two raters independently classified all youth and caregiver responses into predominant themes or nodes. A consensus panel of three investigators then met to review classifications and come to consensus on any discrepancies in classification. The frequencies of responses in each category were then tallied and rank ordered from most frequent to least frequent number of responses for youths and caregivers separately.
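To make the tallying and rank-ordering step concrete, a minimal sketch in Python follows. The category labels are drawn from the study's positive-comment categories, but the list of coded responses (and therefore the counts) is purely illustrative, not the study's data.

```python
from collections import Counter

# Each coded response is a category label assigned by the consensus panel.
# This list is hypothetical; only the labels come from the study.
coded_responses = [
    "types of service", "outcomes", "types of service",
    "provider characteristic", "outcomes", "types of service",
]

counts = Counter(coded_responses)
total = sum(counts.values())

# Rank categories from most to least frequent, reporting N (%).
for rank, (category, n) in enumerate(counts.most_common(), start=1):
    print(f"{rank}. {category}: {n} ({100 * n / total:.1f}%)")
```

Run on the illustrative list above, this prints "types of service" first (3 of 6 responses, 50.0%), mirroring the N (%) layout used in Tables 2 and 3.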

Results

Qualitative analyses of positive comments resulted in eight predominant categories of responses including: types of services (e.g. information and referrals, mentorship, coordination of services, recreation), outcomes (e.g. improved functioning in general and of the family), program characteristics (e.g. family focus, support, availability, consistency), basic needs (e.g. housing, transportation), provider (clinician or case manager) characteristics (e.g. nice, caring, liking), helpfulness of services (in general and with specific problems), communication (e.g. someone to talk to), and overall liking of the program. Analyses further revealed that caregivers and youth responded differently as demonstrated by the frequencies with which different aspects of services were mentioned. Table 2 presents the rankings of categories, number of positive responses, and most frequent types of responses for caregivers and youths. Caregivers provided 479 positive comments and youths provided 304 positive comments.
Table 2

Youth and caregiver positive response rankings

Ranking    Category                   Youth        Caregiver
Y    C                                N (%)        N (%)
1    1     Types of service           108 (35.5)   132 (27.6)
2    7     Outcomes                    50 (16.4)    19 (4.0)
3    2     Program characteristic      38 (12.5)   121 (25.2)
4    4     Basic needs                 29 (9.5)     41 (8.6)
5    3     Provider characteristic     28 (9.2)     99 (20.7)
6    5     Helpful                     25 (8.2)     31 (6.5)
7    6     Communication               23 (7.6)     25 (5.2)
8    8     Like program overall         3 (0.1)     11 (2.3)
Totals                                304 (99)     479 (100.1)

Note: Y = youth; C = caregiver

Qualitative analyses of negative comments revealed three predominant categories of responses including: dissatisfaction with services (e.g. didn’t like aspect of services, lack of needed services, delayed service inception), dissatisfaction with program (e.g. unable to engage family/youth, unreliable, lack of continuity, ineffective), and dissatisfaction with provider (e.g. don’t like, unreliable, unavailable). Youth negative comments were nearly evenly distributed across all three categories, with dissatisfaction with services having only a slightly higher frequency of comments than both program characteristics and provider characteristics. Dissatisfaction with program characteristics and provider characteristics were tied with respect to frequency of responses. Table 3 presents the rankings of categories, number of negative responses, and most frequent types of responses for caregivers and youths. Caregivers provided 82 negative statements and youth provided 28 negative statements.
Table 3

Youth and caregiver negative response rankings

Ranking    Category                           Youth       Caregiver
Y    C                                        N (%)       N (%)
1    3     Dissatisfaction with services      10 (35.7)   12 (14.6)
2^a  1     Dissatisfaction with program        9 (32.1)   42 (51.2)
2^a  2     Dissatisfaction with provider       9 (32.1)   28 (34.1)
Totals                                        28 (99.9)   82 (99.9)

Note: Y = youth; C = caregiver

aDenotes a tie

Examples of youth and caregiver positive and negative responses to the two open-ended questions are presented in the appendix, listed by category.

Discussion

The results suggest youths and caregivers attend to different factors at both the program level and individual provider level when evaluating their experiences with mental health services. Consistent with most existing patient satisfaction literature, caregivers and youths showed only minimal agreement in evaluations of services. While the positive and negative responses provided by youths and caregivers often overlapped in terms of broad categories of responses, they differed in the specific aspects of service mentioned and the frequency of these specific responses. Regarding positive perspectives of services, comments relating to types of services were the most frequent for both youth and caregivers. However, within this broad category there was considerable lack of agreement as to which specific services were valued. The majority of youths’ comments mentioned types of services related to recreation opportunities, school help, and life skills training as the most helpful aspects of the program, while the majority of caregivers’ positive comments focused on perceived coordination of services and information and referrals received. The one type of service that both youth and caregivers appeared to value highly was mentorship for the youth. Youth often cited having someone, like their counselor or mentor, to do activities with and to keep them out of trouble as the most helpful aspect of the program. Likewise, it was important to caregivers for the youth to be assigned a mentor who could act as a role model and who could get the youth involved in positive activities. The largest discrepancy to emerge was regarding the frequency of positive comments addressing treatment outcomes (e.g. improved family functioning, keeping youth on track). Youth positive comments referring to treatment outcomes ranked second in terms of response frequency (16.4 percent of responses), while such comments ranked seventh for caregivers (4.0 percent of responses). 
Youth valued a number of outcomes including having better self-esteem, feeling less depressed, and staying off of drugs. Youths also frequently mentioned outcomes relating to improved overall family functioning (e.g. improved communication and conflict resolution) and to their families reuniting, a finding consistent with previous research demonstrating that prior to treatment, youths often report desired outcomes related to family functioning (Garland et al. 2004; Hawley and Weisz 2005; Kazdin and Wassell 2000). Positive comments regarding provider characteristics were another source of substantial disagreement. Providers who were consistent, reliable, supportive, and who made an effort to show the family that they cared were valued by caregivers. Youth did not appear to attend to these qualities even when they did mention provider characteristics. With regard to negative comments, poor engagement/poor follow-through was the most frequently mentioned concern for both youth and their caregivers. However, results again indicate a certain degree of disagreement between youths and caregivers. Youths’ responses were approximately evenly split among dissatisfaction with services, the program, and the provider. In contrast, 51.2 percent of caregivers’ negative comments pertained only to dissatisfaction with program characteristics. It is important to note that for both caregivers and youths, the number of positive comments was far greater than negative comments. Because no limit was placed on the number of comments that could be made, this appears to represent a general positive valence regarding mental health services. This lends validity to quantitative studies of consumer satisfaction with mental health services that show generally positive ratings, and is in line with Williams et al.’s (1998) finding that even in the presence of negative experiences, services still tend to be evaluated positively. 
Overall, based on frequency of response type, our results add support to studies showing that different stakeholders tend to place differing levels of importance on different aspects of mental health services (Garland et al. 2004). Youths’ feedback focused on more relational aspects of services and on treatment outcomes, while caregivers emphasized concern with the basic logistics of service receipt such as receiving help with the coordination of services and having consistency in services and treatment. These results reinforce the need to take into account both youth and parent perspectives to ensure that service providers can address the needs important to each. The current findings also support the use of qualitative approaches in order to better understand positive and negative perspectives of youth and caregivers. For one, mental health service users may welcome the opportunity to express their opinions rather than completing satisfaction surveys. Qualitative measures provide a means to tap into consumers’ attitudes and feelings toward numerous aspects of service and treatment delivery and provide a much needed opportunity to gain insight into areas of dissatisfaction. Moreover, the variance in type and frequency of responses elicited through a qualitative approach reflects previous research suggesting that simply measuring satisfaction with services provides little meaningful information as to what factors influence reports of satisfaction. Our results provide evidence that youths and caregivers tend to determine whether they feel positive (i.e. are satisfied) about services received based on different sets of factors.

Limitations

The major limitation of this study is related to the timing of data collection. Due to the retroactive inclusion of the open-ended questions, a portion of youth-caregiver pairs were not given the opportunity to participate. From the remaining families, only 275 caregivers and 251 youth were interviewed, meaning that 24 caregiver responses did not have a corresponding youth response. This could have distorted the frequency of agreement and disagreement between youth and caregivers in the analyses. However, given the size of the sample used and the magnitude of disagreement within several domains, it is likely that the range of issues identified represents the bulk of concerns of this sample. Recall bias is another factor that may have affected results. Parents and youth were interviewed after they had been receiving services for some time. Thus, they may have reported perspectives based on the most recent and/or salient experiences. Finally, we can speculate that families who dropped out of treatment prior to data collection could have had more negative perceptions. For future studies, it would be of interest to question participants at multiple time points from initial visit through treatment termination.

Conclusions

The utility of assessing consumer or patient satisfaction for improving mental health services has been emphasized in recent years. However, while satisfaction surveys serve the needs of agencies and service providers, there remains a lack of understanding of the factors consumers use to evaluate services (Nock and Kazdin 2001). Evidence suggests that there are discrepancies between youth and caregiver perspectives, thus making it critical to understand both in attempting to improve youth services. Awareness of these differences can help identify areas to focus on in improving the quality of care for youth with mental health problems and has the potential to improve patient care and retention in community-care settings. Based on the present findings, providers are advised to improve initial and ongoing communications with caregivers (e.g. presenting program/treatment options, feedback on youth progress) as well as consistency in the delivery of services and reliability of counselors/mentors (e.g. scheduling, follow-up contact, worker continuity). Improvement in these areas could potentially increase patient retention by increasing parents’ willingness to bring their child to services and by ensuring that youth feel their time is valued by providers. It also appears that focusing on providing mentorship, opportunities to participate in more activities, and helping youth feel that they have a voice in their treatment could increase youth engagement. These findings are consistent with those from a similar study by Martin et al. (2003) where the authors made recommendations to several community mental health clinics in Kansas to direct administrative attention to worker retention, shorten delays between worker reassignments, and give more attention to including parents in treatment planning. Results from the present study have already been used in various ways by the programs involved in the study. 
Reported uses include troubleshooting problems with staff members and providing information at staff meetings regarding the needs of families specific to each agency. This study revealed that youths and caregivers do indeed evaluate services based on differing perspectives. It is important for future studies to continue this line of research in order to better guide mental health service providers as to the best ways to obtain and utilize patient input. It may be that we need to develop a different set of questions for youths versus caregivers, and continued research in this area can point to the kinds of questions providers should be asking consumers. Further, the responses obtained from qualitative approaches, such as the open-ended questions used in the present study, can be used to inform and improve the quality of quantitative measures of patient satisfaction used in health care settings. Future studies may also benefit from examining the specific youth-caregiver dyads to see if there are links between response types. This could provide a clearer view of how youths’ and caregivers’ perspectives differ. It also will be useful to assess and compare perspectives with actual outcomes along with the retention rates of youth in services. These data can then be used to better understand perceptions of services in relation to factors such as age, race, diagnosis, and family history. Identifying salient categories of concerns has the potential to inform the development of targeted strategies for fitting mental health services with the needs, preferences, and priorities of youths and their caregivers. While satisfaction surveys will likely remain an important tool for incorporating consumer perspectives into how services are implemented, the use of qualitative questions in conjunction with satisfaction surveys may be essential to examining and responding to the needs of both youths and caregivers.
Table 4

Example youth and caregiver comments for each service category

Types of service
  Youth: "They have gone to court with me and helped me out by talking to the judge and telling him how good I'm doing." / "They helped me with my anger."
  Caregiver: "They have an education advocate that is going to help her fight for an IEP." / "Having a male mentor for him and a female mentor for me, about the same age."

Outcomes
  Youth: "They're helping me to act better for when I go home." / "Getting me out of trouble and finding me programs that are helping me stay out of institutions."
  Caregiver: "Keeping him in line and going to school every day." / "He's responsible when he gets home. I don't worry about him smoking or stealing anymore."

Program characteristics
  Youth: "They are almost always available." / "They don't always talk about the things I have done wrong."
  Caregiver: "Trying to keep the family together instead of tearing it apart. They look at all the aspects of the family instead of just blaming the parents." / "It's an agency that can follow through with plans that they make."

Basic needs
  Youth: "They helped us get this apartment and got us financially stable." / "Helping me with transportation from getting to one place to another like getting my medicine, going to job interviews and school, for some community hours."
  Caregiver: "They help with food when I was out of work." / "Getting us into a stable home. If it wasn't for them, we would still be struggling in a one bedroom."

Provider characteristics
  Youth: "I can count on her with all my problems." / "She focuses [more] on the positive things than on the negative."
  Caregiver: "He has a way of talking to him, words that I can't come up with. [Youth] feels comfortable with him." / "She went way above what she was required to do for her job."

Helpful
  Youth: "They are there for me and help me. I don't know how they help me, they just do." / "They want to help me and no one is forcing them."
  Caregiver: "The things that she has recommended have been helpful." / "They seem really wanting to be helpful."

Communication
  Youth: "I feel more comfort talking with them than other people like my counselors or parents." / "They allow me to express my feelings to someone who can keep a secret."
  Caregiver: "Being able to talk to someone about things going on here at home. Good to have another adult to talk to." / "Somebody has finally heard what I am talking about."

Like program overall
  Youth: "I enjoyed it." / "Like everything."
  Caregiver: "It's a very good program." / "I was totally impressed."

Dissatisfaction with program
  Youth: "Sometimes they don't answer our calls or lag on what they're supposed to do." / "Lack of consistency."
  Caregiver: "Some of the time frames they have (they take a long time to get things going)." / "Staff turn around (turnover) rates."

Dissatisfaction with provider
  Youth: "My first worker was making decision for me and not letting me give my opinion." / "A couple times she showed up late and missed appointments."
  Caregiver: "The intensive case manager tends to act like the therapist and get into deeper issues than they should." / "She was always opinionated and we disagreed."

Dissatisfaction with services
  Youth: "I hate the team meetings." / "They stop coming when I went to JH so I don't think they really did a good job."
  Caregiver: "No bilingual therapist (family therapist)." / "Getting the services because of where I live, due to lack of services in the area."