Literature DB >> 35857762

'More than just numbers on a page?' A qualitative exploration of the use of data collection and feedback in youth mental health services.

Craig Hamilton1, Kate Filia1,2, Sian Lloyd1, Sophie Prober1, Eilidh Duncan3.   

Abstract

OBJECTIVES: This study aimed to explore current data collection and feedback practice, in the form of monitoring and evaluation, among youth mental health (YMH) services and healthcare commissioners; and to identify barriers and enablers to this practice.
DESIGN: Qualitative semi-structured interviews were conducted via Zoom videoconferencing software. Data collection and analysis were informed by the Theoretical Domains Framework (TDF). Data were deductively coded to the 14 domains of the TDF and inductively coded to generate belief statements.
SETTING: Healthcare commissioning organisations and YMH services in Australia. PARTICIPANTS: Twenty staff from healthcare commissioning organisations and twenty staff from YMH services.
RESULTS: The umbrella behaviour 'monitoring and evaluation' (ME) can be sub-divided into 10 specific sub-behaviours (e.g. planning and preparing, providing technical assistance, reviewing and interpreting data) performed by healthcare commissioners and YMH services. One hundred belief statements relating to individual, social, or environmental barriers and enablers were generated. Both participant groups articulated a desire to improve the use of ME for quality improvement and had particular interest in understanding the experiences of young people and families. Identified enablers included services and commissioners working in partnership, data literacy (including the ability to set appropriate performance indicators), relational skills, and provision of meaningful feedback. Barriers included data that did not adequately depict service performance, problems with data processes and tools, and the significant burden that data collection places on YMH services given their limited resources.
CONCLUSIONS: Importantly, this study illustrated that the use of ME could be improved. YMH services and healthcare commissioners should collaborate on ME plans and meaningfully involve young people and families where possible. Targets, performance indicators, and outcome measures should explicitly link to YMH service quality improvement, and ME plans should include qualitative data. Streamlined data collection processes will reduce unnecessary burden, and YMH services should have the capability to interrogate their own data and generate reports. Healthcare commissioners should also ensure that they provide meaningful feedback to their commissioned services, and local and national organisations collecting youth mental health data should facilitate the sharing of this data. The results of the study should be used to design theory-informed strategies to improve ME use.


Year:  2022        PMID: 35857762      PMCID: PMC9299353          DOI: 10.1371/journal.pone.0271023

Source DB:  PubMed          Journal:  PLoS One        ISSN: 1932-6203            Impact factor:   3.752


Introduction

The collection, analysis, and feedback of health services data plays an essential role in the improvement of health care [1-5]. Globally, shortcomings in the quality of mental health care have been identified and there is substantial interest in enhancing the use of data to address these. Opportunities for this include strategies designed to bring about changes in healthcare provider behaviour such as routine outcome measurement [6]; audit and feedback [5]; and monitoring and evaluation [2,7-13]. These strategies can improve care and patient outcomes but the effects are highly variable [5] and their potential has not been fully realised [14]. Knowing more about the conditions under which collection and feedback of data works to change practice, and identifying the barriers to its effective use, helps us to understand how to optimise it [15]. There is a recognised risk within the healthcare improvement field that the “effort invested in collecting information (which is essential) is not matched by effort in making improvement” [16].

This paper focuses on the use of monitoring and evaluation (ME) in the context of Australian youth mental health care. ME involves the systematic collection and analysis of program data (e.g. program activity, patient outcomes) to provide strategic information, which can be used for decision-making by program managers and healthcare commissioners. Monitoring is a continuous process which tracks progress in implementation and performance, often against indicators and targets [17]. Evaluation is a periodic activity, which can identify the extent to which intended objectives have been achieved and can provide insight into what has contributed to their achievement or non-achievement [17].

Youth mental health care in Australia

In response to the high burden and incidence of mental ill-health among young people, and inadequacies of the mental health system to meet their needs, numerous countries have developed and implemented youth mental health (YMH) services targeted to young people aged 12 to 25 years [18-22]. In Australia, YMH services are typically commissioned by 31 Primary Health Networks (PHNs) [23] and delivered by local or national non-government organisations. A significant proportion of services operate as part of a national franchise led by the headspace National Youth Mental Health Foundation (110 centres by 2019) [24,25]. There is recognition of the importance of the collection and feedback of data within mental health services [2,6,7,9,26], and the practice of ME is perceived as an integral component of the commissioning process and contemporary YMH service provision [18,24,27-29]. Despite this, healthcare commissioners report that they find it challenging to make meaningful use of ME data collected from YMH services [30]. Little is known about how YMH services and healthcare commissioners currently use ME and given its potential to contribute to the improvement of mental health care provided to young people, it is essential that we understand what helps and hinders its use. This study aimed to explore current ME practice among YMH services and healthcare commissioners in Australia, and to identify individual and environmental barriers and enablers to these practices.

Methods

Sampling and recruitment

Participants were purposively sampled to ensure representation across healthcare commissioning organisations and YMH services, from a variety of roles/responsibilities, and with good coverage of geographical areas within Australia. The research team, members of which were employed at Orygen, a national youth mental health organisation [31], sent an email invitation to an existing network of contacts working in healthcare commissioning organisations and YMH services (n = 240). Snowball sampling was also used, where recipients of the original email invitation were asked to forward the email to contacts they deemed relevant (based on the information provided in the email). A flowchart detailing participant recruitment to the two sample groups is provided in Fig 1.
Fig 1

Flow chart of participants’ recruitment to the study.

Interviews

Semi-structured interviews were conducted by the lead author (CH), a male completing a Master of Public Health, with supervision provided by a senior researcher (ED). Data collection and analyses were informed by the Theoretical Domains Framework (TDF) [32]. The TDF incorporates 33 theories of behaviour change and is used to explore and identify factors which inhibit or enable professional behaviour change [32-34]. CH and ED met frequently to review the topic guide, coding guidelines, interview recordings, and all coding, with ED providing TDF expertise where required. While an interview topic guide informed by the TDF [32] was developed, the researcher encouraged a natural flow to the interviews; as such, they were semi-structured, with topics addressed according to when and how they were raised by the participant. The topic guide was piloted with two mental health professionals with experience of managing YMH services and amended to improve clarity and reduce length. The topic guide is provided in S1 Appendix. Interviews were conducted between June 2020 and August 2020 using Zoom videoconferencing software, apart from one telephone interview. Due to the COVID-19 pandemic, most participants were at home during their interview, while a small minority were in a private office at their usual place of work. There were few internet connectivity issues during the Zoom interviews. Each interview was audio recorded and transcribed verbatim by a specialist transcription service. CH checked the transcripts to ensure accuracy.

Data analysis

Following guidance [34] on using the TDF in qualitative studies and under the supervision of ED, CH developed coding guidelines (“a set of explicit statements of how the TDF is to be applied to a specific data set” [34]). Transcripts were imported into QSR NVivo 12 [35] for analysis. A deductive approach was initially taken in which the researchers read the transcripts, considered the relevance of the data to the TDF’s domains and theoretical constructs, and then coded the data into one or more of the 14 theoretical domains [32,34]. This was followed by thematically analysing [36] the data coded to each theoretical domain to generate belief statements. A belief statement is a “collection of responses with a similar underlying belief that suggest a problem and/or influence of the beliefs on the target implementation problem” [34]. In line with standard practice in TDF studies [34,37], once coding was complete, three criteria were considered when judging the relevance of the TDF domains and associated belief statements to the target behaviour: (1) a high frequency of coding (≥80% of participants), (2) presence of conflicting belief statements, and (3) presence of strong beliefs which may impact behaviour. All transcripts were analysed by CH, while ED independently coded a subset of transcripts to check for consistency of coding. Differences in coding were discussed and the coding guidelines were iteratively revised until coding was acceptably consistent.

Ethical considerations

The study was approved by The University of Melbourne Centre for Youth Mental Health Human Ethics Advisory Group (Ethics ID: 2056869). All participants were provided with written study information and signed a consent form prior to interview.

Results

Participant overview

A total of 40 participants were recruited across the two sample groups and data saturation of themes was achieved. The healthcare commissioner sample included staff responsible for the youth mental health portfolio or staff involved in analysing YMH service data. The YMH services sample included management and other staff involved in ME at a commissioned YMH service. Interviews lasted between 40 and 70 minutes (M = 56.63 minutes). Participant characteristics are summarised in Table 1.
Table 1

Participant characteristics.

No. of participants: Healthcare commissioners 20; YMH services 20.

No. (%) of commissioning regions represented (of 31 total): Healthcare commissioners 18 (58.06%); YMH services 15 (48.39%).

Participant roles (no. of participants):

Healthcare commissioners:
    Manager or program officer for youth mental health* (17)
    Data or ME-related* (3)

YMH services:
    Middle management (10)
    Clinical management (4)
    Data or ME-related (3)
    Project management (2)
    Clinician (1)

Types of YMH services represented (no. of participants):

Healthcare commissioners: NA

YMH services:
    headspace centres (11)
    Other commissioned YMH services (9)

*All healthcare commissioner participants had experience of working directly with YMH services.

Current practice

The types of ME behaviours performed by healthcare commissioners and YMH services are shown in Table 2. Involvement in evaluation was mentioned by a few participants, but most ME activity related to monitoring only.
Table 2

ME behaviours performed by healthcare commissioners and YMH services.

ME behaviour: performed by healthcare commissioners (HC) / YMH services (YMH)

Planning and preparing for ME: HC Y; YMH Y.
Entering data into data systems: HC N; YMH Y.
Providing technical assistance to YMH services: HC Y; YMH N.
Retrieving data from data systems: HC Y; YMH Y.
Preparing monitoring reports for healthcare commissioners: HC N; YMH Y.
Analysing and visualising data: HC Y; YMH Y.
Providing feedback: HC Y; YMH Y.
Reviewing and interpreting data: HC Y; YMH Y.
Making decisions and taking action: HC Y; YMH Y.
Informal communication between healthcare commissioners and YMH services: HC Y; YMH Y.
Although there are commonalities in the types of behaviours, there is variation in how these behaviours are performed by services and commissioning organisations. While all healthcare commissioners require services to collect data, there are differences in the extent of these requirements. Some commissioners only require services to collect a nationally mandated primary mental health care minimum data set [38]. However, most commissioners require services to collect data in addition to this mandated data set and to provide monitoring reports (usually quarterly) which include quantitative data on service activities and outcomes, and qualitative data such as case studies or narrative. In addition to these formal monitoring mechanisms, many commissioners maintain informal communication with services to ensure they are up to date with what is happening and aware of any potential issues.

The degree to which commissioners and YMH services partner on ME varies. The ME planning process appears to be highly collaborative in some cases, and highly prescriptive in others. Similarly, some commissioners actively engage services in data-informed discussions (e.g. service development workshops), while other services report receiving little to no feedback on the reports they provide to their commissioners.

Who performs ME behaviours also varies across services. For example, some services have dedicated data or ME-related staff who can retrieve data from data systems, and analyse and visualise data. In other services, staff may perform these behaviours on top of their formal job role (e.g. clinicians preparing commissioner reports). Services that operate as part of the headspace franchise are supported by the headspace National Office, which collects and analyses data from all centres, and provides centres and their commissioners with data reports and access to an online data visualisation tool.

Domains analysis

Table 3 provides an overview of which TDF domains were relevant to each group's behaviours. Twelve of the 14 domains were relevant to healthcare commissioner behaviours and 11 to YMH service behaviours. S2 Appendix provides detailed information regarding the frequency of TDF coding and belief statements, the rationale for relevance, and illustrative quotes.
Table 3

TDF domains [32] and reasons for relevance/irrelevance.

Domain (definition); relevance (Y/N) and reasons for healthcare commissioners and youth mental health services.

Knowledge (an awareness of the existence of something). Healthcare commissioners: N (no evidence of strong beliefs that may impact behaviour). YMH services: N (no evidence of strong beliefs that may impact behaviour).

Skills (an ability or proficiency acquired through practice). Healthcare commissioners: Y (high frequency; strong beliefs that may impact behaviour; beliefs shared with YMH services). YMH services: Y (high frequency; beliefs shared with healthcare commissioners; strong beliefs that may impact behaviour).

Memory, attention and decision processes (the ability to retain information, focus selectively on aspects of the environment and choose between two or more alternatives). Healthcare commissioners: Y (strong beliefs that may impact behaviour). YMH services: N (low frequency; no evidence of strong beliefs that may impact behaviour).

Behavioural regulation (anything aimed at managing or changing objectively measured actions). Healthcare commissioners: Y (high frequency; beliefs shared with YMH services; strong beliefs that may impact behaviour). YMH services: Y (high frequency; strong beliefs that may impact behaviour; beliefs shared with healthcare commissioners).

Environmental context and resources (any circumstance of a person’s situation or environment that discourages or encourages the development of skills and abilities, independence, social competence, and adaptive behaviour). Healthcare commissioners: Y (high frequency; conflicting beliefs present; strong beliefs that may impact behaviour; beliefs shared with YMH services). YMH services: Y (high frequency; conflicting beliefs present; strong beliefs that may impact behaviour; beliefs shared with healthcare commissioners).

Social influences (those interpersonal processes that can cause individuals to change their thoughts, feelings, or behaviours). Healthcare commissioners: Y (high frequency; strong beliefs that may impact behaviour; conflicting beliefs present; beliefs shared with YMH services). YMH services: Y (high frequency; strong beliefs that may impact behaviour; conflicting beliefs present; beliefs shared with healthcare commissioners).

Social/professional role and identity (a coherent set of behaviours and displayed personal qualities of an individual in a social or work setting). Healthcare commissioners: N (low frequency; no evidence of strong beliefs that may impact behaviour). YMH services: N (low frequency; no evidence of strong beliefs that may impact behaviour).

Beliefs about capabilities (acceptance of the truth, reality, or validity about an ability, talent, or facility that a person can put to constructive use). Healthcare commissioners: Y (high frequency; strong beliefs that may impact behaviour). YMH services: Y (high frequency; conflicting beliefs present).

Emotion (a complex reaction pattern, involving experiential, behavioural and physiological elements, by which the individual attempts to deal with a personally significant matter or event). Healthcare commissioners: N (low frequency; no evidence of strong beliefs that may impact behaviour). YMH services: Y (strong emotions present).

Optimism (the confidence that things will happen for the best or that desired goals will be attained). Healthcare commissioners: Y (high frequency; demonstrated high level of optimism). YMH services: N (no evidence of strong beliefs that may impact behaviour).

Intentions (a conscious decision to perform a behaviour or a resolve to act in a certain way). Healthcare commissioners: Y (high frequency; strong beliefs that may impact behaviour; beliefs shared with YMH services). YMH services: Y (high frequency; strong beliefs that may impact behaviour; beliefs shared with healthcare commissioners).

Beliefs about consequences (acceptance of the truth, reality or validity about outcomes of a behaviour in a given situation). Healthcare commissioners: Y (high frequency; strong beliefs that may impact behaviour; conflicting beliefs present; beliefs shared with YMH services). YMH services: Y (high frequency; conflicting beliefs present; beliefs shared with healthcare commissioners; strong beliefs that may impact behaviour).

Goals (mental representations of outcomes or end states that an individual wants to achieve). Healthcare commissioners: Y (high frequency; strong beliefs that may impact behaviour; conflicting beliefs present; beliefs shared with YMH services). YMH services: Y (high frequency; beliefs shared with healthcare commissioners; strong beliefs that may impact behaviour).

Reinforcement (increasing the probability of a response by arranging a dependent relationship, or contingency, between the response and a given stimulus). Healthcare commissioners: Y (high frequency; strong beliefs that may impact behaviour; conflicting beliefs present). YMH services: Y (high frequency; strong beliefs that may impact behaviour).

Belief statements shared by healthcare commissioners and YMH services

In total, 100 belief statements relating to healthcare commissioner and/or YMH service behaviours were generated. All belief statements, reasons for relevance, and illustrative quotes can be found in S2 Appendix. There were 26 belief statements that applied to both healthcare commissioner and YMH service behaviours, which are summarised in Table 4. Each of these belief statements is subsequently described in further detail (with the relevant TDF domains in bold), as are those that were held by only one sample group.
Table 4

Belief statements shared by healthcare commissioners and YMH services.

Skills
    You need to be able to build relationships with other organisations.
    You need a good understanding of the YMH service context.
    You need to be data literate.
    You need to be able to empathise with YMH service staff.
    You need to be inquisitive and open minded.

Behavioural regulation
    Improvements in monitoring and evaluation planning.
    Improvements in data processes and tools.
    Greater access to data collected by headspace centres.

Environmental context and resources
    I am able to access staff with monitoring and evaluation-related skills.
    I lack the time to dedicate to monitoring and evaluation.
    My organisation is supportive of the use of monitoring and evaluation information.
    Data processes and tools are problematic.
    Commissioned services have limited funds to allocate to monitoring and evaluation.
    It can be difficult to use the data we receive from headspace National Office for monitoring and evaluation.
    It feels like headspace centres have two masters.

Social influences
    It’s useful to access the support of national youth mental health organisations (e.g. Orygen and headspace National).

Intentions
    Monitoring and evaluation is an integral part of the work we do.
    We intend on improving our use of monitoring and evaluation.

Beliefs about consequences
    It helps me to understand what is happening in the service and informs improvement.
    It helps to identify service issues (including risks and gaps).
    The data does not always accurately reflect what’s happening on the ground.
    Qualitative data is needed to contextualise quantitative data.
    Monitoring and evaluation is burdensome for services.

Goals
    I want to understand the experiences of young people and families.
    Monitoring and evaluation helps to ensure young people receive the best care possible and experience improved outcomes.
    Monitoring and evaluation should inform quality improvement.
Participants in both groups regarded ME as integral to their work (intentions). Many believed that ME should primarily be used to drive quality improvement (intentions) so that young people receive the best care possible and experience improved outcomes (goals). Numerous participants expressed a desire to improve the use of ME (intentions), and improving ME planning was seen as a key enabler of this (behavioural regulation). However, it was also widely acknowledged that ME is burdensome for services (beliefs about consequences) and that there are limited funds for them to allocate to it (environmental context and resources).

ME helps participants to understand what is happening in services, identifies service risks and gaps, informs service improvements, and guides healthcare commissioners on how they can support services (beliefs about consequences). Participants also had particular interest in using ME to understand the experiences of young people and families accessing services (goals). The inclusion of qualitative data in monitoring reports was regarded as essential by many, as it helps to contextualise quantitative data (beliefs about consequences).

“We receive monthly data and they’ve got a target and an achievement. Really they’re only numbers on paper, until you understand what they actually mean. So I find that the qualitative stuff behind the data is of equal importance, because it speaks to the data. I think that tells us the richest information.” (Participant 24, healthcare commissioner).

“I think it’s really the case studies that are particularly useful, because we can really get a good sense, ourselves, around what the presentations were for young people, what their goals were, what our evidence-based approaches were to meeting those goals, where the young person came to in their trajectory, and what the outcomes were for good, for bad, for otherwise, and also what the service impacts have been within service.” (Participant 6, YMH service).
Being data literate, inquisitive, and open minded were regarded as important ME skills by numerous participants (skills). Similarly, having a good understanding of the YMH service context (skills) was seen as important, as was being able to empathise with YMH service staff and to build relationships with organisations (e.g. service providers, healthcare commissioning organisations) (skills). One healthcare commissioner reflected on the value of having previously worked in a service:

“I understand the tensions within the work. Sure every service is different and I could never possibly say that I understand exactly what they’re encountering on a day to day basis… but as a general rule, having service delivery experience really does help you when you’re collaborating with providers.” (Participant 11, healthcare commissioner).

Both groups acknowledged that ME data does not always accurately reflect what happens on the ground in services (beliefs about consequences). Several YMH service participants noted that reporting data from a single or limited number of outcome measures provides only a partial insight into the difference their service makes.

“I don’t think that any of those measures should be taken individually. I think that would be reductionistic… they all need to be collected and viewed as a whole. I think to take any one of them individually and use that as the basis for the outcome is totally not valid.” (Participant 8, YMH service).

Commissioners provided a different perspective on this issue. One participant spoke of discovering that a service had withheld important information from the commissioner about challenges it was experiencing, while others spoke of the integrity of data sets being reduced by data entry issues within services. Problems with data processes and tools were also widely cited by both groups as barriers to ME (environmental context and resources).
“We’re just trying to enter things into multiple platforms and you do see differences in different platforms with even just caseloads and occasions of service numbers. They are slightly different and I think that’s because we’re trying to work across too many systems.” (Participant 28, YMH service).

“We have a database that the PHN [healthcare commissioning organisation] manages, which all of the service providers enter into… it does come with some challenges because the service providers often have a lot of difficulty—it’s not the best system. It’s quite limited in what it can do with reporting. So the service providers often have challenges in being able to export and being able to filter according to the KPIs…” (Participant 3, healthcare commissioner).

Healthcare commissioner belief statements

The value of engaging with services on an ongoing, informal basis was raised by many (beliefs about consequences), and there was a strong desire among participants to develop stronger partnerships with services, so they can support them with quality improvement (goals). It was, however, mentioned that ME can identify service issues which the commissioner may not be able to help resolve (beliefs about capabilities).

“I think the downside will probably be if I’ve found a need and I can’t support that… So if I’m aware of a gap or if I’m aware that someone is struggling and I can’t assist, I think that’s sort of a negative of evaluation.” (Participant 34, healthcare commissioner).

While some reported having little contact with other healthcare commissioning organisations regarding ME, many spoke of how they learn from and collaborate with other commissioners (social influences).

“I actually spoke to four other PHNs [healthcare commissioning organisations] to get their data to see what they collected and what some of their turnaround times were, which was fantastic. So we’ve done our own little benchmark study.” (Participant 12, healthcare commissioner).

The ability to develop appropriate expectations and performance indicators for commissioned YMH services was viewed as a vital skill by several participants (skills). One commissioner spoke of the dangers of setting inappropriate performance indicators:

“I think people underestimate how hard it is to develop a really good indicator… You have to be really careful because you can create perverse incentives.” (Participant 16, healthcare commissioner).

Some mentioned that the way in which the government measures healthcare commissioner performance incentivises a focus on service activity rather than service outcomes (reinforcement). Others spoke of how the national primary mental health care minimum data set is of limited use when monitoring and evaluating commissioned services (environmental context and resources).
“The PMHC-MDS [national primary mental health care minimum data set] is not fit for purpose. It has too many fields. It collects information that we don’t necessarily use or value. It creates a reporting burden for provider organisations that’s unnecessary and unwarranted.” (Participant 17, healthcare commissioner).

YMH service belief statements

Several service participants indicated that doing ME helps to ensure their service retains funding from their commissioner because it is a contractual obligation (reinforcement), while others spoke of wanting to use ME to demonstrate the difference their service makes (goals). However, most participants indicated that ME often takes a backseat to other priorities, such as attending to the needs of young people and staff (goals). Many felt that their commissioner actively supported them, but this feeling was not shared by all (social influences):

“The PHNs [healthcare commissioners] that I find helpful are the ones who are willing to work in partnership… there are commissioners who have described themselves as like an ATM: ‘you complete the transaction and we give you the money’. Whereas others are more likely to work in partnership, so really collaborative kind of decision making.” (Participant 31, YMH service).

Numerous participants expressed that they felt their commissioner’s expectations of their service were unrealistic (reinforcement). This related either to the volume of ME activity (i.e. data collection, reporting) required of services or to expectations about service performance. Some participants said they were worried about the potential consequences of not meeting the commissioner’s expectations (emotion).

“It can also make me feel nervous. I guess I had a lot of anxiety when we’d had to do the Q3 report when I’d first started and I had to put zero next to a lot of our KPIs. That was very anxiety provoking.” (Participant 10, YMH service).

Participants suggested that commissioners could help YMH services with ME by collaborating with them (and young people and families) on decisions about ME planning, streamlining reporting requirements, and improving feedback (behavioural regulation).
It was widely reported that staff need to feel that data collection is meaningful for them to actively engage in it, and that it is beneficial to create formal opportunities to discuss data with staff (behavioural regulation). While participants asserted that ME should benefit clinical practice (goals), there were mixed views about its impact (beliefs about consequences), particularly in regard to using outcome measures with young people. While several participants spoke about the value of using measures, some felt that measures focused on symptoms and problems can inhibit recovery-orientated practice. Participants also spoke of how clinicians value the use of data in their practice to varying degrees (social influences). For many, the use of ME helps to ensure that their service operates in an evidence-based way (beliefs about consequences).

“Without evaluation and reflection and looking at ourselves and looking at what we’re doing, we could be in the dark ages. We could be providing a service that is unhelpful… Evaluation means that we can’t not be focused on outcomes in the participant and their needs, and keeps us ethical, and keeps us up-to-date with best practice.” (Participant 33, YMH service).

Discussion

This study sought first to explore how data collection and feedback practice, in the form of monitoring and evaluation (ME), is used by YMH services and the organisations that commission them. Second, it aimed to identify the barriers to and enablers of ME use from the perspectives of both YMH services and healthcare commissioners. We found that ME is a complex set of behaviours: planning and preparing for ME; entering data into data systems; providing technical assistance to YMH services; retrieving data from data systems; preparing reports for healthcare commissioners; analysing and visualising data; providing feedback; reviewing and interpreting data; making decisions and taking action; and informal communication between healthcare commissioners and YMH services. While there were commonalities in the types of behaviours performed, there was variation in how they were performed by commissioning organisations and YMH services. Both groups identified numerous individual, social, and environmental barriers and enablers, many of which could be modified so that ME activity better supports improvements in the quality of service provision.

It was important to both commissioners and YMH services that data should drive service quality improvement. However, both groups raised concerns that data do not provide a fully accurate picture of service performance, and YMH services also felt that commissioners’ expectations of service performance were sometimes unrealistic or not meaningful. Difficulties with measuring quality in mental health care have been raised in previous literature [7,9,10,39]. In one study, mental health service managers reported that because the performance indicators set for them did not obviously relate to service performance, data collection was regarded as a compliance activity rather than an opportunity to identify potential service improvements [39].
Beliefs articulated by participants in the present study can also be related to Mannion and Braithwaite’s [40] taxonomy of dysfunctional outcomes of health performance measurement, which identifies 21 unintended or adverse consequences relating to poor measurement, misplaced incentives or sanctions, breach of trust, and politicisation of performance systems [40].

Young people having a positive experience of care was of the utmost importance to commissioners and services alike, and many therefore believed that data should provide meaningful insights that support them to improve patient experiences and outcomes. The literature suggests that providing clinicians with actionable feedback, presenting aspects of care delivery that are under their control and relevant to their job, has the greatest chance of making a difference to practice [41,42]. Yet, in this context, data collection focused primarily on the patient level, without a strong focus on the clinician-level activities that contribute to patient outcomes. A greater focus on clinician-level data in ME plans may help to ensure that data optimally contribute to improving the experiences of young people receiving care. While clinician-level data are critical to actionable quality improvement, patient-level data are also important in measuring quality of care and in clinical decision-making [2,6,7,13].

The role of outcome measurement was a topic of contention in this study. Participants reported variability in the value that clinicians place on using measurement in their practice and regarded mandated outcome measures as being of limited clinical utility or even a potential impediment to recovery-orientated practice. These issues are consistent with the literature on implementing outcome measurement in mental health settings [6,43-48]. To avoid the risk of outcome measurement becoming a purely bureaucratic exercise, it should be meaningful to clinicians, young people, and families and carers [6,45,49-52].
The dearth of clinically meaningful outcome measures designed for young people has been previously highlighted [53], although such measures are being developed [54,55]. It has also been argued that idiographic outcome measures (e.g. the Goal-Based Outcome Tool [56]) can help to facilitate person-centred care [51,57,58], and their use has been associated with improvements in young people’s satisfaction and engagement with mental health services [59,60].

Challenges relating to data processes and systems, and to minimum data sets, are well documented in the literature [7,9,39,43,61,62] and consistent with the results of the present study. It was common for participants to speak of the burden of having to use multiple systems, either because of a lack of interoperability between systems or because data were needed that were not available in the national primary mental health care minimum data set. It is a priority for commissioners that YMH services collect the minimum data set because those data are used by the Australian government to measure commissioner performance; yet for many commissioners, the minimum data set does not adequately capture service performance. This places commissioners in a challenging position: they are mindful that ME places a significant burden on YMH services, but it is difficult for them to meaningfully monitor and evaluate services without requiring the collection of additional data.

Lastly, commissioners and YMH services want to work in partnership, and such an approach may help to address some of these challenges. Services spoke of the benefits of commissioners being collaborative and forthcoming with meaningful feedback, and commissioners spoke of how valuable they found ongoing, informal communication with services (‘soft governance’), which aligns with the commissioning literature [62-64]. Both groups also regarded interpersonal skills, such as the abilities to empathise and build relationships, as essential. This corroborates existing research, which emphasises that a trusting relationship between the provider and recipient of feedback improves the likelihood that feedback will inform learning and improvement [65,66].

Strengths and limitations

The inclusion of both YMH service and healthcare commissioner perspectives, drawn from a wide spread of geographical regions, is a strength of the study. A limitation, however, is that a significant number of participants were recruited through the researcher’s professional network, creating a potential risk of self-selection bias. Given the significance that participants placed on understanding the experiences of young people and families, future research should include their views on ME activity.

The use of the Theoretical Domains Framework also has strengths and limitations. The TDF’s 14 domains, underpinned by 33 behavioural theories, enabled the identification of a wide range of barriers and enablers to ME. Systematically exploring each of the TDF’s domains in the interviews may have helped unveil barriers and enablers that would otherwise have been missed. The framework also allows the results of this study to be used in the development of strategies to enhance ME use, by mapping the relevant TDF domains to behaviour change theory (i.e. the behaviour change wheel approach to intervention design) [34,67]. However, it would be valuable to critically appraise the data when designing these strategies, as prior research shows that people tend to emphasise external factors (environment, social influences) rather than internal factors (knowledge, skills) as barriers to their own behaviour [68,69]. Finally, while the TDF is extensive in scope, it is possible that some barriers and enablers are not covered by its 14 domains.

Implications for practice

Several strategies emerge from this research that healthcare commissioners and YMH services should implement to ensure ME is meaningful. First, ME plans should be co-designed [70] and should meaningfully involve young people and families wherever possible. Targets, performance indicators, and outcome measures should be explicitly linked to YMH service quality improvement and, where possible, accompanied by clear examples of how improvements can be achieved. ME plans should also include qualitative data, such as case studies. Streamlined data collection processes will reduce unnecessary burden, and YMH services should have the capability to interrogate their own data and generate reports. Healthcare commissioners should ensure that they provide meaningful feedback to their commissioned services, and local and national organisations collecting youth mental health data should facilitate the sharing of these data. YMH services and commissioners should be given opportunities to build their ME capacity, supported by organisations with relevant expertise. Finally, additional investment will likely be required for YMH services and commissioners to implement these recommendations.

Implications for future research

Future research could identify what targets, performance indicators and outcome measures would be most appropriate to use in youth mental healthcare. Young people and families should be meaningfully involved in this research, particularly in the development and ongoing validation of outcome measures.

Conclusions

By using a theory-informed behavioural approach to explore the use of ME in youth mental health care, we found that current practice comprises numerous interrelated behaviours performed by YMH services and healthcare commissioners, and that there are many barriers and enablers to this activity at the individual, organisational, and broader environmental levels. Importantly, this study illustrated scope for improvement. The results of the study should be used to design theory-informed strategies to improve ME use. This would help to ensure that ME produces ‘more than just numbers on a page’ and leads to continuous improvements in the quality of mental health care provided to young people.

Interview topic guide (PDF).

Domains, beliefs, rationale table (PDF).

1 Apr 2022
PONE-D-22-06227
‘More than just numbers on a page?’ A qualitative exploration of the use of data collection and feedback in youth mental health services.
PLOS ONE. Dear Dr. Hamilton, Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. Please submit your revised manuscript by May 16 2022. Kind regards, Jason Scott, Academic Editor, PLOS ONE. Journal requirements: 1. Please ensure that your manuscript meets PLOS ONE’s style requirements, including those for file naming. 2.
Please provide additional details regarding participant consent. In the ethics statement in the Methods and online submission information, please ensure that you have specified (1) whether consent was informed and (2) what type you obtained (for instance, written or verbal, and if verbal, how it was documented and witnessed). If your study included minors, state whether you obtained consent from parents or guardians. If the need for consent was waived by the ethics committee, please include this information. Reviewers' comments: Reviewer's Responses to Questions. 1. Is the manuscript technically sound, and do the data support the conclusions? Reviewer #1: Yes; Reviewer #2: Partly. 2. Has the statistical analysis been performed appropriately and rigorously? Reviewer #1: N/A; Reviewer #2: No. 3. Have the authors made all data underlying the findings in their manuscript fully available?
Reviewer #1: Yes; Reviewer #2: Yes. 4. Is the manuscript presented in an intelligible fashion and written in standard English? Reviewer #1: Yes; Reviewer #2: Yes. 5. Review Comments to the Author. Reviewer #1: This is an interesting and well written manuscript on a topic that is important to ensure the sustainability of services. The conclusions drawn are appropriate based on the data presented and appropriate recommendations have been made. I thought the results were particularly well presented, highlighting the domains of the TDF. Throughout the manuscript authors should make explicit which behaviour is being performed and by whom. Abstract: 1. Design – ‘data was’ should be ‘data were’. Please check the full manuscript for this error. 2. Results – it’s unclear what ‘behaviour’ refers to here and what the ten types of behaviour are. How does this link to TDF? 3. Conclusions – Could the conclusions be better presented than a list? Makes it difficult to get a sense of impact. Introduction: 4.
Good and concise overview of the evaluation and feedback. 5. It is highlighted there are two main gaps 1) to understand more about how data is used, and 2) barriers/enablers for improvement based on data collected. But the aims for the manuscript do not align with these and instead look to give an account of how data is collected and the barriers to collecting data from these services. Could authors please update this. Methods 6. Within the study context – could authors please state what type of data is collected (clinical scores, wellbeing, engagement, etc.)? 7. The authors briefly discuss the use of zoom for most interviews. Could you please reflect on how this influenced data collection, e.g., were there any distractions for participants (such as shared office) that may have influenced answers? See for instance the below papers that examined benefits and drawbacks of Zoom: Oliffe, J. L., Kelly, M. T., Gonzalez Montaner, G., & Yu Ko, W. F. (2021). Zoom interviews: benefits and concessions. International Journal of Qualitative Methods, 20, 16094069211053522. Archibald, M. M., Ambagtsheer, R. C., Casey, M. G., & Lawless, M. (2019). Using zoom videoconferencing for qualitative data collection: perceptions and experiences of researchers and participants. International journal of qualitative methods, 18, 1609406919874596. 8. Could the authors please discuss whether they reached saturation / sufficient information power? e.g., whether they felt that saturation had been reached, they had reached sufficient information power, or whether they were time constrained? Results 9. Throughout the results section could authors please state the number of participants rather than using ‘few’, ‘many’, ‘some’, ‘most’, etc. 10. Could authors please make clear what the 27 shared belief statements are- either through a clear description or use of a table/figure for example. 11. 
There are only nine supporting quotes presented in text and while these are representative there is more scope to include additional data to provide further meaning behind each belief. Discussion: 12. Authors mention ‘a complex set of behaviours’ performed. Could authors please make explicitly clear which behaviours are being referred to. 13. Authors reflect on their use of the TDF across the strengths and limitations, implications for future research, and conclusions sections. This makes the discussion of the TDF feel fragmented and it would be best to have a central discussion within the main section. The TDF highlights important conclusions and could be discussed in greater depth. Appendix: 14. Could authors please check the appendix 2 table to ensure the illustrative quotes are representative of the specific beliefs. For example, Skills domain – Specific belief: You need to be data literate. I do not feel the quote is representative and instead reflects collaboration and belief about capability. 15. Could authors please check for syntax errors within the table (use of quotation marks, indenting text, missing letters from beginning of words). Reviewer #2: This manuscript presents the results of a qualitative study of beliefs about monitoring and evaluation (ME) from mental health commissioners and staff associated with youth mental health services in Australia. Forty separate 40-70-minute semi-structured interviews were conducted with participants recruited from a pool of 240+ potential participants using the authors’ social network and a snowball approach (asking first-level contacts to forward the invitation to others). A strength of the study is the use of a theory-based approach to developing the qualitative coding scheme using the Theoretical Domains Framework (TDF) to organize the a priori categorization of themes arising from the interviews.
Study findings have the potential to further our understanding of ME in youth mental health services both in terms of use and implementation. However, there were several significant limitations that would need to be addressed to improve the scientific rigor and allow readers to better assess the implications of the findings. First, the introduction mixes very separate conceptual areas by combining into a single construct the notions of routine outcome monitoring, audit and feedback, and monitoring and evaluation. While there is some overlap amongst these strategies for informing patient care and quality improvement efforts, they are different. Routine outcome monitoring, for example, has a single reference for a 2019 systematic review (6). There are dozens of studies and several systematic reviews and meta-analyses on this evidence-based practice available (e.g., de Jong et al., 2021, see reference below). The utility of this manuscript could be much clearer by focusing on ME and operationalizing what it is, how it is used, and what the known barriers are in the introduction. Second, the methods description is not sufficiently rigorous for a qualitative study of high quality. For example, it is difficult to assess the quality of the recruitment process from what is described here. Was a single email sent? How does this differ from a sample of convenience? How were additional recruits vetted from the snowball sampling? In addition, a serious limitation is the lack of description of qualitative methods outside of the TDF coding scheme. What is meant by inductive and deductive coding and how did this arise from pre-planned coding schemes versus being developed as themes arose? How was quality maintained on the coding itself (e.g., common methods are consensus coding, use of training for inter-rater reliability, etc.).
This reviewer is unclear what was meant by the three criteria described in paragraph 3 on page 6 – why these criteria, how and when were they applied, etc. Without further description, it is impossible for this reviewer to assess the quality of the study, nor to assess the interpretation of findings. In other words, there is little to no description of the analytic approach and methods. de Jong, K., Conijn, J. M., Gallagher, R. A. V., Reshetnikova, A. S., Heij, M., & Lutz, M. C. (2021). Using progress feedback to improve outcomes and reduce drop-out, treatment duration, and deterioration: A multilevel meta-analysis. Clinical Psychology Review (85), 102002. https://doi.org/10.1016/j.cpr.2021.102002. 6. Do you want your identity to be public for this peer review? Reviewer #1: Yes: Matthew Cooper; Reviewer #2: No.
15 May 2022. We thank reviewers 1 and 2 for taking the time to review the manuscript and for the constructive feedback they have provided; we believe addressing it will strengthen the paper. We have detailed how we have addressed each piece of feedback in the attached document ‘Response to Reviewers’ (submitted filename: Response to Reviewers.docx). 22 Jun 2022. ‘More than just numbers on a page?’ A qualitative exploration of the use of data collection and feedback in youth mental health services. PONE-D-22-06227R1. Dear Dr. Hamilton, We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.
Kind regards, Jason Scott, Academic Editor, PLOS ONE. Reviewers' comments: Reviewer's Responses to Questions. 1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here. Reviewer #1: All comments have been addressed; Reviewer #2: All comments have been addressed. 2. Is the manuscript technically sound, and do the data support the conclusions? Reviewer #1: Yes; Reviewer #2: Yes. 3. Has the statistical analysis been performed appropriately and rigorously? Reviewer #1: N/A; Reviewer #2: Yes. 4. Have the authors made all data underlying the findings in their manuscript fully available?
Reviewer #1: Yes; Reviewer #2: Yes. 5. Is the manuscript presented in an intelligible fashion and written in standard English? Reviewer #1: Yes; Reviewer #2: Yes. 6. Review Comments to the Author. Reviewer #1: Clear and easily understood. Thank you for addressing comments and presenting this insightful work. Reviewer #2: With the new tables and figures, expanded detail in the methods section, and the clarification of focus on ME and not routine outcome monitoring (ROM) nor audit and feedback, the authors have thoroughly addressed this reviewer's comments. If interested, it is recommended that the authors update the seminal reference for ROM to the most recent meta-analysis (reference provided again below) rather than just a 2019 systematic review. de Jong, K., Conijn, J. M., Gallagher, R. A. V., Reshetnikova, A. S., Heij, M., & Lutz, M. C. (2021). Using progress feedback to improve outcomes and reduce drop-out, treatment duration, and deterioration: A multilevel meta-analysis. Clinical Psychology Review (85), 102002. https://doi.org/10.1016/j.cpr.2021.102002. 7. Do you want your identity to be public for this peer review?
Reviewer #1: Yes: Matthew Cooper; Reviewer #2: Yes: Susan Douglas. 27 Jun 2022. PONE-D-22-06227R1. ‘More than just numbers on a page?’ A qualitative exploration of the use of data collection and feedback in youth mental health services. Dear Dr. Hamilton: I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department. Kind regards, PLOS ONE Editorial Office Staff, on behalf of Dr. Jason Scott, Academic Editor, PLOS ONE.
  47 in total

1.  Transforming youth mental health services and supports in Ireland.

Authors:  Robert J Illback; Tony Bates
Journal:  Early Interv Psychiatry       Date:  2011-02       Impact factor: 2.732

2.  Implementing Measurement-Based Care in Behavioral Health: A Review.

Authors:  Cara C Lewis; Meredith Boyd; Ajeng Puspitasari; Elena Navarro; Jacqueline Howard; Hannah Kassab; Mira Hoffman; Kelli Scott; Aaron Lyon; Susan Douglas; Greg Simon; Kurt Kroenke
Journal:  JAMA Psychiatry       Date:  2019-03-01       Impact factor: 21.596

3.  Creating headspace for integrated youth mental health care.

Authors:  Patrick McGorry; Jason Trethowan; Debra Rickwood
Journal:  World Psychiatry       Date:  2019-06       Impact factor: 49.548

4.  Validation of the theoretical domains framework for use in behaviour change and implementation research.

Authors:  James Cane; Denise O'Connor; Susan Michie
Journal:  Implement Sci       Date:  2012-04-24       Impact factor: 7.327

5.  Learning from a Learning Collaboration: The CORC Approach to Combining Research, Evaluation and Practice in Child Mental Health.

Authors:  Isobel Fleming; Melanie Jones; Jenna Bradley; Miranda Wolpert
Journal:  Adm Policy Ment Health       Date:  2016-05

6.  Australia's innovation in youth mental health care: The headspace centre model.

Authors:  Debra Rickwood; Marie Paraskakis; Diana Quin; Nathan Hobbs; Vikki Ryall; Jason Trethowan; Patrick McGorry
Journal:  Early Interv Psychiatry       Date:  2018-10-12       Impact factor: 2.732

7.  Goal setting improves retention in youth mental health: a cross-sectional analysis.

Authors:  Alice J Cairns; David J Kavanagh; Frances Dark; Steven M McPhail
Journal:  Child Adolesc Psychiatry Ment Health       Date:  2019-07-09       Impact factor: 3.033

8.  Revitalising audit and feedback to improve patient care.

Authors:  Robbie Foy; Mirek Skrypak; Sarah Alderson; Noah Michael Ivers; Bren McInerney; Jill Stoddart; Jane Ingham; Danny Keenan
Journal:  BMJ       Date:  2020-02-27

9.  Uses and abuses of patient reported outcome measures (PROMs): potential iatrogenic impact of PROMs implementation and how it can be mitigated.

Authors:  Miranda Wolpert
Journal:  Adm Policy Ment Health       Date:  2014-03

10.  Implementing child and youth mental health services: early lessons from the Australian Primary Health Network Lead Site Project.

Authors:  Sanne Oostermeijer; Bridget Bassilios; Angela Nicholas; Michelle Williamson; Anna Machlin; Meredith Harris; Philip Burgess; Jane Pirkis
Journal:  Int J Ment Health Syst       Date:  2021-02-23
