
The critical factor: The role of quality in the performance of supported accommodation services for complex mental illness in England.

Nerea Almeda1, Carlos Ramón García-Alonso2, Helen Killaspy3, Mencía R Gutiérrez-Colosía1, Luis Salvador-Carulla4.   

Abstract

Rehabilitation services play a key role in ensuring integrated and comprehensive mental health (MH) care in the community for people with long-term and severe mental disorders. MH-supported accommodation services aim to promote service users' autonomy and independence. Given the complexity of MH-supported accommodation services in England, a comparative evaluation of critical performance indicators, including service provision and quality of care, is needed to design evidence-informed policies. This study aims to explore the influence of service quality indicators on the performance of MH-supported accommodation services in England. The analysed sample comprises supported accommodation services from 14 nationally representative local authorities in England from the QuEST study, grouped into three main types of care: residential care homes (divided into two subgroups: move-on and non-move-on oriented), supported housing and floating outreach. EDeS-MH (Efficient Decision Support-Mental Health) was used to assess the performance indicators for the selected services by combining a Monte Carlo simulation engine, data envelopment analysis and a fuzzy inference engine for integrating expert knowledge. Depending on the type of care, six or seven quality domains were sequentially added after a baseline (purely technical) scenario was analysed. Relative technical efficiency scores for the baseline scenarios revealed high performance in all the selected supported accommodation services, but statistical variability was high. Quality domains significantly improved performance in every type of care, so their inclusion has a positive impact on the global performance of each type of care. Remaining in a service for longer than the expected two years has a negative impact on performance.
These findings can be considered from a planning perspective to facilitate the design of care pathways with more realistic expectations about gaining autonomy within two years.

Year:  2022        PMID: 35298512      PMCID: PMC8929565          DOI: 10.1371/journal.pone.0265319

Source DB:  PubMed          Journal:  PLoS One        ISSN: 1932-6203            Impact factor:   3.240


Introduction

The Hospital Plan of 1962 established England as a pioneer in the transition of psychiatric services from largely hospital-based settings to community-based mental health (MH) care [1]. Supported accommodation services are a key component of the ‘theoretical whole system care pathway’ that provides support in the community for people with more complex MH needs [2]. Ideally, these services are organized into a ‘care pathway’ with the expectation that people graduate from higher to lower levels of support over time. The QuEST (Quality and Effectiveness of Supported Tenancies for People with Mental Health Problems) study analysed supported accommodation services across England in 2014. It identified three main types of services: residential care homes, supported housing and floating outreach, and investigated their quality and effectiveness. Residential care homes cater to those with the highest needs; they are communal facilities staffed 24 hours a day that provide meals, medication supervision, cleaning, etc., and placements are not time-limited. Supported housing services provide shared or individual, self-contained, time-limited tenancies with staff based on-site up to 24 hours a day to help residents gain the skills to move on to less supported accommodation. Finally, floating outreach services provide visiting support of a few hours per week to people living in permanent, self-contained, individual tenancies, with the aim of reducing support over time to zero. Users of residential care and supported housing are more likely to have a diagnosis of schizophrenia or other psychosis, whereas users of floating outreach are more likely to have a diagnosis of a common mental disorder; however, the level of needs of people using supported housing and floating outreach is similar [3,4].
Using multivariate statistical techniques, the QuEST study found that the quality of care was higher in MH-supported housing than in residential care or floating outreach services. To increase its applicability and transferability, this analysis should be complemented with standard information on service availability and with modelling based on system dynamics and complexity [5]. The standard performance assessment of any set of comparable services studies their input consumption and output production (the outputs being directly related to the inputs consumed). These inputs/outputs are mainly technical, and ultimately the best service performance corresponds to the most appropriate balance between the available inputs and the produced outputs. Researchers and decision-makers can seek to reduce (minimize) the amount of inputs for a given amount of outputs (input orientation) or, vice versa, to increase (maximize) the output production for a given amount of inputs (output orientation). Incorporating quality variables into performance assessments (represented by six or seven quality domains in this study) is always complicated because these variables are subjective (quality as perceived by users, managers, etc.). To our knowledge, this study is the first to include quality domains rated by the managers of the selected MH services in a performance analysis [6]. Independent of the technical performance of the services (the baseline scenario, analysed without quality domains), the main research objective is to assess the impact on service performance when quality domains are incorporated into the analysis. Considering that MH service managers have a fixed amount of inputs and always try to obtain the best possible results, a neutral or positive relationship between managerial processes (input and output management) and managers' quality perceptions of performance is expected.
Relative technical efficiency (RTE) is a decision support measure that can be used to guide evidence-informed health policy-making, mainly to improve resource allocation [7,8]. RTE assesses the relationship between the amounts of inputs consumed and outputs produced by a set of comparable decision-making units (DMUs) [9]. RTE can be regarded as a synthetic meta-indicator that facilitates monitoring the evolution of a system and the dynamic relationships or connections across different performance indicators [10,11]. It has been used to identify tailor-made improvement strategies for health care ecosystems such as the provision and resourcing of addiction treatment clinics [12,13], residential MH facilities [14], homes for people with mental disability [15], clinics for children and youth [16] and community-based youth services [17]. The RTEs of primary care and MH ecosystems have been systematically assessed in the Basque Country (Spain) [5,18-20]. Data envelopment analysis (DEA) has been widely used to assess the RTE of health services [6,18,21]. This group of nonparametric techniques is robust and flexible because it does not require any preliminary assumption about the statistical structure of the variables, implying that variables (inputs/outputs) of different origins and types (e.g., the number of beds, technical inputs, the quality of care, manager perceptions) can be analysed at the same time. Quality variables have been included in DEA to assess the RTE of systems operating in different socioeconomic contexts (e.g., hospital care, schools or the banking industry) [22-25]. DEA can be embedded in a Monte Carlo simulation engine to incorporate uncertainty and randomness in data values (all of which are treated as statistical distributions) and to design more realistic models [5]. As DEA is a purely operational model, it is completely ‘blind’: variable values must be interpreted according to existing expert knowledge (usually a theoretical paradigm).
A fuzzy inference engine allows us to operationalize the formalization of the balanced care model following the Expert-based Collaborative Analysis (EbCA) methodology [5]. This engine interprets variable values before RTE is calculated. This study aims to assess the impact of quality indicators (from a managerial perspective) on the performance (RTE) of selected MH-supported accommodation services in the English pathway of care. This objective includes the formalization of specific quality domains into variables (rates), their integration into the RTE assessment, and a comparative analysis of the impact of quality variables on ecosystem performance to support decision-making and investment by providing relevant information for service managers to inform practice and service planning. Accordingly, this paper first presents a description of the ecosystem under study (a representative sample of supported accommodation services in England). Then, the selected variables are described and grouped into scenarios that highlight different perspectives on the ecosystem's situation. Finally, the methodology used to assess ecosystem performance (including quality domains) is briefly described.

Methods

Setting

Data for supported accommodation services from 14 nationally representative local authorities in England were collected for the QuEST study [4]. Face-to-face interviews were conducted with service managers, key staff, and service users to assess the quality and characteristics of the services and those using them. The Quality Indicator for Rehabilitative Care–Supported Accommodation (QuIRC-SA) was completed with service managers. This standardized tool assesses service quality in seven domains: living environment, therapeutic environment, treatments and interventions, self-management and autonomy, social interface, human rights and recovery-based practice [26]. Data on the service’s annual budget, weekly cost per user and service resources were also provided by the service managers to complement standard service costs for estimation of the cost-effectiveness of services [4]. The dataset is available at the Dryad digital repository (https://doi.org/10.5061/dryad.j0zpc86dz).

Scenarios (variable grouping)

Measurement units, usually expressed as rates, for each variable were discussed by an expert group comprising senior clinicians, policy-makers, providers and researchers with expertise in MH-supported accommodation. Specifically, the experts involved were a rehabilitation psychiatrist who works with people living in MH-supported accommodation and is an international leader in this field, a psychiatrist with national-level expertise in policy pertaining to people with severe MH problems, two national leaders in MH-supported accommodation service provision and policy, and three researchers who collected data from 87 supported accommodation services across England during the QuEST study. Based on the existing evidence on evaluating the performance of MH-supported accommodation services [26-28], the relevant inputs were the service budget (£ per place), places (number) and full-time equivalent staff (professionals per service user), and the relevant outputs were the average length of stay (years), occupied beds (%), the number of service users who moved on to more independent accommodation (users per place) and the seven QuIRC-SA quality of care variables adjusted for service size (domain score × available places/100). This last transformation makes the quality indicators account for the size of the service: an indicator value of 90 does not have the same meaning for a service with 10 places/beds as for a service with 100 places/beds. All the transformations of the original data render the selected services comparable by eliminating a potential “size” effect on the performance assessments. Eight different scenarios, or input/output variable combinations, were designed to assess the RTE of residential care and supported housing services (Table 1). For floating outreach services, only seven scenarios were identified because this type of care does not include the “living environment” QuIRC-SA service quality domain.
Scenario 1 can be considered the “reference” scenario because it does not include any quality domains.
Table 1

Descriptions of the scenarios for the RTE assessment of MH residential care, supported housing and floating outreach services.

MH supported accommodation services | Scenarios | Variables
Residential care and supported housing services | Scenario 1 (Baseline) | Inputs: N° of available beds or places, N° of available staff per service user, Annual budget per bed/place. Outputs: Years of stay in each service, Occupied beds/places (%), N° of service users who moved to a more independent accommodation per bed/place
 | Scenario 2 | Baseline variables + QuIRC-SA living environment domain score
 | Scenario 3 | Baseline variables + QuIRC-SA therapeutic environment domain score
 | Scenario 4 | Baseline variables + QuIRC-SA self-management and autonomy domain score
 | Scenario 5 | Baseline variables + QuIRC-SA social interface domain score
 | Scenario 6 | Baseline variables + QuIRC-SA human rights domain score
 | Scenario 7 | Baseline variables + QuIRC-SA treatments and interventions domain score
 | Scenario 8 | Baseline variables + QuIRC-SA recovery-based practice domain score
Floating outreach services | Scenario 1 (Baseline) | Inputs: N° of available places, N° of available staff per service user, Annual budget per place. Outputs: Years of stay in each service, Occupied places (%), N° of service users who have moved from the service to another with greater independence per place.
 | Scenario 2 | Baseline variables + therapeutic environment domain score
 | Scenario 3 | Baseline variables + self-management and autonomy domain score
 | Scenario 4 | Baseline variables + social interface domain score
 | Scenario 5 | Baseline variables + human rights domain score
 | Scenario 6 | Baseline variables + treatments and interventions domain score
 | Scenario 7 | Baseline variables + recovery-based practice domain score
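The size adjustment applied to the QuIRC-SA domain scores can be sketched as follows. This is a minimal illustration of the transformation described above (domain score × available places/100); the function name is ours, not part of the study's tooling:

```python
def size_adjusted_quality(domain_score: float, places: float) -> float:
    """Scale a QuIRC-SA domain score (0-100) by service size:
    adjusted = domain score x available places / 100."""
    return domain_score * places / 100

# The same raw score carries a different weight at different service sizes:
print(size_adjusted_quality(90, 10))   # 9.0 for a 10-place service
print(size_adjusted_quality(90, 100))  # 90.0 for a 100-place service
```

This is what makes a score of 90 comparable across services of very different sizes before the efficiency analysis.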

Decision support system

An adaptation of the hybrid decision support system (DSS) EDeS-MH (Efficient Decision Support-Mental Health) was used to assess the performance of the services [19]. This computer-based tool combines a Monte Carlo simulation engine, DEA and a fuzzy inference engine. The simulation engine was developed to address the uncertainty (data imprecision and vagueness) and randomness (unexpected facts) of real environments and to artificially multiply the number of observations [29]. The inner uncertainty of any ecosystem can be handled by transforming original data values into statistical distributions (from standard datasets to statistical distribution bases). In each simulation, the Monte Carlo engine analyses a new dataset selected at random. The statistical analysis of the final results (the process stops when the statistical error of the mean is lower than 2.5%) includes a sensitivity analysis of the ecosystem under study. The results (RTE scores) for each DMU and scenario are statistical distributions that can be analysed with standard summary statistics or with less standard measures such as stability and entropy [19,21]. The characteristics of these statistical distributions represent the potential reaction of the DMU to data changes. DEA [9] was selected to evaluate the RTE of MH-supported accommodation services. This nonparametric technique has been widely used to assess health service performance [30] and, to a lesser extent, to evaluate MH services [6]. The standard input-oriented DEA model is a linear program whose envelopment form can be written as

\[
\begin{aligned}
\min\ & \theta - \varepsilon \Big( \sum_{h=1}^{i} s_h^{-} + \sum_{r=1}^{j} s_r^{+} \Big) \\
\text{s.t.}\ & \sum_{k=1}^{d} \lambda_k x_{hk} + s_h^{-} = \theta\, x_{hm}, \quad h = 1, \dots, i, \\
& \sum_{k=1}^{d} \lambda_k y_{rk} - s_r^{+} = y_{rm}, \quad r = 1, \dots, j, \\
& \lambda_k,\ s_h^{-},\ s_r^{+} \ge 0,
\end{aligned}
\]

where d is the number of DMUs, i the number of inputs and j the number of outputs. The DMU m under evaluation consumes x_{hm} of input h and produces y_{rm} of output r; θ is the efficiency score, and s_h^{-} and s_r^{+} are the slacks [21].
In this research, the variable-returns-to-scale DEA model [31] was selected because, when studying MH services, real output variations cannot be considered proportional to the corresponding input modifications [32], and constant returns to scale would impose a proportionality that cannot be considered realistic. The input-oriented DEA model was applied to assess whether service input consumption can be reduced while assuming a constant output level [9], which is especially relevant for decision-makers who must allocate finite resources to meet population needs. Output maximization (output-oriented DEA) is especially difficult and sometimes not advisable when supported accommodation services are assessed (for example, when the system artificially tries to maximize the number of users who move on to a service with greater independence, the result can be mathematically correct but makes no sense from a health care perspective). Finally, no weight control of the variables was considered. Because DEA was run without weight control of the inputs/outputs, a two-step process was followed to verify the impact of the quality variables. First, RTE was assessed without them, which yields the ‘operational’ baseline results. Second, the quality variables were included as outputs to investigate the influence of quality on performance. Finally, when the number of observations is low (here, the number of move-on residential services is especially low), DEA cannot be sufficiently discriminative (the methodology tends to classify all DMUs as efficient). However, as explained above, the uncertainty analysis (Monte Carlo simulation engine) multiplies the number of observations by the number of selected simulations, which overcomes this DEA drawback. Prior to the RTE assessment, variable (input and output) data values must be interpreted according to a preselected paradigm; otherwise, performance results can be biased.
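The input-oriented, variable-returns-to-scale DEA model described above can be illustrated with a short SciPy sketch. This is a minimal envelopment-form implementation for illustration only; it omits the slack terms and the Monte Carlo and fuzzy layers of EDeS-MH, and the function name is ours:

```python
import numpy as np
from scipy.optimize import linprog

def dea_vrs_input(X, Y):
    """Input-oriented, variable-returns-to-scale DEA.
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs).
    Returns one efficiency score theta in (0, 1] per DMU."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # decision vector: [theta, lambda_1 .. lambda_n]
        c = np.r_[1.0, np.zeros(n)]                       # minimize theta
        # inputs:  sum_k lambda_k * x_hk - theta * x_ho <= 0
        A_in = np.hstack([-X[o].reshape(-1, 1), X.T])
        # outputs: -sum_k lambda_k * y_rk <= -y_ro
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(m), -Y[o]]
        # VRS convexity constraint: sum_k lambda_k = 1
        A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# Toy example: DMU 0 (input 2, output 1) dominates DMU 1 (input 4, output 1),
# so DMU 1 could proportionally reduce its input to half.
print(dea_vrs_input(np.array([[2.0], [4.0]]), np.array([[1.0], [1.0]])))
```

With real data, each row of X would hold a service's inputs (places, staff per user, budget per place) and each row of Y its outputs, interpreted values included.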
The fuzzy inference engine carries out this process automatically through an expert-based rule base (IF … THEN …) built according to the Balanced Care Model developed by Thornicroft and Tansella [33] and the pathway of care of MH-supported accommodation services [34]. This paradigm was used to define the range of adequacy for each variable. For example, the values of the service budget (£ per place), an input, in floating outreach services were considered adequate within the range [5000, 6000]. Within this range, a greater value corresponds to better competence (it is mathematically transformed so that it can be interpreted by the analytical procedure). Outside this range, the variable value is penalized by multiplying it by a parameter (in this specific case, by 2) because it is considered less adequate than a value inside the range. The specific references for data value interpretation were defined by a panel of experts according to the selected paradigm and their expertise. This process followed the EbCA model [5], in which an iterative sequence of expert-based reviews culminates in a consensus. Once the references for interpreting variable values are defined, EDeS-MH automatically runs a mathematical transformation based on an equation (linear monotone transformation) or a fuzzy operator (product-sum gravity method) to obtain the “transformed” value [21]. These transformed values are then analysed by DEA to determine the corresponding RTE scores (statistical distributions). To evaluate the statistical significance of the differences between the baseline (without quality variables, scenario 1) and the rest of the designed scenarios (corresponding to each quality domain, scenarios 2 to 7/8), the nonparametric Wilcoxon signed-rank test was used.
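The adequacy-range penalization can be sketched with a crisp (non-fuzzy) simplification. The actual engine uses fuzzy membership functions and the product-sum gravity method; the function below is only our simplified reading of the rule described above, using the budget example from the text:

```python
def interpret_input(value: float, low: float, high: float,
                    penalty: float = 2.0) -> float:
    """Crisp sketch of expert-based input interpretation: values inside
    the adequacy range pass through unchanged; values outside it are
    multiplied by a penalty parameter (2 in the example from the text),
    which inflates the input and so worsens the efficiency score."""
    if low <= value <= high:
        return value
    return value * penalty

# Floating outreach budget per place, adequate within [5000, 6000]:
print(interpret_input(5500, 5000, 6000))  # inside the range -> unchanged
print(interpret_input(7000, 5000, 6000))  # outside -> multiplied by the penalty
```

The fuzzy version replaces the hard range boundaries with gradual membership, so values just outside the range are penalized less than values far outside it.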

Procedure

First, the types of services were coded according to the DESDE-LTC (Description and Evaluation of Services and DirectoriEs for Long Term Care) [35,36], an international service classification system that facilitates comparisons across different jurisdictions and studies and that was used to describe service availability in our previous RTE studies using EDeS-MH. Second, expert-based collaborative analysis (EbCA) [5] was used to formalize expert knowledge on MH-supported accommodation services into a knowledge base structured by rules. This knowledge base facilitates the interpretation of variable values according to the selected paradigm. EbCA has previously been used to evaluate the RTE of MH areas in the Basque Country [19]. The EbCA panel of experts on MH-supported accommodation services in England included persons in charge of the QuEST study, MH planners, MH managers and academic researchers (see the Scenarios section for details). The interaction among these agents in the meetings was considered crucial for making their implicit knowledge explicit: EbCA is an iterative procedure. The fuzzy inference engine interpreted variable values using the resulting ranges of adequacy [19], defined according to the selected paradigm. Services considered outliers with respect to the three main supported accommodation service groups were also identified by the EbCA panel. For example, some of the residential care services were oriented towards helping people move on to more independent accommodation, whereas others worked with people who were likely to need high levels of support in the long term. Missing values (very few) were statistically imputed using a Monte Carlo simulation procedure based on the real statistical distribution of the corresponding variable. Original data were randomized using symmetric triangular statistical distributions (5% variation on each side of the corresponding original value). This range covers feasible data variations (imprecision and vagueness).
No critical stress on the ecosystem was included in the analysis (randomness). This procedure covers both the data variations corresponding to ecosystem evolution (population, user mobility, etc.) and the effect of time. The Monte Carlo simulation engine selects a specific data value from the dataset in each computer run. Then, the fuzzy inference engine interprets these values according to the paradigm. The RTE performance scores obtained for each MH service are statistical distributions that can be analysed accordingly. RTE scores always lie in the [0, 1] range. If RTE = 0, the service is completely inefficient; if RTE = 1, the service is efficient; and if the RTE score lies in (0, 1), the service is inefficient (a greater RTE score corresponds to higher service efficiency). For example, if one service has an RTE of 0.45 and another of 0.98, both are inefficient, but the second is very close to efficiency.
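The data randomization step can be sketched with NumPy: each original value is replaced by a draw from a symmetric triangular distribution with ±5% limits, and simulations accumulate until the statistical error of the mean falls below the stopping threshold. For simplicity this sketch applies the 2.5% rule to a single input; in the study the rule applies to the simulated RTE means, and the variable names here are ours:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def randomise(value: float, spread: float = 0.05) -> float:
    """One Monte Carlo draw from a symmetric triangular distribution
    centred on the original value with +/-5% limits."""
    return rng.triangular(value * (1 - spread), value, value * (1 + spread))

# Accumulate simulated values for one input (the mean supported housing
# budget from Table 2) until the relative error of the mean is below 2.5%:
budget = 334_635.12
draws = []
while len(draws) < 30 or (np.std(draws, ddof=1) / np.sqrt(len(draws))
                          / np.mean(draws)) >= 0.025:
    draws.append(randomise(budget))

# Every draw stays inside the +/-5% feasibility range:
assert all(0.95 * budget <= d <= 1.05 * budget for d in draws)
```

Because the triangular distribution is bounded, the simulated datasets never stray outside the expert-defined ±5% band of feasible variation.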

Results

Basic statistics

Nine move-on residential services, 13 non-move-on residential services, 34 supported housing services and 30 floating outreach services were finally analysed because they had complete quality datasets. Five hundred simulations were run by the DSS. Basic statistics for the original data are shown in Table 2. The results show that the variability among services within each type of care is very high.
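The Wilcoxon signed-rank comparison described in the Methods can be illustrated on the per-service averages reported later, for example by pairing the baseline and scenario 2 columns of Table 3 (move-on residential services). Note that the study's p < 0.001 results come from the full simulated RTE distributions; this is only a table-level sketch:

```python
from scipy.stats import wilcoxon

# Per-service average RTE scores from Table 3 (move-on residential care):
baseline   = [0.8087, 0.9050, 0.7608, 0.5199, 0.8052,
              0.5039, 0.4834, 0.9402, 0.5372]  # scenario 1
scenario_2 = [0.8195, 0.9269, 0.7885, 0.9356, 0.9534,
              0.8998, 0.9241, 0.9518, 0.5738]  # baseline + quality domain

# Paired nonparametric test of the quality domain's effect on RTE:
stat, p = wilcoxon(baseline, scenario_2)
print(f"W = {stat}, p = {p:.4f}")  # every service improves, so p is small
```

Because every one of the nine services improves, the signed-rank statistic is 0 and the exact two-sided p-value is well below 0.05.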
Table 2

Basic statistics for the variables used in the RTE assessment.

Type of supported accommodation | Basic statistics | Places | Total full time-equivalent professionals | Annual budget (£) | Length of stay (years) | Occupied places | Number of service users who have moved to a more independent accommodation over the last 2 years
Residential care (non-move on oriented) | Mean | 21.26 | 0.66 | 500,623.10 | 10 | 19.68 | 0.58
 | Standard deviation | 7.12 | 0.31 | 167,559.51 | 5.42 | 7.46 | 0.69
 | Variation coefficient (%) | 33.47 | 46.41 | 33.47 | 54.25 | 37.91 | 119.61
 | Minimum | 9 | 0.34 | 211,897.40 | 4 | 8 | 0
 | Maximum | 40 | 1.66 | 941,766.23 | 20 | 37 | 2
Residential care (move on oriented) | Mean | 15.67 | 0.83 | 386,467.36 | 3.13 | 11.56 | 6.22
 | Standard deviation | 7.52 | 0.56 | 285,133.49 | 1.55 | 5.41 | 3.46
 | Variation coefficient (%) | 47.98 | 67.60 | 47.98 | 49.68 | 46.83 | 55.54
 | Minimum | 7 | 0.46 | 265,535.16 | 2 | 7 | 2
 | Maximum | 27 | 2.25 | 1,024,207.06 | 6 | 23 | 12
Supported housing | Mean | 10.99 | 0.45 | 334,635.12 | 3.24 | 10.27 | 5.63
 | Standard deviation | 5.11 | 0.27 | 155,682.08 | 2.97 | 5.15 | 6.84
 | Variation coefficient (%) | 46.52 | 59.47 | 46.52 | 91.76 | 50.14 | 121.55
 | Minimum | 3 | 0.10 | 91,363.00 | 1 | 1 | 0
 | Maximum | 28 | 1.61 | 852,721.33 | 20 | 28 | 40
Floating outreach | Mean | 29.97 | 0.17 | 171,950.08 | 2.83 | 28.89 | 13
 | Standard deviation | 22.90 | 0.17 | 131,354.68 | 2.16 | 23.02 | 16.49
 | Variation coefficient (%) | 76.39 | 103.51 | 76.39 | 76.45 | 79.69 | 126.85
 | Minimum | 5 | 0.03 | 28,685.67 | 1 | 4 | 0
 | Maximum | 80 | 0.97 | 458,970.67 | 9 | 67 | 5

Relative technical efficiency

The representativeness of the variables (inputs/outputs) for the RTE assessment was confirmed by the EbCA panel. Therefore, they can be considered indicators for evidence-informed decision-making. Quality domains (assessed by the QuIRC-SA [26]) were considered outcomes for the performance assessments. Only one service was considered an outlier and removed from the analysis. In addition, the analytical process of setting the measurement units for the variables highlighted the existence of two different groups in the residential care services dataset: move-on and non-move-on oriented services. For the move-on-oriented MH residential care services, the average RTE ranged from 0.70, the worst (scenario 1, baseline), to 0.86, the best (scenario 2) (Table 3). In the baseline scenario, the highest performance (0.94) was achieved by service 22, while service 18 had the lowest score (0.5). The performance of this type of care is therefore very heterogeneous. When quality variables were included as outputs in scenarios 2-8, the RTE mean (always greater than 0.84) and the performance of each service increased significantly (p < 0.001) compared to the baseline. Services 4 and 26 had the worst performance.
Table 3

Average relative technical efficiency scores for MH residential care services (move-on oriented).

Darker shading corresponds to lower RTE scores (less efficient scenarios and services).

Services | Scenario 1 | Scenario 2 | Scenario 3 | Scenario 4 | Scenario 5 | Scenario 6 | Scenario 7 | Scenario 8
1 | 0.8087 | 0.8195 | 0.8036 | 0.8154 | 0.8293 | 0.8268 | 0.8181 | 0.8113
2 | 0.9050 | 0.9269 | 0.9388 | 0.9407 | 0.9460 | 0.9098 | 0.9393 | 0.9175
4 | 0.7608 | 0.7885 | 0.6651 | 0.6885 | 0.5773 | 0.7377 | 0.6084 | 0.6706
13 | 0.5199 | 0.9356 | 0.9519 | 0.9381 | 0.9446 | 0.9412 | 0.9377 | 0.9431
17 | 0.8052 | 0.9534 | 0.9520 | 0.9578 | 0.9615 | 0.9542 | 0.9660 | 0.9603
18 | 0.5039 | 0.8998 | 0.9342 | 0.9280 | 0.9327 | 0.9415 | 0.9246 | 0.9052
21 | 0.4834 | 0.9241 | 0.9271 | 0.9268 | 0.9246 | 0.9172 | 0.9188 | 0.9273
22 | 0.9402 | 0.9518 | 0.9467 | 0.9439 | 0.9484 | 0.9555 | 0.9511 | 0.9462
26 | 0.5372 | 0.5738 | 0.5559 | 0.5721 | 0.5106 | 0.5569 | 0.5709 | 0.5801
Global average | 0.6960 | 0.8637 | 0.8528 | 0.8568 | 0.8417 | 0.8601 | 0.8483 | 0.8513

In the non-move-on-oriented MH residential care services, the average RTE was 0.79, with service 90 (0.96) having the best and services 61 and 81 the worst RTE scores (Table 4). Again, including quality variables had a statistically significant (p < 0.001) positive impact (performance increased), but the heterogeneity remained high. Scenarios 7 (treatments and interventions), 8 (recovery-based practice) and 4 (self-management and autonomy) were associated with the highest increases in performance.
Table 4

Average relative technical efficiency scores for MH residential care services (non-move-on oriented).

Darker shading corresponds to lower RTE scores (less efficient scenarios and services).

Services | Scenario 1 | Scenario 2 | Scenario 3 | Scenario 4 | Scenario 5 | Scenario 6 | Scenario 7 | Scenario 8
96 | 0.6611 | 0.8188 | 0.8403 | 0.7780 | 0.8295 | 0.7195 | 0.7599 | 0.8764
108 | 0.9343 | 0.9377 | 0.9435 | 0.9421 | 0.9337 | 0.9335 | 0.9427 | 0.9434
90 | 0.6770 | 0.9637 | 0.9662 | 0.9688 | 0.9615 | 0.9627 | 0.9676 | 0.9665
61 | 0.3127 | 0.3361 | 0.3566 | 0.3925 | 0.2858 | 0.3952 | 0.3829 | 0.3804
124 | 0.6489 | 0.9194 | 0.8466 | 0.8942 | 0.8690 | 0.8832 | 0.8630 | 0.8874
91 | 0.6272 | 0.7421 | 0.7368 | 0.6887 | 0.7166 | 0.7345 | 0.8091 | 0.8006
14 | 0.6871 | 0.8825 | 0.8688 | 0.8693 | 0.9224 | 0.8660 | 0.8749 | 0.8558
12 | 0.8610 | 0.8400 | 0.8734 | 0.8831 | 0.8248 | 0.8625 | 0.9091 | 0.8815
81 | 0.3878 | 0.4752 | 0.4269 | 0.5140 | 0.3965 | 0.4560 | 0.4863 | 0.4652
79 | 0.7367 | 0.8377 | 0.8688 | 0.9107 | 0.8321 | 0.7940 | 0.9261 | 0.8709
80 | 0.8279 | 0.8574 | 0.8747 | 0.8897 | 0.8101 | 0.8540 | 0.9157 | 0.8788
88 | 0.8342 | 0.9197 | 0.9167 | 0.9274 | 0.8884 | 0.8901 | 0.9026 | 0.9378
83 | 0.7272 | 0.7334 | 0.7966 | 0.8355 | 0.7680 | 0.8161 | 0.8240 | 0.7830
Global average | 0.6659 | 0.7895 | 0.7935 | 0.8072 | 0.7722 | 0.7821 | 0.8126 | 0.8098

For MH-supported housing services, the average RTE ranged from 0.54 to 0.64 (Table 5). In the baseline model, the average RTE was 0.54, with service 86 (0.89) performing the best and service 89 (0.33) the worst; the addition of the quality variables increased the average RTE, and these differences were significant (p < 0.001). Here, scenario 7 (treatments and interventions) showed the lowest increase. Services 104, 4, 114, 70, 84, 19, 101, 113, 60, 52, 89 and 95 had an average RTE lower than 0.5. The performance variability was very high in all scenarios.
Table 5

Average relative technical efficiency scores for MH-supported housing services.

Darker shading means lower RTE scores (less efficient scenarios and services).

Services | Scenario 1 | Scenario 2 | Scenario 3 | Scenario 4 | Scenario 5 | Scenario 6 | Scenario 7 | Scenario 8
86 | 0.8926 | 0.3546 | 0.3274 | 0.3383 | 0.3375 | 0.3401 | 0.3235 | 0.3344
82 | 0.6868 | 0.7440 | 0.7718 | 0.7749 | 0.7488 | 0.7490 | 0.6777 | 0.7860
31 | 0.6297 | 0.4945 | 0.5212 | 0.4823 | 0.4942 | 0.4835 | 0.4754 | 0.5159
104 | 0.4839 | 0.4183 | 0.4017 | 0.4190 | 0.4309 | 0.4500 | 0.3947 | 0.3995
112 | 0.5472 | 0.5537 | 0.5802 | 0.6076 | 0.6042 | 0.5489 | 0.5890 | 0.5804
4 | 0.3665 | 0.3419 | 0.3357 | 0.3519 | 0.3454 | 0.3478 | 0.3351 | 0.3453
116 | 0.5683 | 0.5304 | 0.5963 | 0.6253 | 0.5633 | 0.6085 | 0.4972 | 0.6324
111 | 0.5646 | 0.5977 | 0.6530 | 0.6680 | 0.7163 | 0.6524 | 0.5991 | 0.7021
123 | 0.4765 | 0.5371 | 0.5397 | 0.5406 | 0.5411 | 0.5352 | 0.5390 | 0.5365
114 | 0.3522 | 0.3338 | 0.3410 | 0.3496 | 0.3274 | 0.3330 | 0.3197 | 0.3264
22 | 0.6278 | 0.8948 | 0.7238 | 0.7906 | 0.9223 | 0.7703 | 0.6706 | 0.7024
58 | 0.5488 | 0.8771 | 0.8960 | 0.8708 | 0.9384 | 0.9351 | 0.8678 | 0.8854
18 | 0.5647 | 0.7974 | 0.8075 | 0.8235 | 0.8210 | 0.8570 | 0.8281 | 0.8437
70 | 0.3385 | 0.8997 | 0.8961 | 0.9064 | 0.8955 | 0.9008 | 0.9057 | 0.9011
84 | 0.5090 | 0.5922 | 0.5926 | 0.5994 | 0.5896 | 0.6003 | 0.5971 | 0.5930
103 | 0.3390 | 0.3570 | 0.3463 | 0.3432 | 0.3417 | 0.3386 | 0.3415 | 0.3543
15 | 0.7464 | 0.5751 | 0.5692 | 0.5784 | 0.5747 | 0.5754 | 0.5710 | 0.5765
19 | 0.3554 | 0.3591 | 0.3608 | 0.3605 | 0.3416 | 0.3457 | 0.3476 | 0.3568
101 | 0.3737 | 0.3815 | 0.4270 | 0.3735 | 0.4388 | 0.4223 | 0.3818 | 0.4112
113 | 0.3551 | 0.5565 | 0.6863 | 0.6480 | 0.7425 | 0.5846 | 0.6218 | 0.6705
110 | 0.5134 | 0.7842 | 0.9469 | 0.9396 | 0.9095 | 0.9333 | 0.9210 | 0.9491
115 | 0.5765 | 0.8830 | 0.9059 | 0.8638 | 0.9005 | 0.8686 | 0.8912 | 0.8708
98 | 0.7271 | 0.5683 | 0.5623 | 0.5593 | 0.5608 | 0.5742 | 0.5602 | 0.5630
60 | 0.3543 | 0.3404 | 0.3418 | 0.3293 | 0.3609 | 0.3507 | 0.3239 | 0.3380
52 | 0.3620 | 0.3613 | 0.3401 | 0.3405 | 0.3277 | 0.3329 | 0.3393 | 0.3304
41 | 0.9523 | 0.9323 | 0.9288 | 0.9333 | 0.9306 | 0.9342 | 0.9274 | 0.9309
105 | 0.5374 | 0.6729 | 0.7444 | 0.7426 | 0.8129 | 0.7595 | 0.7414 | 0.7572
27 | 0.5059 | 0.9600 | 0.9578 | 0.9605 | 0.9438 | 0.9536 | 0.9569 | 0.9580
46 | 0.6588 | 0.7817 | 0.8252 | 0.8612 | 0.9432 | 0.8276 | 0.6807 | 0.9087
93 | 0.9060 | 0.9194 | 0.9229 | 0.9191 | 0.9137 | 0.9186 | 0.9087 | 0.9165
89 | 0.3301 | 0.3428 | 0.3447 | 0.3450 | 0.3528 | 0.3407 | 0.3407 | 0.3189
95 | 0.3389 | 0.5607 | 0.5321 | 0.5548 | 0.5857 | 0.5249 | 0.5241 | 0.5724
32 | 0.6847 | 0.3679 | 0.3691 | 0.3706 | 0.3770 | 0.3696 | 0.3509 | 0.3719
97 | 0.4565 | 0.9577 | 0.9565 | 0.9326 | 0.9614 | 0.9633 | 0.9629 | 0.9424
Global average | 0.5362 | 0.6067 | 0.6192 | 0.6207 | 0.6352 | 0.6185 | 0.5974 | 0.6230

In MH floating outreach services, the average RTE was in the 0.5 to 0.65 range (Table 6). As with the other two types of supported accommodation, the baseline model showed the worst performance, and the inclusion of quality variables significantly increased both the global and the individual average service RTE (p < 0.001). Services 1, 63, 37 and 35 had very high performance. The heterogeneity of the sample was also very high.
Table 6

Average relative technical efficiency scores for MH floating outreach services.

Darker shading corresponds to lower RTE scores (less efficient scenarios and services).

Services | Scenario 1 | Scenario 2 | Scenario 3 | Scenario 4 | Scenario 5 | Scenario 6 | Scenario 7
92 | 0.5831 | 0.6022 | 0.6095 | 0.5960 | 0.6024 | 0.5912 | 0.6134
85 | 0.3387 | 0.3429 | 0.3562 | 0.3362 | 0.3235 | 0.3383 | 0.3371
102 | 0.4471 | 0.4961 | 0.4953 | 0.4641 | 0.4791 | 0.4808 | 0.4877
87 | 0.5476 | 0.7281 | 0.7072 | 0.6431 | 0.7068 | 0.7233 | 0.7127
107 | 0.5184 | 0.6184 | 0.6251 | 0.6001 | 0.5956 | 0.6455 | 0.6230
121 | 0.4610 | 0.7668 | 0.8374 | 0.9373 | 0.7397 | 0.7918 | 0.8087
122 | 0.5104 | 0.9619 | 0.9610 | 0.9553 | 0.9644 | 0.9581 | 0.9649
109 | 0.4821 | 0.9539 | 0.9627 | 0.9640 | 0.9542 | 0.8079 | 0.9651
118 | 0.3645 | 0.4149 | 0.4136 | 0.3756 | 0.3895 | 0.4193 | 0.4154
119 | 0.3498 | 0.3558 | 0.3532 | 0.3343 | 0.3443 | 0.3447 | 0.3471
1 | 0.7258 | 0.9724 | 0.9649 | 0.9441 | 0.9603 | 0.9705 | 0.9695
106 | 0.3375 | 0.3542 | 0.3460 | 0.3438 | 0.3513 | 0.3430 | 0.3397
71 | 0.3544 | 0.4078 | 0.4136 | 0.4107 | 0.4067 | 0.4200 | 0.4147
16 | 0.5384 | 0.6061 | 0.6091 | 0.5883 | 0.6038 | 0.6071 | 0.6060
63 | 0.9518 | 0.9484 | 0.9492 | 0.9468 | 0.9587 | 0.9482 | 0.9485
94 | 0.4697 | 0.7562 | 0.7906 | 0.6975 | 0.6792 | 0.8015 | 0.7964
65 | 0.4517 | 0.4820 | 0.4971 | 0.5094 | 0.4835 | 0.4755 | 0.4798
7 | 0.4958 | 0.9096 | 0.8757 | 0.9434 | 0.8519 | 0.9649 | 0.8707
8 | 0.4834 | 0.5018 | 0.5109 | 0.5067 | 0.5024 | 0.5012 | 0.5187
40 | 0.5113 | 0.7406 | 0.7538 | 0.6752 | 0.7327 | 0.7049 | 0.7839
99 | 0.3348 | 0.3689 | 0.3800 | 0.3596 | 0.3460 | 0.3556 | 0.3715
100 | 0.4855 | 0.6292 | 0.6457 | 0.6697 | 0.6147 | 0.6231 | 0.6631
56 | 0.4920 | 0.7355 | 0.7241 | 0.6048 | 0.6881 | 0.7131 | 0.7712
51 | 0.3438 | 0.5738 | 0.4857 | 0.4161 | 0.6451 | 0.7797 | 0.7535
76 | 0.3658 | 0.3620 | 0.3736 | 0.3545 | 0.3361 | 0.3606 | 0.3597
78 | 0.4137 | 0.4001 | 0.4149 | 0.4035 | 0.4138 | 0.4219 | 0.4009
77 | 0.3523 | 0.3532 | 0.3533 | 0.3567 | 0.3404 | 0.3501 | 0.3459
36 | 0.3443 | 0.8132 | 0.8580 | 0.8939 | 0.8755 | 0.4439 | 0.8841
37 | 0.9317 | 0.9366 | 0.9339 | 0.9370 | 0.9304 | 0.9396 | 0.9427
35 | 0.8628 | 0.8599 | 0.8523 | 0.8563 | 0.8596 | 0.8654 | 0.8675
Global average | 0.4950 | 0.6318 | 0.6351 | 0.6208 | 0.6227 | 0.6230 | 0.6454

Generally, the inclusion of quality domains has a neutral-positive or positive impact on the performance of supported accommodation services in England, which is especially relevant in the services that showed a lower average RTE: 13, 18 and 21 (33.3% of the surveyed services, Table 3); 14, 90, 96 and 124 (30.8% of the surveyed services, Table 4); 18, 22, 27, 58, 70, 97, 110 and 115 (23.5% of the surveyed services, Table 5); and, finally, 1, 7, 36, 56, 87, 109, 121 and 122 (26.7% of the surveyed services, Table 6). In these services, the managers' perception of the quality their services provide surpasses the technical results (baseline scenario), probably because of non-evaluated variables or circumstances in care provision. On the other hand, quality variables decreased service performance only in supported accommodation services 15, 31, 32, 86 and 98 (14.7% of the surveyed services, Table 5). This behaviour is counterintuitive because here the managers' perception of the quality their services provide underestimates their own technical results; again, additional unassessed variables or circumstances may affect how managers process information when identifying opportunities to improve quality of care. Services where the inclusion of quality variables had a neutral effect on performance constituted the majority: 66.7%, 69.2%, 61.8% and 73.3% of the surveyed services, respectively. In these cases, the managers' perception of the quality provided by their services matches the technical results; they obtain a quality consistent with their resources and outcomes.
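The positive / neutral / negative impact classification described above can be sketched as a simple rule comparing each service's baseline RTE with its mean RTE across the quality scenarios. The neutrality band of ±0.05 is an illustrative assumption, not the statistical criterion used in the study; the example rows are taken from the tables above.

```python
def quality_impact(baseline, quality_scores, band=0.05):
    """Classify the impact of quality variables on one service's performance.

    baseline       -- RTE of the technical-only scenario (Scenario 1)
    quality_scores -- RTE scores of the scenarios that add quality domains
    band           -- illustrative neutrality band (assumed, not from the study)
    """
    delta = sum(quality_scores) / len(quality_scores) - baseline
    if delta > band:
        return "positive"
    if delta < -band:
        return "negative"
    return "neutral"

# Rows from the tables above:
print(quality_impact(0.5104, [0.9619, 0.9610, 0.9553, 0.9644, 0.9581, 0.9649]))          # service 122 -> positive
print(quality_impact(0.6847, [0.3679, 0.3691, 0.3706, 0.3770, 0.3696, 0.3509, 0.3719]))  # service 32  -> negative
print(quality_impact(0.3523, [0.3532, 0.3533, 0.3567, 0.3404, 0.3501, 0.3459]))          # service 77  -> neutral
```

Service 122 exemplifies the positive group (managerial quality perception above technical results), service 32 the rare negative group, and service 77 the neutral majority.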

Discussion

This study aimed to assess the impact of quality domains (formalized as variables) on the performance (RTE) of selected MH-supported accommodation services in the English pathway of care. To our knowledge, this study is the first to evaluate the performance of MH-supported accommodation services with service quality indicators included in the analyses. Integrating the quality construct (structured as a set of specific domains with varying relevance) into DEA is complex, and authors have addressed this issue with different techniques: quality-adjusted DEA (Q-DEA) [37], the two-model approach [38], quality and operating efficiency models with weight restrictions [39], and the multiple objective approach to DEA (MODEA) [40]. The EDeS-MH decision support system has been successfully adapted to assess MH services instead of catchment areas [5,18,19,21]. This DSS transfers the robust methods for resource allocation and health technology assessment used at the national and regional (macro) levels to the local (meso) and service (micro) levels, reducing the lack of transparency and accountability in resource management that often occurs at these levels [41]. Finally, the EbCA highlighted the existence of two different residential service types, move-on and non-move-on oriented, which must be studied separately. To date, MH ecosystem performance has been assessed by combining operational and technical variables without including quality indicators [6]. Quality of care scores were obtained with a standardized instrument applied through interviews with service managers; they therefore reflect managerial perceptions. However, the ratings produced have been shown to correlate well with service users' experiences of care [42]. Expert knowledge, formalized in a knowledge base, was crucial for interpreting the adequacy of variable values.
For this process, selecting an appropriate paradigm was essential to overcome the limitations of classical DEA, in which lower input (resource) consumption for a given output level is associated with higher efficiency. In this research, the selected paradigm was structured by the Balanced Care Model [33] and the English pathway of care in MH-supported accommodation services [34]. The knowledge base is the core of the fuzzy inference engine because it provides expert-based information for transforming the original data values (from the Monte Carlo simulation engine) according to the Balanced Care Model. This transformation is based on experts' opinions on the adequacy of each data value and represents a specific framework that may change over time and depends on the socioeconomic environment. The same dataset (raw data) can therefore yield different results according to experts' perceptions of reality. The fuzzy inference engine can "understand" the nuances that decision-makers assume in system management. A recent national survey of MH-supported accommodation services across England found that the quality of care was higher in MH-supported housing than in residential care or floating outreach services [4]. Considering that quality variables have not previously been included in any RTE assessment of MH services and ecosystems [6,13,15,18,43], the question to answer here is whether the perception (given by the corresponding managers) of the quality provided by an MH service is consistent with its technical performance. If the impact on RTE scores is negative, the perception of the quality provided is not aligned with the technical performance (baseline RTE); in this situation, technical results do not achieve an equivalent quality according to the manager's opinion. If the impact can be considered neutral, technical results and quality perception are balanced.
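As a rough illustration of how a fuzzy inference engine can encode expert opinion before efficiency is calculated, the sketch below maps a raw service rate (scaled to 0–1) to an adequacy score via three linguistic levels with expert-assigned adequacy values. The membership shapes and the 0.2/0.6/1.0 adequacy weights are invented for illustration; the actual EDeS-MH knowledge base is far richer and domain-specific.

```python
import numpy as np

# Membership functions for three linguistic levels of a 0-1 rate
# (shoulder / triangular shapes chosen purely for illustration).
def low(x):  return float(np.clip((0.4 - x) / 0.4, 0, 1))       # fully "low" at 0
def mid(x):  return float(max(0.0, 1 - abs(x - 0.5) / 0.3))     # peaks at 0.5
def high(x): return float(np.clip((x - 0.6) / 0.4, 0, 1))       # fully "high" at 1

def adequacy(x):
    """Weighted defuzzification: expert adequacy score per linguistic level."""
    m = np.array([low(x), mid(x), high(x)])      # degrees of membership
    scores = np.array([0.2, 0.6, 1.0])           # expert adequacy weights (assumed)
    return float(m @ scores / m.sum())

print(adequacy(0.0), adequacy(0.5), adequacy(0.9))  # 0.2 0.6 1.0
```

The same raw value would map to a different adequacy score under a different knowledge base, which is exactly why the paper notes that identical raw data can yield different results under different expert frameworks.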
Finally, if the impact is positive, managers perceive that the quality provided by their services exceeds their technical results, probably because they manage other variables that could not be gathered in the QuEST study. The incorporation of quality domains as variables (outputs) in DEA had a neutral-positive or positive global impact on the performance of MH-supported accommodation services. The MH residential care services dataset was divided into services aiming to promote autonomy among service users (move-on oriented) and those focused on providing residential care (non-move-on oriented). This distinction was highlighted by the EbCA panel of experts and allowed DEA to avoid the bias mainly induced by the interpretation of a critical output: "the number of service users who have moved to more independent accommodation". Despite being the most expensive and probably the most complex services, RTE scores revealed relatively high performance in both groups (especially when quality variables were included), but the variability was high, revealing probable differences in management strategies. Quality variables increased (sometimes not significantly) the average RTE in a relevant number of the selected services, but most showed a neutral-positive profile. Quality indicators captured nontechnical but real service characteristics, estimated by the respective managers, that ultimately improved service performance scores. The average RTE of MH-supported housing services was lower than that of residential care homes, and the input/output balance was slightly worse because of the higher statistical variability. This variability can be associated with differences in service users' needs and personal characteristics. The incorporation of quality variables significantly increased service performance, confirming the conclusions of a previous study by Killaspy et al. [4]: supported housing is a cost-effective type of care, as it provides support while promoting autonomy.
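The neutral-positive direction of the effect is consistent with a basic DEA property: adding an output to a DEA model can never decrease a unit's efficiency score. The minimal input-oriented CCR envelopment model below (a generic textbook formulation, not the EDeS-MH implementation; the toy data are invented) illustrates this with a quality measure added as an extra output.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(X, Y, o):
    """Input-oriented CCR (constant returns) DEA efficiency of DMU o.

    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs).
    Decision variables: [theta, lambda_1..lambda_n]; minimise theta subject to
    sum_j lambda_j x_ij <= theta * x_io  and  sum_j lambda_j y_rj >= y_ro.
    """
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                          # minimise theta
    A_ub, b_ub = [], []
    for i in range(m):                                  # input constraints
        A_ub.append(np.r_[-X[o, i], X[:, i]]); b_ub.append(0.0)
    for r in range(s):                                  # output constraints
        A_ub.append(np.r_[0.0, -Y[:, r]]); b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

# Toy data: one input (cost), one technical output, one added quality output.
X  = np.array([[2.0], [4.0], [3.0]])
Y  = np.array([[2.0], [2.0], [3.0]])
Yq = np.hstack([Y, [[1.0], [5.0], [2.0]]])              # quality appended as output

print(ccr_input_efficiency(X, Y, 1))                    # DMU 1, technical only
print(ccr_input_efficiency(X, Yq, 1))                   # DMU 1, quality included
```

Here DMU 1 is technically inefficient, but once its high quality score enters the model as an output its efficiency rises; it can never fall, which matches the neutral-positive or positive global impact reported above.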
Nevertheless, service performance scores were lower than expected due to the high variability among services and the low number of service users who moved to more independent accommodation (a critical technical output for the paradigm). A considerable number of MH-supported housing services had an average RTE below 0.5 (relatively low performance), indicating that these services adjust their structures and obtain different outcomes according to users' needs (tailor-made structures). For these specific services, quality variables did not significantly increase performance. Again, considering that quality was assessed by the service managers (managerial perspective), they implicitly acknowledged that major changes could be made in these services. The English pathway of care is not a one-way road: service users can be moved from one supported housing service to another (instead of to a floating outreach service) or even remain in the same place longer than expected. This variable (N° of service users who moved to a more independent accommodation per bed/place) is likely to be outside the service's control to some degree; the service may be working hard to help service users gain the skills to move on, but the supply of more independent accommodation to which they can move may be insufficient. In this type of care, a relevant number of services showed decreased performance scores when quality variables were included, which may indicate that managers consider other variables or service characteristics that lead to poorer quality results than they were expecting given the corresponding resource level and outcome production. The performance of MH floating outreach services was the lowest on average, but the incorporation of quality variables still led to a statistically significant increase in the average global RTE. The results reported by Killaspy et al.
[4] also showed that these services provided lower-quality care than supported housing or residential care, but they achieved the highest rate of movement to more independent accommodation. The high statistical variability of the services in this type of care indicates that they may be struggling to meet a wide range of service user needs. After adjusting for differences in patient characteristics, the move-on characteristic was a critical variable under the Balanced Care Model (the paradigm for interpreting variable values). As with residential care homes, a relevant number of floating outreach services significantly increased their performance scores, an increase that is not reflected in the service management (resources and outcomes) shown in the baseline scenario. MH service names were not revealed in this study to maintain data privacy.

Conclusions

The ideal integration of supported accommodation services in an MH care pathway is unfortunately difficult to determine outside of highly integrated care systems. The existence of functioning care pathways is a major example of integration. With increasing integration, the inclusion of service quality variables in the RTE assessment of MH-supported accommodation services in England has become possible. Quality variables increased global service performance in the three types of care: residential care, supported housing and floating outreach. This neutral-positive or positive impact shows that RTE assessment using only technical variables is not sufficient to achieve a holistic view of service performance. Expert knowledge formalization was critical for distinguishing the types of service, identifying outliers and interpreting variable values according to a specific care paradigm. EbCA demonstrated its practical utility when appropriate experts were available to join the panel. Adapting EDeS-MH to include quality variables required a two-step process. The results provide a better understanding of the performance of individual services and of the supported accommodation care pathway. This approach may have utility in designing tailor-made improvement strategies for specific services as well as in service planning. The input/output balance of MH residential care services was appropriate because these services are more structured from a managerial point of view (user needs are usually very well defined), while supported housing and floating outreach could be improved. The latter types of care are less structured due to the diversity of user needs associated with their users' higher level of autonomy compared to residential care service users.
Considering that the English pathway of MH-supported accommodation is not a one-way road, remaining in supported housing and floating outreach services longer than the usual two years appears to have a major negative influence on service performance. Services showing a significant increase or decrease in their performance scores when quality variables are included in the analysis warrant further study: the observed differences from the baseline scenario (technical only) likely reflect specific structural or managerial characteristics, which are a crucial source of information for the design of new interventions, policies or strategies to improve MH care. Future research including other components of the whole system (such as inpatient, day and outpatient care) is recommended to understand global MH ecosystem performance in England. By selecting service benchmarks, key variables requiring improvement can be identified to design specific policies and interventions. The integration of the user satisfaction construct into the analysis is also an important direction for future research.
When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript: A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'. A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'. An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'. If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter. If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see:  http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols . Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at  https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols . We look forward to receiving your revised manuscript. Kind regards, Dragan Pamucar Academic Editor PLOS ONE Journal Requirements: When submitting your revision, we need you to address these additional requirements. 1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. 
The PLOS ONE style templates can be found at and 2. We note that you have stated that you will provide repository information for your data at acceptance. Should your manuscript be accepted for publication, we will hold it until you provide the relevant accession numbers or DOIs necessary to access your data. If you wish to make changes to your Data Availability statement, please describe these changes in your cover letter and we will update your Data Availability statement to reflect the information you provide. Additional Editor Comments: Abstract The abstract is loosely written. A standard abstract must present, without leaving any doubt, the objective of the paper precisely; source of data (which is not present in your abstract) and analytical approach used; key findings and any policy implication and recommendations. Introduction • The arguments are fairly presented but the statement that justifies the study does not come clearly (i.e. Why did you started this research?). • The introduction does not precisely construct the research problem tackled and does not show how the problem is taken care. • Why did you used MC simulation, DEA and fuzzy system in the study? These techniques also should be discussed. • The research hypotheses’ are not mentioned in the introduction or clear in the literature review. Literature review and critical analysis of theories, practices or commentary focusing on existing documents • The study lacks clear description of the literature review: • What I am missing is a description of the review. Did you conduct a systematic literature review? Which years? Key words? What was the literature you found? • Can you better describe how you came to your major variables? You have them from the literature review, but how was literature screened to derive these factors. Methodology and scope of work The analytical design is well prepared. Results The author has poorly discussed the results of the paper. 
One would expect to find the previous empirical work enriching the discussions of the results, but unfortunately, that has not been done. [Note: HTML markup is below. Please do not edit.] Reviewers' comments: [NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.] While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step. 29 Jul 2021 Dear Editor, Many thanks for your comments, we highly appreciate your suggestions and questions. According to them, we have carried out the corresponding changes in the paper (highlighted in yellow). Sincerely, The authors. EDITOR’S COMMENTS Comment nº 1. The abstract is loosely written. A standard abstract must present, without leaving any doubt, the objective of the paper precisely; source of data (which is not present in your abstract) and analytical approach used; key findings and any policy implication and recommendations. Thank you for your recommendation. A new abstract has been included following the proposed structure. Comment nº 2. The arguments are fairly presented but the statement that justifies the study does not come clearly (i.e. Why did you started this research?). 
According to your kind suggestion, the following paragraph has been included in the introduction: The standard performance assessment of any kind of comparable services includes input consumption and output production (directly related to the inputs consumed). These inputs/outputs are mainly technical. The incorporation of quality variables (in this study representing by six/seven quality domains) is always complicated because they are perceptions (perceived quality from users, managers, etc.). To our knowledge, this is the first study that includes quality domains, estimated by the managers of the selected MH services, into a performance analysis (6). Independently of the technical performance of the services (the baseline scenario analysed without quality domains) the main research question is to assess the impact on the service performance when quality domains are incorporated into the analysis. Taking into account that MH service managers have a specific amount of inputs and always try to obtain the corresponding best results, a neutral/positive relationship between managerial processes (input and output management) and their quality perceptions on performance is expected. Other changes include a relevant modification of the following paragraph: Relative Technical Efficiency (RTE) is a decision support technique that can be used for guiding health informed evidence-based policy-making, mainly on improving resources allocation (7,8). RTE assesses the relationship among the amount of inputs consumed and outputs produced by a set of comparable decision making units (9). RTE can be regarded as a synthetic meta-indicator that facilitates monitoring the evolution of a system and the dynamic relationships or connections across different performance indicators (10,11). 
It has been used for identifying tailor-made improvement strategies for health care ecosystems like provision and resourcing of addiction treatment clinics (12,13), residential MH facilities (14), homes for people with mental disability (15), clinics for children and youth (16) and community based youth services (17). The RTE of primary care and MH ecosystems have been systematically assessed in the Basque Country (Spain) (5,18–20). Data Envelopment Analysis (DEA) has been widely used to assess the RTE of health services (6,18,21). This group of non-parametric techniques are very robust and flexible because they don´t need any preliminary assumption on the variable statistical structure. This means that variables (inputs/outputs) from different origins and types (e.g. number of beds, technical input, and quality of care, manager perception) can be analysed at the same time. Quality variables have been included in DEA for assessing the RTE of systems operating in different socio-economic contexts (e.g. hospital care, schools or the banking industry) (22–25). DEA can be included in a Monte Carlo simulation engine to include uncertainty and randomness in data values (all of them are considered statistical distributions) and design more realistic models (5). Being DEA an operational model, it is completely blind. Variable values must be interpreted according to the existing expert knowledge (usually a theoretical paradigm). A fuzzy inference engine allows to make operational the formalization of the Balanced Care model following the EbCA methodology (5). This engine interprets variable values before RTE was calculated. Comment nº 3. The introduction does not precisely construct the research problem tackled and does not show how the problem is taken care. Thank you for your comment. 
For that reason, we have included the following paragraph (see also the previous comment): The standard performance assessment of any kind of comparable services includes input consumption and output production (directly related to the inputs consumed). These inputs/outputs are mainly technical. The incorporation of quality variables (in this study representing by six/seven quality domains) is always complicated because they are perceptions (perceived quality from users, managers, etc.). To our knowledge, this is the first study that includes quality domains, estimated by the managers of the selected MH services, into a performance analysis (6). Independently of the technical performance of the services (the baseline scenario analysed without quality domains) the main research question is to assess the impact on the service performance when quality domains are incorporated into the analysis. Taking into account that MH service managers have a specific amount of inputs and always try to obtain the corresponding best results, a neutral/positive relationship between managerial processes (input and output management) and their quality perceptions on performance is expected. Comment nº 4. Why did you used MC simulation, DEA and fuzzy system in the study? These techniques also should be discussed. Thanks again, according to this comment we have modify the following paragraph in the introduction (see also comment 2): Relative Technical Efficiency (RTE) is a decision support technique that can be used for guiding health informed evidence-based policy-making, mainly on improving resources allocation (7,8). RTE assesses the relationship among the amount of inputs consumed and outputs produced by a set of comparable decision making units (9). RTE can be regarded as a synthetic meta-indicator that facilitates monitoring the evolution of a system and the dynamic relationships or connections across different performance indicators (10,11). 
It has been used for identifying tailor-made improvement strategies for health care ecosystems like provision and resourcing of addiction treatment clinics (12,13), residential MH facilities (14), homes for people with mental disability (15), clinics for children and youth (16) and community based youth services (17). The RTE of primary care and MH ecosystems have been systematically assessed in the Basque Country (Spain) (5,18–20). Data Envelopment Analysis (DEA) has been widely used to assess the RTE of health services (6,18,21). This group of non-parametric techniques are very robust and flexible because they don´t need any preliminary assumption on the variable statistical structure. This means that variables (inputs/outputs) from different origins and types (e.g. number of beds, technical input, and quality of care, manager perception) can be analysed at the same time. Quality variables have been included in DEA for assessing the RTE of systems operating in different socio-economic contexts (e.g. hospital care, schools or the banking industry) (22–25). Additionally, we have modified the following paragraph: DEA can be included in a Monte Carlo simulation engine to include uncertainty and randomness in data values (all of them are considered statistical distributions) and design more realistic models (5). Being DEA an operational model, it is completely blind. Variable values must be interpreted according to the existing expert knowledge (usually a theoretical paradigm). A fuzzy inference engine allows to make operational the formalization of the Balanced Care model following the EbCA methodology (5). This engine interprets variable values before RTE was calculated. Comment nº 5. The research hypotheses’ are not mentioned in the introduction or clear in the literature review. 
Based on your comment, here you can see an additional explanation: Based on original data from the QUEST project (Killaspy et al., 2017) and a previous systematic review about DEA models in mental health (Garcia-Alonso et al., 2019), this study tries to check, if it exists, the relationship between “… managerial processes (input and output management) and their -mental health service managers- quality perceptions on performance …”. Taking into account this comment and according to previous ones, we have included the following paragraph: The standard performance assessment of any kind of comparable services includes input consumption and output production (directly related to the inputs consumed). These inputs/outputs are mainly technical. The incorporation of quality variables (in this study representing by six/seven quality domains) is always complicated because they are perceptions (perceived quality from users, managers, etc.). To our knowledge, this is the first study that includes quality domains, estimated by the managers of the selected MH services, into a performance analysis (6). Independently of the technical performance of the services (the baseline scenario analysed without quality domains) the main research question is to assess the impact on the service performance when quality domains are incorporated into the analysis. Taking into account that MH service managers have a specific amount of inputs and always try to obtain the corresponding best results, a neutral/positive relationship between managerial processes (input and output management) and their quality perceptions on performance is expected. Comment nº 6. Literature review and critical analysis of theories, practices or commentary focusing on existing documents. The study lacks clear description of the literature review: Thanks again, we have modified the introduction trying to make it clearer: 1. Explaining the relevance of the situation. 2. 
Introducing the relevance of integrating quality variables into the technical performance analysis of mental health services and introducing our main hypothesis. 3. Exploring the techniques in an integrated way. 4. Introducing the objective in a clearer way, as follows: This study aims to assess the impact of quality indicators (managerial perspective) in the performance (RTE) of selected MH supported accommodation services in the English pathway of care. This objective includes the formalization of specific quality domains into variables (rates), their integration in RTE assessment, and a comparative impact analysis of quality variables on the ecosystem performance to support decision-making and investment by providing relevant information for service managers to inform practice and service planning. In all of these sections, relevant literature has been reviewed and explained. Comment nº 7. What I am missing is a description of the review. Did you conduct a systematic literature review? Which years? Key words? What was the literature you found? Sorry again, we have modified the objective of the paper because the introduction offered a wrong perspective of our analysis: This study aims to assess the impact of quality indicators (managerial perspective) in the performance (RTE) of selected MH supported accommodation services in the English pathway of care. This objective includes the formalization of specific quality domains into variables (rates), their integration in RTE assessment, and a comparative impact analysis of quality variables on the ecosystem performance to support decision-making and investment by providing relevant information for service managers to inform practice and service planning. No systematic review was conducted for the present paper. Comment nº 8. Can you better describe how you came to your major variables? You have them from the literature review, but how was literature screened to derive these factors. Methodology and scope of work. 
The original dataset was collected from the QuEST study (raw data). To compare the selected MH services, the original data were transformed into rates to eliminate the "size" effect. For example, a budget of 100,000 pounds does not mean the same thing for a service with 10 places or beds as for one with 100. All these rates were discussed several times by experts according to the EbCA model. The following sentence has been included in the "Scenarios" section: All the considered transformations of the original data make the selected services comparable by eliminating the potential "size" effect on performance assessment. The rates included in this study provide high reliability of measurement. Additionally, the large number of previous attempts to reach an optimal representation of the data allowed us to find differences between residential services: move-on and non-move-on oriented. Accordingly, we have included the following sentence in the "Relative Technical Efficiency" section: On the other hand, the analytical process of setting the measurement units for the variables highlighted the existence of two different groups in the residential care services dataset: move-on and non-move-on oriented. Comment nº 9. The author has poorly discussed the results of the paper. One would expect to find the previous empirical work enriching the discussion of the results, but unfortunately, that has not been done. Following your kind suggestion, we have added additional comments in the "Results" section: Globally speaking, the inclusion of quality domains highlights a neutral-positive or positive impact on the performance of supported accommodation services in England. This is especially relevant in some services that showed lower RTE on average: 13, 18 and 21 (33.3%, Table 3); 14, 90, 96 and 124 (30.8%, Table 4); 18, 22, 27, 58, 70, 97, 110 and 115 (23.5%, Table 5); and, finally, 1, 7, 36, 56, 87, 109, 121 and 122 (26.7%, Table 6). 
In these services, the manager's perception of the quality provided by their services surpasses the technical results (baseline scenario), probably because there are non-evaluated variables or circumstances in the care provision. On the other hand, quality variables decrease service performance only in supported accommodation services 15, 31, 32, 86 and 98 (14.7%, Table 5). This behaviour can be considered strange because here the manager's perception of the quality provided by their services underestimates their own technical results. Again, there may be additional non-assessed variables or circumstances that affect the manager's way of processing information and lead them to identify areas for improving the quality of care. Services where the inclusion of quality variables results in a neutral effect on performance are the majority, respectively: 66.7%, 69.2%, 61.8% and 73.3%. In these cases, the manager's perception of the quality provided by their corresponding services matches their technical results. They obtain a quality in line with their resources and outcomes. Additionally, the "Discussion" section has been enriched by including the following sentences: A recent national survey of MH supported accommodation services across England found that the quality of care was higher in MH supported housing than in residential care or floating outreach services (4). Taking into account that quality variables have not been included in any RTE assessment of MH services and ecosystems (6,13,15,18,43), the question to answer here is whether the perception (given by the corresponding managers) of the quality provided by the MH service is aligned with its technical performance. If the impact on RTE scores is negative, then the perception of the quality provided is not aligned with the technical performance (baseline RTE). In this situation, the technical results do not achieve at least an equivalent quality, according to the manager's opinion. 
If the impact can be considered neutral, then the technical results and the quality perception are balanced. Finally, if it is positive, managers know that the quality provided by their services is better than the corresponding neutral level; probably they manage other variables that could not be gathered in the QuEST study. … Quality variables increased (sometimes not significantly) RTE on average in a relevant number of the selected services [residential houses], but the majority showed a neutral-positive profile. Quality indicators included non-technical but real service characteristics estimated by the respective managers that, in the end, improved service performance scores. … In this type of care [supported houses], a relatively relevant number of services decrease their performance scores when quality variables are included. This situation probably indicates that managers take into consideration other variables or service characteristics that result in poorer quality than expected given the corresponding resource level and outcome production. … As happens in residential houses, a relevant number of floating outreach services increased their performance scores significantly. Again, this increase is not aligned with the service management (resources and outcomes) shown in the baseline scenario. Also in the "Conclusion" section: Services that show a significant increase or decrease in their performance scores when quality variables are included in the analysis should be studied. The observed differences from the baseline scenario (only technical) must be a consequence of specific structural or managerial characteristics. These characteristics are a crucial source of information for designing new interventions, policies or strategies to improve MH care. Submitted filename: Response to the reviewers.docx 
6 Sep 2021 PONE-D-21-17723R1 The critical factor: the role of quality in the performance of supported accommodation services for complex mental illness in England. PLOS ONE Dear Dr. Almeda, Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. Please submit your revised manuscript by Oct 21 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript: A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'. A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'. An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'. If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter. If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. 
For instructions see:  http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols . Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at  https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols . We look forward to receiving your revised manuscript. Kind regards, Dragan Pamucar Academic Editor PLOS ONE Reviewers' comments: Reviewer's Responses to Questions Comments to the Author 1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation. Reviewer #1: All comments have been addressed Reviewer #2: (No Response) ********** 2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented. Reviewer #1: Yes Reviewer #2: Partly ********** 3. Has the statistical analysis been performed appropriately and rigorously? Reviewer #1: Yes Reviewer #2: Yes ********** 4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). 
The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified. Reviewer #1: Yes Reviewer #2: No ********** 5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here. Reviewer #1: Yes Reviewer #2: Yes ********** 6. Review Comments to the Author Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters) Reviewer #1: I didn't review the paper in the previous round, but I think the reviewer has pointed out the major issues that existed in the manuscript. In my opinion, the authors have solved the issues properly. Based on that, I think the paper deserves to be published in its present form. Reviewer #2: The paper “The critical factor: the role of quality in the performance of supported accommodation services for complex mental illness in England” tries to define critical quality factors for better efficiency evaluation and to set up a decision support system to help in resource allocation. My comments and concerns regarding the paper are given as follows. Major comments: 1. The authors use DEA as a method of measuring the RTE of supported accommodation services as part of the health care system by employing commonly used inputs/outputs with the addition of quality variables. 
However, the procedure and purpose of using all three methods is not clear and justified enough. In particular, it is not clear why and how the authors use randomization of the original data (understanding the environmental and structural uncertainty and randomness). Is the objective to multiply the number of DMUs, and why? How many DMUs are in the final dataset that was used in the DEA assessment? 2. There are no theoretical foundations given for the methods used (DEA, fuzzy inference and Monte Carlo). Minor comments: 3. The paper is not flawlessly written. It is a bit difficult to follow the line and purpose of the paper. 4. It is not necessary to have the words Introduction, Objectives, Method, Results and Conclusion in the abstract. It is clear from the text what the purpose of those parts of the abstract is. 5. In the Introduction, it says „Relative Technical Efficiency (RTE) is a decision support technique …”. This statement is not correct, since Relative Technical Efficiency is a measure, not a technique. 6. “DEA can be included in a Monte Carlo simulation engine to include uncertainty and randomness in data values (all of them are considered statistical distributions) and design more realistic models (5).” This statement is not clear. Please explain how the DEA is included in a Monte Carlo simulation engine (randomizing the data values or the RTE). 7. The meaning of the following sentence after Table 6 is not clear: “This is especially relevant in some services that showed lower RTE on average: 13, 18 and 21 (33.3%, Table 3)”; what does 33.3% refer to? There are similar examples in the whole section. 8. There is no straightforward interpretation of the fuzzy inference, and its importance is not explained in the discussion or conclusion. ********** 7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. 
If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy. Reviewer #1: No Reviewer #2: No [NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.] While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step. 4 Oct 2021 Dear Editor and Reviewers, many thanks for your feedback. We have carried out all the proposed changes, which are highlighted in yellow in the “Revised Manuscript with Track Changes”. In this document we have developed a rebuttal letter that responds to each point raised. Data availability We have uploaded the dataset to the Dryad Digital Repository, and it will be available to the public once the article has been published. In addition, Dryad has provided a temporary link for reviewers so they can access the dataset. This is the link: https://datadryad.org/stash/share/1eXNj4HhRp0ztCLfgzkwO53k7Rh8q0MvVeZh_A14-xs Nevertheless, in the text, we should maintain the original DOI for public access because the reviewers' link is temporary and only for the revision process. 
In the text: The dataset is available at the Dryad digital repository (https://doi.org/10.5061/dryad.j0zpc86dz). (Methods, “Setting” section). Reviewer #1: I didn't review the paper in the previous round, but I think the reviewer has pointed out the major issues that existed in the manuscript. In my opinion, the authors have solved the issues properly. Based on that, I think the paper deserves to be published in its present form. Many thanks for reviewing the manuscript and for your feedback. Reviewer #2: The paper “The critical factor: the role of quality in the performance of supported accommodation services for complex mental illness in England” tries to define critical quality factors for better efficiency evaluation and to set up a decision support system to help in resource allocation. My comments and concerns regarding the paper are given as follows. Major comments: 1. The authors use DEA as a method of measuring the RTE of supported accommodation services as part of the health care system by employing commonly used inputs/outputs with the addition of quality variables. However, the procedure and purpose of using all three methods is not clear and justified enough. In particular, it is not clear why and how the authors use randomization of the original data (understanding the environmental and structural uncertainty and randomness). Is the objective to multiply the number of DMUs, and why? How many DMUs are in the final dataset that was used in the DEA assessment? Thank you very much for your comments. We have answered your questions separately and propose the corresponding changes in the paper (in yellow). • Relative Technical Efficiency (RTE) is a common and robust indicator for assessing the performance of any kind of comparable decision-making units (in our case, mental health services). RTE is always concerned with the balance between the inputs consumed and the outputs produced, obviously looking for the best one. 
To clarify this concept, we modified the introduction: The standard performance assessment of any kind of comparable service includes studying their input consumption and output production (directly related to the inputs consumed). These inputs/outputs are mainly technical, and in the end, the best service performance is always related to the most appropriate balance between the available inputs and the produced outputs. Researchers and decision-makers can seek to reduce (minimize) the amount of inputs for a given amount of outputs (input orientation) or, vice versa, to increase (maximize) the output production for a specific amount of inputs (output orientation). • To assess RTE we have used a non-parametric set of robust techniques that has been widely used in research and management: Data Envelopment Analysis (DEA), following the input orientation with variable returns to scale, as described in the “Decision Support System” section. This technique has three main drawbacks: first, if the number of original observations (services) is not large enough, DEA may not be discriminative enough (all the services would be efficient); second, it cannot understand uncertainty and randomness; and, finally, it is completely blind, and raw data values have to be interpreted from an expert perspective to avoid wrong results. These drawbacks had to be solved when the EDeS-MH (Efficient Decision Support-Mental Health) decision support system was designed (see references 19 and 21 for more detailed technical explanations). • When the number of observations is less than twice the product of the number of inputs and the number of outputs (2 x Nº inputs x Nº outputs), DEA may not be discriminative enough. Here the baseline scenarios have 3 inputs and 3 outputs, so we need at least 18 services. When quality variables are included, we need at least 2x3x4 = 24 services. 
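As a minimal sketch (not part of the EDeS-MH code), the rule of thumb quoted above can be written out directly:

```python
def min_dmus(n_inputs: int, n_outputs: int) -> int:
    """DEA discrimination rule of thumb: at least 2 x inputs x outputs
    observations (DMUs) are needed, otherwise most units tend to come
    out efficient and the frontier cannot discriminate between them."""
    return 2 * n_inputs * n_outputs

# Baseline scenario in the rebuttal: 3 inputs and 3 outputs -> 18 services.
print(min_dmus(3, 3))  # 18
# With a quality output added (3 inputs, 4 outputs) -> 24 services.
print(min_dmus(3, 4))  # 24
```

With only 9 move-on and 13 non-move-on residential services surveyed, both residential subgroups fall below these thresholds, which is the motivation given above for multiplying observations via simulation.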
The surveyed residential services (9 move-on services and 13 non-move-on services) are not enough to carry out DEA in a standard way. On the other hand, the numbers of supported housing (34) and floating outreach (30) services can be considered somewhat scarce. All raw data should be considered under uncertainty (for example, if a specific service has 13 places, it may sometimes, for more or less expected or unexpected reasons, have 12 or 14 or …); this is especially true when population, budget, etc. variables are included in the analysis (directly or for calculating rates). Uncertainty plays a relevant decisional role, and it was managed by the Monte Carlo simulation engine (more technical explanations can be found in references 18, 19, 21 and 32). To carry out this statistical technique, we have to transform the raw data into statistical distributions (one for each data value). We selected standard triangular distributions with small variations to the left and right of the modal value, which was the original data value (see section “Randomization …”). When the simulation process is run, the number of observations is multiplied (Nº simulations by Nº observations; for example, 500 simulations by 9 move-on residential services = 4,500 observations to be analysed). Simulation performs an automatic sensitivity analysis by varying the original values at random according to the selected statistical distribution. This process solves the first two drawbacks (the third is explained in the following section of this document). To explain this process better, we have included the following sentences in the paper: Nine move-on residential services, 13 non-move-on residential services, 34 supported housing services and 30 floating outreach services were finally analysed because they have complete quality datasets. Five hundred simulations were run by the DSS. 
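The randomization step described above can be sketched in NumPy as follows. This is an illustrative assumption-laden toy, not the actual EDeS-MH engine: it draws symmetric triangular variates around each raw value (mode = original value) and multiplies the observation count by the number of simulations; the 5% spread matches the figure quoted later in this rebuttal.

```python
import numpy as np

rng = np.random.default_rng(0)

def randomize(raw, n_sims=500, spread=0.05):
    """Draw n_sims simulated datasets from symmetric triangular
    distributions centred on each raw value (mode = raw value,
    +/- spread relative variation on each side)."""
    raw = np.asarray(raw, dtype=float)
    left, right = raw * (1 - spread), raw * (1 + spread)
    # result shape: (n_sims, n_services, n_variables)
    return rng.triangular(left, raw, right, size=(n_sims,) + raw.shape)

# Toy raw data: 9 move-on residential services x 3 input variables.
data = np.ones((9, 3))
sims = randomize(data)
# 500 simulations x 9 services = 4,500 observations for DEA to analyse.
print(sims.shape)  # (500, 9, 3)
```

Each of the 500 simulated datasets would then be passed through the fuzzy interpretation and DEA stages, so the RTE score of each service becomes a statistical distribution rather than a single number.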
(“Basic statistics” section) The simulation engine was developed to address the uncertainty (data imprecision and vagueness) and randomness (unexpected facts) of real environments and to artificially multiply the number of observations (29). The inner uncertainty of any ecosystem can be overcome by transforming original data values into statistical distributions (from standard datasets to statistical distribution bases). In each simulation, the Monte Carlo simulation engine analyses a new dataset selected at random. The statistical analysis of the final results (the process is stopped when the statistical error is lower than 2.5% for the mean) includes a sensitivity analysis of the ecosystem under study. The results (RTE scores) for each DMU and scenario are statistical distributions that can be studied in a more (basic statistics) or less (stability and entropy) standard manner (19,21). The characteristics of these statistical distributions represent the potential reaction of the DMU to data changes. (“Decision Support System” section) … Second, quality variables must then be included as outputs to investigate the influence of quality on performance. Finally, when the number of observations is low (here, the number of move-on residential services is especially low), DEA cannot be sufficiently discriminative (in the end, the methodology tends to show that all DMUs are efficient). However, as explained before, the uncertainty analysis (Monte Carlo simulation engine) multiplies the number of observations by the number of selected simulations, which overcomes this DEA drawback. (“Decision Support System” section) 2. There are no theoretical foundations given for the methods used (DEA, fuzzy inference and Monte Carlo). Sorry, and thank you very much for this comment, which helps us to improve the paper. Again, we have answered your questions separately and propose the corresponding changes in the paper. • (Same explanation as in the previous section.) 
• Data Envelopment Analysis (DEA) is a set of well-known non-parametric techniques to assess RTE (see references 19 and 21 for more technical details). In this paper we used input-oriented DEA with variable returns to scale. To clarify these aspects, we have included the following sentences in the paper: … In this research, the variable returns to scale DEA (31) was selected because, when studying MH services, real output variations cannot be considered proportional to the corresponding input modifications (32), and constant returns to scale would involve a constant variation that cannot be considered realistic. The input-oriented DEA model was applied to assess whether service input consumption can be reduced while assuming a constant output level (9), which is especially relevant for decision-makers who must allocate finite resources to meet population needs. 
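The input-oriented, variable-returns-to-scale DEA model described here can be posed as one linear program per DMU (the BCC envelopment form: minimize the input contraction factor theta subject to a convex combination of peers dominating the unit). The following SciPy sketch is a simplified illustration on made-up data, not the EDeS-MH implementation:

```python
import numpy as np
from scipy.optimize import linprog

def dea_vrs_input(X, Y):
    """Input-oriented, variable-returns-to-scale DEA (BCC envelopment form).
    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs). Returns RTE scores."""
    n, m = X.shape
    s = Y.shape[1]
    scores = np.empty(n)
    for o in range(n):
        # Decision vector z = [theta, lambda_1, ..., lambda_n]; minimize theta.
        c = np.r_[1.0, np.zeros(n)]
        # Inputs: sum_j lambda_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        # Outputs: -sum_j lambda_j * y_rj <= -y_ro
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(m), -Y[o]]
        # VRS convexity constraint: sum_j lambda_j = 1
        A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (n + 1))
        scores[o] = res.fun
    return scores

# Toy single-input, single-output data: A and C lie on the VRS frontier,
# B is dominated by a convex combination of A and C (theta = 5/6).
X = np.array([[2.0], [6.0], [8.0]])
Y = np.array([[1.0], [2.0], [3.0]])
print(dea_vrs_input(X, Y))
```

In EDeS-MH this LP would be re-solved for every simulated dataset produced by the Monte Carlo engine, so each service's theta becomes a distribution of RTE scores.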
Output maximization (output-oriented DEA) is especially difficult and sometimes not advisable when supported accommodation services are assessed (for example, when the system artificially tries to maximize the number of users who are moved to a service with greater independence, this can be mathematically correct, but from a health care perspective it makes no sense). (“Decision Support System” section). • Monte Carlo simulation was used to include uncertainty in the analysis. This procedure allows us to develop a sensitivity analysis on the variables and scenarios. In the end, results are always statistical distributions in which probabilities (to be efficient, to have an efficiency greater than 0.75, stability, entropy, etc.) are fundamental. To clarify this, we have included the following sentences in the paper: … under study. The results (RTE scores) for each DMU and scenario are statistical distributions that can be studied in a more (basic statistics) or less (stability and entropy) standard manner (19,21). The characteristics of these statistical distributions represent the potential reaction of the DMU to data changes. (“Decision Support System” section). • The fuzzy inference engine was developed to interpret variable values according to a specific paradigm. In this paper, the balanced care model was selected as this paradigm. All the variable values from the Monte Carlo simulation engine are interpreted in terms of adequacy. For example, a low annual budget per place/bed is considered bad (not adequate), but too large a budget per place/bed is also bad (not adequate); the most appropriate values lie within a range defined by the experts. As another example, a low quality score is bad, and in this case, the greater the value, the greater the adequacy (greater values are always good). The fuzzy inference engine includes a knowledge base formalized by standard IF … THEN rules. 
For example, IF the variable value of X is greater than a specific value, THEN the value has to be transformed according to a mathematical equation or a fuzzy operator (product-sum gravity method). Our fuzzy inference engine transforms original data values (from the Monte Carlo engine) into “interpreted” values according to the information given by the experts (more technical details can be found in references 19 and 21). To clarify this, we have included the following sentences in the paper: … when inside the range. The specific references for data value interpretation have been defined by a panel of experts according to the selected paradigm and their expertise. This process followed the EbCA model (5), where an iterative sequence of expert-based reviews culminates in a consensus. Once the references for interpreting variable values are defined, the EDeS-MH automatically runs a mathematical transformation based on an equation (linear monotone transformation) or a fuzzy operator (product-sum gravity method) to obtain the “transformed” value (21). These transformed values will be analysed by DEA to determine the corresponding RTE scores (statistical distributions). (“Decision Support System” section). Minor comments: 3. The paper is not flawlessly written. It is a bit difficult to follow the line and purpose of the paper. Thanks again for your comment. We have modified the aim of the paper to clarify the purpose of our research. We include these sentences in the introduction: • … investment by providing relevant information for service managers to inform practice and service planning. Accordingly, this paper first presents a description of the ecosystem under study (a representative sample of supported accommodation services in England). Then, the selected variables are described and grouped into scenarios to highlight different perspectives of the ecosystem situation. 
Finally, the methodology used to assess ecosystem performance (including quality domains) is briefly described. The “Procedure” section has been completely rewritten to improve the readability of the paper. 4. It is not necessary to have the words Introduction, Objectives, Method, Results and Conclusion in the abstract. It is clear from the text what the purpose of those parts of the abstract is. Many thanks; we have deleted these words from the abstract. 5. In the Introduction, it says „Relative Technical Efficiency (RTE) is a decision support technique …”. This statement is not correct, since Relative Technical Efficiency is a measure, not a technique. Many thanks; we have corrected it. 6. “DEA can be included in a Monte Carlo simulation engine to include uncertainty and randomness in data values (all of them are considered statistical distributions) and design more realistic models (5).” This statement is not clear. Please explain how the DEA is included in a Monte Carlo simulation engine (randomizing the data values or the RTE). Thanks again for your comment. In almost any ecosystem, raw data should be considered under uncertainty (for example, if a specific service has 13 places, it may sometimes, for more or less expected or unexpected reasons, have 12 or 14 or …); this is especially true when population, budget, etc. variables are included in the analysis (directly or for calculating rates). Variable values were also obtained at a specific time, and, if a wider perspective of the situation is required, researchers have to include some variability in order to be relatively sure (a probability) that the analysis matches (more or less) a more current situation. To clarify this, we have included the following in the paper: Original data were randomized using symmetric triangular statistical distributions (5% variation on each side of the corresponding original value). 
This range includes feasible data variations (imprecision and vagueness). No critical stress on the ecosystem was included in the analysis (randomness). This procedure includes both data variations corresponding to ecosystem evolution (population, user mobility, etc.) and the effect of time. (“Procedure” section). 7. The meaning of the following sentence after Table 6 is not clear: “This is especially relevant in some services that showed lower RTE on average: 13, 18 and 21 (33.3%, Table 3)”; what does 33.3% refer to? There are similar examples in the whole section. You are completely right; sorry again. These percentages are calculated on the number of surveyed services of each type. The paragraphs have been modified accordingly. Generally, the inclusion of quality domains highlights a neutral-positive or positive impact on the performance of supported accommodation services in England, which is especially relevant in some services that showed a lower average RTE: 13, 18 and 21 (33.3% of the surveyed services, Table 3); 14, 90, 96 and 124 (30.8% of the surveyed services, Table 4); 18, 22, 27, 58, 70, 97, 110 and 115 (23.5% of the surveyed services, Table 5); and, finally, 1, 7, 36, 56, 87, 109, 121 and 122 (26.7% of the surveyed services, Table 6). In these services, the manager's perception of the quality provided by their services surpasses the technical results (baseline scenario), probably because of non-evaluated variables or circumstances in care provision. On the other hand, quality variables decrease service performance only in supported accommodation services 15, 31, 32, 86 and 98 (14.7% of the surveyed services, Table 5). … Services where the inclusion of quality variables results in a neutral effect on their performance constituted the majority, respectively: 66.7%, 69.2%, 61.8% and 73.3% of the surveyed services. 
In these cases, the managers' perception of the quality provided by their corresponding services matches their technical results; they obtain a quality in line with their resources and outcomes.

8. There is no straightforward interpretation of the fuzzy inference, and its importance is not explained in the discussion or conclusion. Thank you very much. We have added the following sentences to the “Discussion” section: … accommodation services (34). The knowledge base is the core of the fuzzy inference engine because it provides expert-based information for transforming the original data values (from the Monte Carlo simulation engine) according to the balanced care model. This transformation is based on experts’ opinions on the adequacy of each data value and represents a specific framework that changes over time and depends on the socioeconomic environment. Therefore, the same dataset (raw data) can yield different results according to experts’ perceptions of reality. The fuzzy inference engine can “understand” the nuances that decision-makers assume in system management.

Submitted filename: Response to Reviewers.docx

20 Jan 2022
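To make the role of the knowledge base concrete, the following is a minimal sketch of fuzzification, the first step of a fuzzy inference engine: a raw data value is transformed into degrees of adequacy defined by expert judgement. The adequacy labels, membership shapes and the target band for "places per service" are invented for illustration; they do not reproduce the actual EDeS-MH knowledge base.

```python
def triangular_membership(x, a, b, c):
    """Degree to which x belongs to a triangular fuzzy set (a, b, c):
    0 at a, rising to 1 at the peak b, falling back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def shoulder(x, a, b, c):
    """Shoulder sets stay at 1 beyond the peak (degenerate triangles)."""
    if b == a:   # left shoulder: fully true at or below the peak
        return 1.0 if x <= a else max(0.0, (c - x) / (c - a))
    if b == c:   # right shoulder: fully true at or above the peak
        return 1.0 if x >= c else max(0.0, (x - a) / (c - a))
    return triangular_membership(x, a, b, c)

# Hypothetical expert knowledge base: adequacy of "places per service"
# judged against an illustrative balanced-care target band.
ADEQUACY_SETS = {
    "too_few":  (0.0, 0.0, 10.0),
    "adequate": (8.0, 14.0, 20.0),
    "too_many": (18.0, 30.0, 30.0),
}

def fuzzify(places):
    """Transform a raw value into expert-based adequacy degrees."""
    return {label: round(shoulder(places, *abc), 3)
            for label, abc in ADEQUACY_SETS.items()}

print(fuzzify(13))
```

Changing the set boundaries is how a different expert framework (a different time span or socioeconomic environment) would make the same raw dataset yield different results, which is the point made in the “Discussion” addition above.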
PONE-D-21-17723R2
The critical factor: the role of quality in the performance of supported accommodation services for complex mental illness in England.
PLOS ONE

Dear Dr. Almeda,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.
 
Please submit your revised manuscript by Mar 06 2022 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Dragan Pamucar
Academic Editor
PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript.
If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed
Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

6. Review Comments to the Author. Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters.)

Reviewer #1: The authors have addressed the point of my concern. I am happy with their corrections. Hence, I would like to recommend this manuscript to be published.

Reviewer #2: The authors have improved the paper and addressed all the comments from the previous review round. I would like to recommend that the authors include the DEA mathematical models used in the paper.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]
While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
22 Feb 2022

Response to Reviewers

Dear Editor and Reviewers, many thanks for assessing this research and proposing your valuable comments. We have carried out all the proposed changes, which are highlighted in yellow in the “Revised Manuscript with Track Changes”. In this document we have developed a rebuttal letter that responds to each point raised.

Journal Requirements

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Many thanks for the comment. We have reviewed the citations and references, and we have deleted the link (http://www.ucl.ac.uk/quest), which was in “Methods > Setting”, because it was not working. In addition, we have added a relevant reference for this sentence (in “Methods > Setting”; highlighted in yellow in the manuscript with track changes): “Data for supported accommodation services from 14 nationally representative local authorities in England were collected for the QuEST study, which was funded by the National Institute of Health Research (2012-2017) (4).”

Reviewers’ comments

Reviewer #1: The authors have addressed the point of my concern. I am happy with their corrections. Hence, I would like to recommend this manuscript to be published.

Dear Reviewer 1, many thanks for evaluating this research and providing your valuable comments.

Reviewer #2: The authors have improved the paper and addressed all the comments from the previous review round. I would like to recommend that the authors include the DEA mathematical models used in the paper.
Dear referee, the DEA model has been included in the section “Methods > Decision Support System” as follows (highlighted in yellow in the manuscript with track changes): the standard DEA model is a linear programming model whose structure is detailed in the revised manuscript with track changes (highlighted in yellow), in the manuscript and in the response to reviewers document. It is not possible to detail the standard DEA model in this box because it does not admit mathematical notation.

Best regards,
The authors

Submitted filename: Response to Reviewers.docx

1 Mar 2022

The critical factor: the role of quality in the performance of supported accommodation services for complex mental illness in England.
PONE-D-21-17723R3

Dear Dr. Almeda,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication. An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org. If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance.
Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,
Dragan Pamucar
Academic Editor
PLOS ONE

Additional Editor Comments (optional): The authors have addressed the point of my concern. I am happy with their corrections. Hence, I would like to recommend this manuscript to be published.

9 Mar 2022

PONE-D-21-17723R3
The critical factor: the role of quality in the performance of supported accommodation services for complex mental illness in England.

Dear Dr. Almeda:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department. If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org. If we can help with anything else, please email us at plosone@plos.org. Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,
PLOS ONE Editorial Office Staff
on behalf of Dr. Dragan Pamucar
Academic Editor
PLOS ONE
References (27 in total)

1.  An efficiency data envelopment analysis model reinforced by classification and regression tree for hospital performance evaluation.

Authors:  Chun-Ling Chuang; Peng-Chan Chang; Rong-Ho Lin
Journal:  J Med Syst       Date:  2010-09-28       Impact factor: 4.460

2.  Use of an operational model of community care to assess technical efficiency and benchmarking of small mental health areas in Spain.

Authors:  Luis Salvador-Carulla; Carlos García-Alonso; Juan Luis Gonzalez-Caballero; Marco Garrido-Cumbrera
Journal:  J Ment Health Policy Econ       Date:  2007-06

3.  [Review] LSE-Lancet Commission on the future of the NHS: re-laying the foundations for an equitable and efficient health and care service after COVID-19.

Authors:  Michael Anderson; Emma Pitchforth; Miqdad Asaria; Carol Brayne; Barbara Casadei; Anita Charlesworth; Angela Coulter; Bryony Dean Franklin; Cam Donaldson; Michael Drummond; Karen Dunnell; Margaret Foster; Ruth Hussey; Paul Johnson; Charlotte Johnston-Webber; Martin Knapp; Gavin Lavery; Marcus Longley; Jill Macleod Clark; Azeem Majeed; Martin McKee; John N Newton; Ciaran O'Neill; Rosalind Raine; Mike Richards; Aziz Sheikh; Peter Smith; Andrew Street; David Taylor; Richard G Watt; Moira Whyte; Michael Woods; Alistair McGuire; Elias Mossialos
Journal:  Lancet       Date:  2021-05-06       Impact factor: 79.321

4.  Relative Technical Efficiency Assessment of Mental Health Services: A Systematic Review.

Authors:  Carlos R García-Alonso; Nerea Almeda; José Alberto Salinas-Pérez; Mencía R Gutiérrez-Colosía; Luis Salvador-Carulla
Journal:  Adm Policy Ment Health       Date:  2019-07

5.  [Technical efficiency assessment of public primary care providers in the Basque Country (Spain), 2010-2013].

Authors:  José Manuel Cordero; Roberto Nuño-Solinís; Juan F Orueta; Cristina Polo; Mario Del Río-Cámara; Edurne Alonso-Morán
Journal:  Gac Sanit       Date:  2015-12-02       Impact factor: 2.139

6.  Predictors of moving on from mental health supported accommodation in England: national cohort study.

Authors:  Helen Killaspy; Stefan Priebe; Peter McPherson; Zohra Zenasni; Lauren Greenberg; Paul McCrone; Sarah Dowling; Isobel Harrison; Joanna Krotofil; Christian Dalton-Locke; Rose McGranahan; Maurice Arbuthnott; Sarah Curtis; Gerard Leavey; Geoff Shepherd; Sandra Eldridge; Michael King
Journal:  Br J Psychiatry       Date:  2020-06       Impact factor: 9.319

7.  Integrating clinicians, knowledge and data: expert-based cooperative analysis in healthcare decision support.

Authors:  Karina Gibert; Carlos García-Alonso; Luis Salvador-Carulla
Journal:  Health Res Policy Syst       Date:  2010-09-30

8.  Quality of longer term mental health facilities in Europe: validation of the quality indicator for rehabilitative care against service users' views.

Authors:  Helen Killaspy; Sarah White; Christine Wright; Tatiana L Taylor; Penny Turton; Thomas Kallert; Mirjam Schuster; Jorge A Cervilla; Paulette Brangier; Jiri Raboch; Lucie Kalisova; Georgi Onchev; Spiridon Alexiev; Roberto Mezzina; Pina Ridente; Durk Wiersma; Ellen Visser; Andrzej Kiejna; Patryk Piotrowski; Dimitris Ploumpidis; Fragiskos Gonidakis; José Miguel Caldas-de-Almeida; Graça Cardoso; Michael King
Journal:  PLoS One       Date:  2012-06-04       Impact factor: 3.240

9.  A decision support system for assessing management interventions in a mental health ecosystem: The case of Bizkaia (Basque Country, Spain).

Authors:  Carlos R García-Alonso; Nerea Almeda; José A Salinas-Pérez; Mencía R Gutiérrez-Colosía; José J Uriarte-Uriarte; Luis Salvador-Carulla
Journal:  PLoS One       Date:  2019-02-14       Impact factor: 3.240

10.  Developing a tool for mapping adult mental health care provision in Europe: the REMAST research protocol and its contribution to better integrated care.

Authors:  Luis Salvador-Carulla; Francesco Amaddeo; Mencia R Gutiérrez-Colosía; Damiano Salazzari; Juan Luis Gonzalez-Caballero; Ilaria Montagni; Federico Tedeschi; Gaia Cetrano; Karine Chevreul; Jorid Kalseth; Gisela Hagmair; Christa Straßmayr; A-La Park; Raluca Sfetcu; Kristian Wahlbeck; Carlos Garcia-Alonso
Journal:  Int J Integr Care       Date:  2015-12-01       Impact factor: 5.120

