
Correlation of Inpatient Experience Survey Items and Domains With Overall Hospital Rating.

Kyle Kemp1,2, Brandi McCormack1, Nancy Chan1, Maria J Santana2, Hude Quan2.   

Abstract

OBJECTIVE: To determine which individual patient experience questions and domains were most correlated with overall inpatient hospital experience.
METHODS: Within 42 days of discharge, 27 369 patients completed a telephone survey based upon the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) instrument. Patients rated their overall experience on a scale of 0 (worst care) to 10 (best care). Correlation coefficients were calculated to assess the relationships of individual survey questions and domains with overall experience.
RESULTS: Questions on provider coordination and nursing care were most correlated with overall experience. Hospital cleanliness, quietness, and discharge information questions showed poor correlation. Correlation with overall experience was strongest for the "communication with nurses" domain.
CONCLUSIONS: Our individual question results are novel, while the domain-based findings replicate those of US-based providers, results which had not yet been reported in the Canadian context, one with universal health care coverage. Our results suggest that our large health care organization may attain initial inpatient experience improvements by focusing upon personnel-based initiatives rather than the physical attributes of our hospitals.


Keywords:  HCAHPS; correlation; domains; inpatient experience

Year:  2015        PMID: 28725821      PMCID: PMC5513631          DOI: 10.1177/2374373515615977

Source DB:  PubMed          Journal:  J Patient Exp        ISSN: 2374-3735


Introduction

Inpatient experience is a patient-reported outcome which assesses the perceived quality of health care interactions and services delivered over the course of a given hospital stay. In the United States, the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey was introduced to measure hospital experience in a rigorous, systematic fashion. As such, this validated survey allows for meaningful comparison between multiple hospitals, something that was previously not possible with the use of ad hoc, in-house instruments which varied between institutions (1-4). On a larger scale, HCAHPS results in the United States now play a crucial role in the value-based purchasing program introduced under the Affordable Care Act of 2010 (5). Patient experience results therefore directly affect a portion of hospital funding, creating a clear incentive not only to collect patient experience data, but also to act upon it.

The HCAHPS survey contains over 30 questions and touches upon 9 different domains (communication with doctors, communication with nurses, responsiveness of hospital staff, pain management, communication about medicines, discharge information, cleanliness of the hospital environment, quietness of the hospital environment, and transition of care) (1, 6). With respect to the US-based data, the Centers for Medicare & Medicaid Services (CMS; the central repository for HCAHPS data) reports correlations between each of these domains and overall experience (7), which is asked of patients as per the following statement:

"We want to know your overall rating of your stay at . This is the stay that ended around <DDATE>. Please do not include any other hospital stays in your answer. Using any number from 0 to 10, where 0 is the worst hospital possible and 10 is the best hospital possible, what number would you use to rate this hospital during your stay?"

By determining which area(s) may provide the most benefit to overall care experience, this type of correlational analysis overcomes one of the perceived challenges in trying to implement organizational change from HCAHPS. Simply said, the results may inform the end user as to the specific domains which, when improved, would theoretically provide the greatest gains in overall patient experience (8, 9). However, the reported data examine neither individual questions nor these comparisons within a Canadian setting, one which employs a universal health care model (10). Therefore, the purposes of this project were to assess which (a) individual questions and (b) HCAHPS domains are most correlated with overall inpatient hospital experience in our Canadian data set.

Methods

Survey Instrument

Our organization’s province-wide inpatient hospital experience survey has been administered on a continuous basis since 2009. The overall rating of care question is one of 16 publicly reported performance measures (11). The survey comprises 51 questions: 32 core HCAHPS items and 19 additional items which address organization-specific policies and procedures. Surveys are administered by a trained team of health research interviewers using computer-assisted telephone interview (CATI) software (Voxco version 1.10; Voxco, Montreal, Canada). Potential respondents are contacted Monday to Friday from 9 am to 9 pm and on Saturdays from 9 am to 4 pm. To ensure standardization, 10% of all calls are monitored for quality assurance and training purposes. Each survey typically requires 15 to 20 minutes to complete, using a standard script with a list of standard prompts and responses to frequently asked questions. Survey questions use Likert-type response scales: certain questions ask the respondent to rate aspects of their care on a scale of 0 (worst) to 10 (best), while other items employ categorical responses (eg, always, usually, sometimes, and never). Detailed information about the development, validity, and American results of HCAHPS is publicly available at www.hcahpsonline.org (1, 6).

Sample Derivation and Dialing Protocol

Across our province, acute care admission, discharge, and transfer information are captured in a series of clinical databases which are updated daily. On a biweekly basis, eligible discharges are extracted using a standard script, which filters all inpatient records against our survey exclusion criteria. These include age less than 18 years, inpatient stay of less than 24 hours, death during the hospital stay (no proxy surveys are permitted), any day surgery or ambulatory procedure, and any psychiatric unit or psychiatric physician service on record (12). For compassionate reasons, our organization also excludes any records containing dilation and curettage procedures, as well as visits relating to stillbirths or those associated with a baby with a length of stay greater than 6 days (eg, complication/neonatal intensive care unit stay). The 4 data extracts are combined into 1 complete provincial list, with duplicate entries (if present) filtered out. Once compiled, the complete list of eligible inpatient discharges is imported into the CATI software and stratified at the hospital level. At the time of interview, random dialing is performed on the sample until a quota of 5% of eligible discharges across all of our province’s 94 acute care hospitals is met. Patients are contacted up to 42 days postdischarge. To maximize the potential for survey completion, each dialed number is called up to 9 times on varying days and times, or until a definitive result (eg, survey completed or refusal) is obtained.
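The exclusion and deduplication logic above can be sketched as a simple record filter. This is only an illustration: all field names (age, los_hours, record_id, and so on) are assumptions for the sketch, not the organization's actual database schema.

```python
# Sketch of the survey eligibility filter described above. All field
# names are illustrative assumptions, not the actual clinical schema.

def is_eligible(record):
    """Apply the inpatient survey exclusion criteria to one discharge record."""
    if record["age"] < 18:
        return False                          # adults only
    if record["los_hours"] < 24:              # stay of less than 24 hours
        return False
    if record["died_in_hospital"]:            # no proxy surveys are permitted
        return False
    if record["day_surgery_or_ambulatory"]:
        return False
    if record["psychiatric_unit_or_service"]:
        return False
    # Compassionate exclusions
    if record["dilation_and_curettage"] or record["stillbirth"]:
        return False
    if record.get("newborn_los_days", 0) > 6:  # eg, complication/NICU stay
        return False
    return True

def build_sample(extracts):
    """Combine the data extracts into one provincial list, dropping
    duplicates, and keep only eligible discharges."""
    seen, sample = set(), []
    for extract in extracts:
        for rec in extract:
            if rec["record_id"] in seen:      # duplicate across extracts
                continue
            seen.add(rec["record_id"])
            if is_eligible(rec):
                sample.append(rec)
    return sample
```

The resulting list would then be imported into the CATI software and stratified by hospital, as described above.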

Analysis

To account for differences in individual question response scales (eg, 3-point, 4-point, and 11-point), all inpatient experience responses were converted to a normalized scale of 0 (worst possible score) to 100 (best possible score). For example, for questions with a response scale of never, sometimes, usually, and always, scores were converted to 0, 33.33, 66.66, and 100, respectively. For the dependent variable (overall rating of care), respondents were asked to rate the overall care that they received during their inpatient visit on an 11-point scale of 0 to 10. These scores were converted to the same 0 (worst possible) to 100 (best possible) scale, where 1 equals 10, 2 equals 20, and so on. With respect to HCAHPS domains, mean scores were calculated using the following formula: sum of normalized question scores in the domain / number of questions in the domain. For example, the mean domain score for nurse communication, which comprises 3 questions, was calculated by dividing the sum of the nurse respect, nurse listening, and nurse explanation normalized question scores by 3. This process was completed for each domain score. A listing of the domains with their corresponding questions is provided in Table 1.
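The normalization and domain-mean arithmetic above can be sketched as follows. The 4-point mapping follows the example given in the text; the function and variable names are assumptions for illustration.

```python
# Sketch of the score normalization and domain-mean calculation described
# above. The 4-point mapping follows the example in the text; the 11-point
# overall rating maps 0-10 onto 0-100 (1 equals 10, 2 equals 20, and so on).

CATEGORICAL_4PT = {"never": 0.0, "sometimes": 33.33,
                   "usually": 66.66, "always": 100.0}

def normalize_overall(rating_0_to_10):
    """Convert the 11-point overall rating to the 0-100 scale."""
    return rating_0_to_10 * 10.0

def domain_mean(normalized_question_scores):
    """Sum of normalized question scores in the domain divided by the
    number of questions in the domain."""
    return sum(normalized_question_scores) / len(normalized_question_scores)

# Example: nurse communication domain (nurse respect, listening, explanations)
nurse_items = [CATEGORICAL_4PT[a] for a in ("always", "usually", "always")]
nurse_domain_score = domain_mean(nurse_items)   # (100 + 66.66 + 100) / 3
```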
Table 1.

Inpatient Survey Domains and Question Composition.

Domain | Questions
Communication from nurses | Nurse respect; Nurse listening; Explanations from nurses
Communication from doctors | Doctor respect; Doctor listening; Explanations from doctors
Responsiveness of hospital staff | Call button response; Bathroom assistance
Pain management | Help with pain; Pain control
Communication about medicines | New medicine purpose; New medicine side effects
Cleanliness of hospital | Room cleanliness
Quietness of hospital | Room quietness
Discharge information | Help after discharge; Symptoms after discharge
The characteristics of the sample were generated using descriptive statistics. The relation between normalized individual questions and overall rating of care (objective a) was calculated using the Spearman correlation statistic. The relation between normalized domain scores and overall rating of care (objective b) was calculated using the Pearson correlation statistic. All analyses were performed using SAS version 9.3 for Windows (SAS Institute Inc, Cary, North Carolina). P values of less than .05 were considered statistically significant for all comparisons.
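The two correlation statistics can be illustrated on a small made-up sample. This is purely a sketch using SciPy; the values below are invented, and the actual analysis was run in SAS on the full survey data.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

# Toy illustration of the two correlation analyses described above. The
# arrays are made up, not study data; the study used SAS version 9.3 on
# 27 369 completed surveys.
overall = np.array([10.0, 30.0, 50.0, 70.0, 90.0])   # normalized overall rating
item = np.array([0.0, 33.33, 33.33, 66.66, 100.0])   # one normalized 4-point item
domain = np.array([20.0, 35.0, 45.0, 80.0, 85.0])    # one normalized domain mean

# (a) individual question vs overall rating: Spearman rank correlation
rho, p_rho = spearmanr(item, overall)

# (b) domain score vs overall rating: Pearson correlation
r, p_r = pearsonr(domain, overall)
```

Spearman operates on ranks (appropriate for ordinal item responses), while Pearson measures the linear association of the continuous domain means, mirroring the objective (a)/(b) split above.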

Results

Over the 3-year study period (April 2011 to March 2014), 27 492 inpatient experience surveys were completed. One hundred twenty-three patients did not provide a response to the “overall rating of care” question. These surveys were excluded from analysis, resulting in a final sample of 27 369 completed surveys (99.6% of the original cohort). Characteristics of the final sample are provided in Table 2. Our survey cohort was primarily female (64.7%) and between 25 and 74 years of age (74.8%). The majority of respondents were born in Canada (85.3%) and primarily spoke English at home (90.5%). The mean age of the cohort was 53.8 ± 20.0 years (median = 56.0), and the mean length of stay was 5.4 ± 9.3 days (median = 3.0).
Table 2.

Characteristics of Sample.a

Characteristic | n | Percentage of Sample
Sex
 Male | 9665 | 35.3
 Female | 17 704 | 64.7
Age, in years
 18-24 | 1863 | 6.8
 25-34 | 5028 | 18.4
 35-44 | 2916 | 10.7
 45-54 | 3439 | 12.6
 55-64 | 4683 | 17.1
 65-74 | 4424 | 16.2
 75-79 | 2011 | 7.4
 80 and older | 3005 | 11.0
Birth location (n = 27 351)
 Canada | 23 323 | 85.3
 Outside Canada | 4028 | 14.7
Primary language spoken at home (n = 27 338)
 English | 24 728 | 90.5
 Other | 2610 | 9.5
Marital status (n = 27 203)
 Single (never married) | 2811 | 10.3
 Married/common-law/living with partner | 18 886 | 69.4
 Divorced/separated/widowed | 5506 | 20.2
Maximum education level (n = 26 214)
 Elementary or junior high | 3349 | 12.8
 Senior high (some or complete) | 8617 | 32.9
 College/technical school (some or complete) | 8602 | 32.8
 Undergraduate level (some or complete) | 4447 | 17.0
 Postgraduate degree complete | 1199 | 4.6
Length of hospital stay, days (n = 27 368)
 1.0-2.0 | 8149 | 29.8
 2.01-4.0 | 9433 | 34.5
 4.01-8.0 | 4866 | 17.8
 Greater than 8.0 | 4920 | 18.0

an = 27 369 unless otherwise stated.

Correlation results between individual questions and the patients’ overall rating of care are presented in Table 3. The ranked results, a brief description of each question, the possible answers, and the wording of each item (as read verbatim to the patient) are also presented. From this, the question pertaining to provider coordination was most correlated with overall rating of care (n = 27 258, r = .54, P < .001). Other top 5 ranking questions pertained to nurse follow-up (n = 26 533, r = .46, P < .001), nurse listening (n = 27 253, r = .45, P < .001), help with pain (n = 20 775, r = .42, P < .001), and nurse respect (n = 27 243, r = .41, P < .001). All nursing questions ranked within the top 7, with the lowest-ranking nursing question relating to nurse explanations (n = 27 131, r = .38, P < .001). Items pertaining to the physical attributes/environment of the hospital showed poor correlation with overall rating of care. These included room cleanliness (ranked 15th of 24; n = 26 944, r = .35, P < .001) and room quietness (ranked 18th of 24; n = 27 112, r = .31, P < .001). The 2 lowest-ranked questions related to discharge information: help after discharge (n = 24 103, r = .23, P < .001) and symptoms after discharge (n = 24 826, r = .16, P < .001).
Table 3.

Individual Item Correlations With Overall Hospital Experience.

Rank. Item (Spearman r, n). Possible answers. Wording of question/item (as read to the patient).

1. Provider coordination (r = .54, n = 27 258). Answers: excellent / very good / good / fair / poor. "How would you describe how well all of the health care professionals coordinated their efforts to serve your needs?"
2. Nurse follow-up (r = .46, n = 26 533). Answers: never / sometimes / usually / always. "How often did nurses follow-up on your concerns and observations?"
3. Nurse listening (r = .45, n = 27 253). Answers: never / sometimes / usually / always. "How often did nurses listen carefully to you?"
4. Help with pain (r = .42, n = 20 775). Answers: never / sometimes / usually / always. "How often did the hospital staff do everything they could to help you with your pain?"
5. Nurse respect (r = .41, n = 27 243). Answers: never / sometimes / usually / always. "How often did nurses treat you with courtesy and respect?"
6. Physician follow-up (r = .39, n = 25 756). Answers: never / sometimes / usually / always. "How often did doctors follow-up on your concerns and observations?"
7. Nurse explanations (r = .38, n = 27 131). Answers: never / sometimes / usually / always. "How often did nurses explain things in a way that you could understand?"
8. Call button response (r = .38, n = 20 424). Answers: never / sometimes / usually / always. "After you pressed the call button, how often did you get help as soon as you wanted it?"
9. New medicine side effects (r = .38, n = 14 261). Answers: never / sometimes / usually / always. "Before giving you any new medicine, how often did hospital staff describe possible side effects in a way you could understand?"
10. Patient involvement in decisions (r = .38, n = 25 588). Answers: no, I wanted to be more involved / yes, somewhat / yes, definitely. "Did you have enough involvement in decisions about your treatment?"
11. Patient preferences (r = .37, n = 25 431). Answers: strongly disagree / disagree / agree / strongly agree. "The hospital staff took your preferences and those of your family or caregiver into account in deciding what your health care needs would be when you left the hospital."
12. Bathroom assistance (r = .36, n = 11 545). Answers: never / sometimes / usually / always. "How often did you get help getting to the bathroom or in using a bedpan as soon as you wanted?"
13. Physician listening (r = .35, n = 26 945). Answers: never / sometimes / usually / always. "How often did doctors listen carefully to you?"
14. Pain control (r = .35, n = 20 729). Answers: never / sometimes / usually / always. "How often was your pain well controlled?"
15. Room cleanliness (r = .35, n = 26 944). Answers: never / sometimes / usually / always. "How often were your room and bathroom kept clean?"
16. Physician explanations (r = .33, n = 27 008). Answers: never / sometimes / usually / always. "How often did doctors explain things in a way that you could understand?"
17. Physician respect (r = .32, n = 27 073). Answers: never / sometimes / usually / always. "How often did doctors treat you with courtesy and respect?"
18. Room quietness (r = .31, n = 27 112). Answers: never / sometimes / usually / always. "How often was the area around your room quiet at night?"
19. Patient discharge information (r = .31, n = 27 003). Answers: strongly disagree / disagree / agree / strongly agree. "When you left the hospital, you had a clear understanding of the things that you were responsible for in managing your health."
20. New medicine purpose (r = .30, n = 14 620). Answers: never / sometimes / usually / always. "Before giving you any new medicine, how often did hospital staff tell you what the medicine was for?"
21. Patient discharge medications (r = .27, n = 25 438). Answers: strongly disagree / disagree / agree / strongly agree. "When you left the hospital, you clearly understood the purpose for taking each of your medications."
22. Family involvement (r = .26, n = 19 719). Answers: not as much as I wanted / as much as I wanted / more than I wanted. "During your hospital stay, how much did hospital staff include your family or someone close to you in decisions about your care?"
23. Help after discharge (r = .23, n = 24 103). Answers: yes / no. "During your hospital stay, did doctors, nurses or other hospital staff talk with you about whether you would have the help you needed when you left the hospital?"
24. Symptoms after discharge (r = .16, n = 24 826). Answers: yes / no. "During this hospital stay, did you get information, in writing, about what symptoms or health problems to look out for, after you left the hospital?"

aAll correlations were significant at the P < .01 level.

Patient-level domain correlation results are shown in Table 4. Communication with nurses was the domain most correlated with overall rating of care (r = .60, P < .001). Four domains showed similar correlation with overall rating of care: responsiveness of hospital staff (r = .49, P < .001), pain management (r = .48, P < .001), communication with doctors (r = .43, P < .001), and communication about medicines (r = .42, P < .001). Cleanliness of the hospital (r = .35, P < .001), quietness of the hospital (r = .30, P < .001), and discharge information (r = .29, P < .001) were the 3 domains least correlated with overall rating of care.
Table 4.

Patient-Level Domain Correlations.

Domain | Nur | Doc | Resp | Pain | Med | Clean | Quiet | Disch | Overall | Rec
Communication with nurses (Nur) | 1 | 0.43 | 0.56 | 0.52 | 0.45 | 0.30 | 0.23 | 0.28 | 0.60 | 0.54
Communication with doctors (Doc) | | 1 | 0.32 | 0.34 | 0.38 | 0.18 | 0.16 | 0.29 | 0.43 | 0.41
Responsiveness of hospital staff (Resp) | | | 1 | 0.47 | 0.37 | 0.28 | 0.26 | 0.22 | 0.49 | 0.43
Pain management (Pain) | | | | 1 | 0.35 | 0.23 | 0.22 | 0.24 | 0.48 | 0.43
Communication about medicines (Med) | | | | | 1 | 0.23 | 0.20 | 0.37 | 0.42 | 0.37
Cleanliness of hospital (Clean) | | | | | | 1 | 0.25 | 0.11 | 0.35 | 0.28
Quietness of hospital (Quiet) | | | | | | | 1 | 0.08 | 0.30 | 0.22
Discharge information (Disch) | | | | | | | | 1 | 0.29 | 0.29
Overall hospital rating (Overall) | | | | | | | | | 1 | 0.68
Recommend the hospital (Rec) | | | | | | | | | | 1

Abbreviation: HCAHPS, Hospital Consumer Assessment of Healthcare Providers and Systems.

aPatient-level Pearson correlations of rescaled linear means of HCAHPS measures for patients discharged between April 2011 and March 2014 (27 369 surveys). All correlations are significant at the P < .001 level.


Discussion

The present study provides novel information on the comparison of individual HCAHPS questions to overall rating of care in the inpatient setting, something that, to our knowledge, has not been done previously. Second, correlation results comparing HCAHPS domains to overall rating of care in the Canadian context are shown. Our main finding was that staff-based questions (eg, staff coordination, nurse follow-up, and nurse listening) and domains (eg, communication with nurses and responsiveness of hospital staff) were more correlated with overall rating of care than items and domains pertaining to physical features (eg, hospital cleanliness and quietness) and care processes (eg, discharge information). The domain-based findings are similar to those observed in the United States, as published by the CMS (7). Similar results have also been reported in settings as remote as rural China (13), which speaks to the robustness of these findings.

Our study results are timely. With the introduction of the Affordable Care Act, hospital reimbursement now focuses, in part, upon the quality of services delivered rather than volume. HCAHPS performance is directly tied to a portion of hospital funding, providing a clear incentive to improve the care that is delivered to patients. Poor performance on the HCAHPS and other CMS programs, such as the Readmissions Reduction Program (14, 15) and the Hospital-Acquired Condition Reduction Program (16, 17), now results in financial penalties to poor-performing hospitals. Although hospitals that use the HCAHPS instrument may routinely obtain reports of their results from their survey vendor (eg, Press Ganey; Deirdre Mylod, personal communication, April 16, 2015), the information and methodology contained within these reports are not yet available within the public domain.
As such, we suggest that publicly reported results not only explore the correlations between domains and overall ratings of care but also include the results of individual questions. Additionally, the current manuscript presents an analytic plan that may allow organizations that conduct their own survey to reliably assess the key survey items that drive the overall experience scores of their inpatients.

Although somewhat intuitive given the "first face" role that nurses play with their patients, our findings document the importance of nursing questions and domains in contributing to the overall rating of care. In addition to coordination among providers, nurse listening, follow-up, respect, and explanations all figured prominently in our correlational analysis. Simply said, if patients have a good experience with their nurses, they tend to report a pleasant overall hospital experience in our inpatient setting. Recognizing this powerful relationship, strategies to engage nurses are a means of improving the patient experience.

A primary study limitation is that our survey was conducted by telephone. As such, our results may not apply when other modes, such as mail-outs or interactive voice response, are used. Prior to organization-wide inception, we conducted a pilot study which found differences in response rates and response patterns between mail and phone survey modes. This finding has been replicated in other health surveys (including HCAHPS), where telephone respondents typically rate their care experience more positively than those completing paper-based questionnaires (18-22). For this very reason, the CMS employs a mode adjustment algorithm when comparing survey results from varying modes (23, 24). A secondary limitation pertains to the study location.
As the study was conducted in Canada, a country with universal health care coverage, a similar investigation in the United States may be warranted owing to inherent differences in funding structure. Additionally, prospective participants with a strongly negative opinion of their inpatient care may have refused to take the survey. Given the low percentage of outright refusals over the study period (approximately 5% of all dialed numbers), we feel this to be of minimal concern.

A final limitation lies within the interpretation of our results. Despite showing poor correlation with overall experience, some items and domains may still provide excellent opportunities for quality improvement. In our own analysis, hospital cleanliness was only weakly correlated with overall experience scores. However, we do not advocate discounting hospital cleanliness as a priority: patients view cleanliness as a marker of quality, and it has been associated with hospital-acquired infections (25). Qualitative reports of what patients deem important may provide additional value.

In summary, our findings replicate those of US-based HCAHPS-reporting hospitals, which showed that staff-based domains were most correlated with overall hospital experience. Our investigation delved one level deeper by examining the relationships between individual questions and overall rating of care. As with the domains, staff-based items, particularly those relating to staff coordination and nursing care, were most correlated with overall rating of care. Interestingly, hospital cleanliness, quietness, and questions pertaining to discharge planning did not correlate highly with overall rating of care. Our results provide excellent opportunities for targeted quality improvement initiatives in our jurisdiction as well as the broader Canadian context.
Based on our findings, we advocate that our health care organization should aim to improve overall inpatient care by commencing with initiatives to improve staff-related items (eg, staff coordination and interactions with patients), as these were most correlated with the overall rating. Perhaps most importantly, other organizations may use our methodology to determine additional areas in which to focus their quality improvement efforts.
References

1.  Comparing telephone and mail responses to the CAHPS survey instrument. Consumer Assessment of Health Plans Study.

Authors:  F J Fowler; P M Gallagher; S Nederend
Journal:  Med Care       Date:  1999-03       Impact factor: 2.983

2.  Further evaluations of the PJHQ scales.

Authors:  R D Hays; E C Nelson; H R Rubin; J E Ware; M Meterko
Journal:  Med Care       Date:  1990-09       Impact factor: 2.983

3.  Equivalence of mail and telephone responses to the CAHPS Hospital Survey.

Authors:  Han de Vries; Marc N Elliott; Kimberly A Hepner; San D Keller; Ron D Hays
Journal:  Health Serv Res       Date:  2005-12       Impact factor: 3.402

4.  Measuring hospital care from the patients' perspective: an overview of the CAHPS Hospital Survey development process.

Authors:  Elizabeth Goldstein; Marybeth Farquhar; Christine Crofton; Charles Darby; Steven Garfinkel
Journal:  Health Serv Res       Date:  2005-12       Impact factor: 3.402

5.  Patient satisfaction measurement strategies: a comparison of phone and mail methods.

Authors:  T E Burroughs; B M Waterman; J C Cira; R Desikan; W Claiborne Dunagan
Journal:  Jt Comm J Qual Improv       Date:  2001-07

6.  Keeping score on cleanliness. How to improve your HCAHPS ratings. Interview by Bob Kehoe.

Authors:  Deirdre Mylod
Journal:  Health Facil Manage       Date:  2013-05

7.  Analyzing patient satisfaction: a multianalytic approach.

Authors:  S Abramowitz; A A Coté; E Berry
Journal:  QRB Qual Rev Bull       Date:  1987-04

8.  Patient experiences with inpatient care in rural China.

Authors:  Heather Sipsma; Yu Liu; Hong Wang; Yan Zhu; Lei Xue; Rachelle Alpern; Martha Dale; Elizabeth Bradley
Journal:  Int J Qual Health Care       Date:  2013-06-27       Impact factor: 2.038

9.  How does satisfaction with the health-care system relate to patient experience?

Authors:  Sara N Bleich; Emre Ozaltin; Christopher K L Murray
Journal:  Bull World Health Organ       Date:  2009-04       Impact factor: 9.408

10.  Testing survey methodology to measure patients' experiences and views of the emergency and urgent care system: telephone versus postal survey.

Authors:  Alicia O'Cathain; Emma Knowles; Jon Nicholl
Journal:  BMC Med Res Methodol       Date:  2010-06-09       Impact factor: 4.615

