
The DEPICT model for participatory qualitative health promotion research analysis piloted in Canada, Zambia and South Africa.

Sarah Flicker, Stephanie A Nixon.

Abstract

Health promotion researchers are increasingly conducting Community-Based Participatory Research in an effort to reduce health disparities. Despite efforts towards greater inclusion, research teams continue to regularly exclude diverse representation from data analysis efforts. The DEPICT model for collaborative qualitative analysis is a democratic approach to enhancing rigour through inclusion of diverse stakeholders. It is broken down into six sequential steps. Strong leadership, coordination and facilitation skills are needed; however, the process is flexible enough to adapt to most environments and varying levels of expertise. Including diverse stakeholders on an analysis team can enrich data analysis and provide more nuanced understandings of complicated health problems.
© The Author (2014). Published by Oxford University Press.

Keywords:  collaborative; community-based participatory research; methods; qualitative research

Year:  2014        PMID: 24418997      PMCID: PMC4542917          DOI: 10.1093/heapro/dat093

Source DB:  PubMed          Journal:  Health Promot Int        ISSN: 0957-4824            Impact factor:   2.483


Public health researchers interested in health promotion are increasingly conducting Community-Based Participatory Research in an effort to reduce health disparities (Jones and Wells, 2007; Flicker). They are inviting community members (those affected personally or professionally by the issues under study) to participate more centrally in ‘all’ aspects of the research process (Israel). Yet, multiple in-depth evaluations show that community members are more likely to participate in research design, data collection and dissemination than in data analysis (Flicker; Khodyakov). Data analysis is widely perceived as a technical skill requiring particular expertise. It is assumed to be time-consuming, boring, tedious and/or mysterious (Nind, 2011). Strong literacy or numeracy skills are expected. Consequently, community members often willingly decline participation (Stoecker, 1999). Some are never invited. Others argue that diverting scarce community resources towards developing analysis skills may be inappropriate, particularly if academic partners are well positioned to assume these tasks (Cashman). However, promoting more equitable research relationships that enhance rigour may require challenging these perceptions and providing opportunities for more meaningful participation (Macaulay). Lack of community involvement in data analysis excludes those with much to lose from key decisions made during data interpretation. Rather than adopt a deficit model (i.e. partners are too unskilled/immature/illiterate/busy), some researchers are devising accessible analysis opportunities to build on the skills, talents and knowledge of community members (Cashman; Flicker, 2008; Jackson, 2008; Daley; Nind, 2011). Recognizing that community members bring rich lay expertise, these pioneers of collaborative analysis are finding creative ways to make health research inclusive, participatory, rigorous and, often, more fun.
This article describes ‘DEPICT’, an approach to collaborative qualitative data analysis designed to involve individuals with varying levels of research proficiency. Team members may include patients, activists, community-based service providers, community members, students, clinicians and/or interdisciplinary researchers. The title acronym was chosen deliberately: to depict is an active verb, meaning to describe using words, which is a core activity within analysis for qualitative health researchers. DEPICT has six sequential steps, as outlined below and summarized in Table 1. While many of the steps or elements herein may be familiar to seasoned qualitative health researchers, what makes this model unique is its emphasis on participation, collaboration and transparency. We codified DEPICT over the last decade, drawing on experience from over a dozen health and HIV research partnerships. Table 2 describes six sample projects. Some of these projects were large, interdisciplinary, international endeavours; others were smaller local efforts. Their common thread is a strong social justice orientation that sought to find new health promotion possibilities by using more inclusive research practices.
Table 1:

DEPICT steps, roles and guiding questions

DEPICT step: Dynamic reading
  Coordination functions: Collate, assign and distribute a subset of transcripts to each team member. Set deadlines and meeting times.
  Team member roles: Review a subset of assigned transcripts. Record notes on important concepts.
  Questions to ask: What ideas seem to be important in these texts? (inductive)

DEPICT step: Engaged codebook development
  Coordination functions: Assemble supplies (e.g. post-it notes, pens) and arrange for team meetings. Ensure skilled meeting facilitation. Ensure that a preliminary codebook is developed. Coordinate pilot testing and refining of codebook.
  Team member roles: List important ideas for categorizing data. As a group, organize categories into clusters. Come to consensus around a preliminary codebook. Participate in pilot testing.
  Questions to ask: What is our agreed-upon list of categories and sub-categories that we will use for our codebook? Do we have the right categories? Do we all understand what they mean and how to apply them? Do any require further refinement?

DEPICT step: Participatory coding
  Coordination functions: Assign and distribute a subset of transcripts for coding to each team member. Set deadlines and meeting times. Provide training and support for novices. Coordinate a strategy for managing the data.
  Team member roles: Review and code each assigned transcript. Return coding work to coordinator (in paper or electronic form).
  Questions to ask: Which sections of the transcript fit into which categories of our codebook?

DEPICT step: Inclusive reviewing and summarizing of categories
  Coordination functions: Generate a list of quotes associated with each category. Assign team members a sub-set of categories to summarize. Distribute guiding worksheets for summarizing categories.
  Team member roles: Work alone or in pairs to develop category summaries. Return work to the coordinator.
  Questions to ask: What are the main ideas? Where is there disagreement? What are some key quotes? Are there silences worth noting? What else is important to note that might help in the analysis of the larger project?

DEPICT step: Collaborative analyzing
  Coordination functions: Arrange for one or more team meetings. Ensure skilled meeting facilitation. Select a note-taker in advance. Prior to the meeting, disseminate summaries for review. Ensure that consensus is reached and recorded on new understandings of the data.
  Team member roles: Review summaries prior to the meeting. Participate in a collaborative meeting to make sense of data. Graphically depict or create a figure that illustrates findings. Come to consensus on new understandings emerging from the data and what needs to be shared.
  Questions to ask: What does it all mean? What were our most important findings? What do we need to share and with whom? What questions do we still have? For critical analyses, what structural factors may help us understand why people chose to tell us the stories they shared (e.g. homophobia, neoliberalism)?

DEPICT step: Translating
  Coordination functions: Arrange for team meeting(s). Ensure skilled meeting facilitation. Circulate meeting report with clear action items.
  Team member roles: Develop a knowledge translation and exchange plan for sharing research results with all relevant stakeholders. Create a plan for equitably distributing this work.
  Questions to ask: Who needs to know what? How do they need to hear it? Who are the best messengers? How do we get the word out? Who on our team will be responsible for what and by when?
Table 2:

Description of six studies that have used the DEPICT approach to data analysis [a]

Study name: Picture This (McLelland et al., 2012)
  Objective: To investigate understandings of safer sex behaviours among queer and trans youth labelled with intellectual disabilities
  Data: In-depth interviews, focus groups and arts-based methods with 10 youth
  Team members involved in DEPICT analysis: 3 university-based researchers from different disciplines; 2 graduate students; 1 coordinator; 2 social workers from a community organization; all in Canada

Study name: HIV Intervention Evaluation (Nixon et al., 2011)
  Objective: To evaluate the impacts of a school-based HIV prevention programme on adolescents in a high HIV-prevalence setting in South Africa
  Data: Focus-group discussions in English and isiZulu with 105 participants, including students, parents, teachers and programme staff
  Team members involved in DEPICT analysis: 3 university-based researchers from Canada; 5 junior researchers or students from a research centre in South Africa

Study name: Positive Youth Project (Flicker et al., 2004)
  Objective: To explore options for better supporting young people living with HIV in Canada
  Data: In-depth interviews with 35 HIV-positive youth
  Team members involved in DEPICT analysis: 1 graduate student; 4 youth living with HIV; 2 support workers; 1 physician; all in Canada

Study name: Sepo Study (Wickenden et al., under review)
  Objective: To explore the health-equity experiences of people with disabilities in Zambia who have become HIV-positive
  Data: In-depth interviews with 21 people with disabilities in Zambia who had become HIV-positive and with 11 key informants working in the field of HIV and disability
  Team members involved in DEPICT analysis: 5 university-based researchers in Canada; 3 graduate students in Canada; 1 university-based researcher in South Africa; 2 activists from a community-based disabled people's organization in Zambia

Study name: The Poz-Brain Study (Gallagher et al., 2012)
  Objective: To use a disability framework to explore the experiences of women with HIV-associated neurocognitive challenges
  Data: In-depth interviews with 12 HIV-positive women
  Team members involved in DEPICT analysis: 1 university-based rehabilitation researcher; 5 Master's-level physiotherapy students; 1 psychiatrist; 1 neuropsychologist; all in Canada

Study name: Taking Action! (Flicker et al., 2012)
  Objective: To explore how Aboriginal youth link structural inequalities with individual risk, HIV and Aboriginal culture(s)
  Data: Arts-based methods and in-depth interviews with 89 Aboriginal youth
  Team members involved in DEPICT analysis: 5 university-based health and social science researchers; 5 graduate students; 6 Aboriginal youth leaders; all in Canada

[a] Due to space limitations we have included only one reference for each study.

As this article is concerned with data analysis and interpretation, we assume that data (e.g. focus groups, individual interviews, field notes, art pieces) have already been collected, transcribed and organized. We recognize that in most qualitative research projects analysis begins before this point. Important analytical decisions are made throughout the research design and data collection stages (Creswell, 2007). Projects with a strong Community-Based Participatory Research orientation often include diverse stakeholders in these earlier steps (Minkler and Wallerstein, 2003; Daley). In most cases described herein, trained moderators/interviewers were also part of the analysis team. We start here because, in our experience, this is where there is often a gap in involvement. We also expect that ethical clearances have been negotiated and team members are trained in handling confidential documents. (In Canada and the African contexts where we work, our universities expect that all those handling confidential research data be trained in appropriate data management. The form and content of the training are left up to individual research teams. We recognize that systems may be more formal in other research environments.)

THE DEPICT MODEL

Dynamic reading

DEPICT begins with each team member reading a subset of the transcripts. Reviewing many long transcripts can be onerous. A participatory framework divides the work. Someone familiar with the entire data set (e.g. one who conducted interviews or transcription) assigns three to five transcripts to each team member. Ideally, assignments correspond with professional or personal experience. Readers are encouraged to recall the research questions (e.g. post them nearby) and record important topics as they read. To generatively determine significance, team members are encouraged to draw on their lay, professional and academic perspectives. Many find it useful to use highlighters or make marginal notes. Passive reading is discouraged. Instead, readers are encouraged to dynamically engage with material by asking questions, identifying themes and linking ideas. This step can be coordinated virtually and can begin as soon as several transcripts are ready. Reviews can be completed individually over a period of weeks. In settings with limited literacy skills, audio files can be shared. Alternatively, in the Positive Youth Project, team members took turns slowly reading transcripts to each other.
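For teams that track reading assignments electronically, the distribution logic described above can be sketched in a few lines of Python. This is a hypothetical illustration only (the file names, team roles and per-member cap are invented for the example); many teams will simply use a spreadsheet.

```python
# Hypothetical sketch (not part of DEPICT itself): distribute transcripts
# so each team member receives a small, manageable subset to read.
from itertools import cycle

def assign_transcripts(transcripts, members, per_member=4):
    """Round-robin assignment; returns ({member: [transcript, ...]}, overloaded)."""
    assignments = {m: [] for m in members}
    turn = cycle(members)
    for t in transcripts:
        assignments[next(turn)].append(t)
    # Flag anyone whose load exceeds the agreed maximum.
    overloaded = [m for m, ts in assignments.items() if len(ts) > per_member]
    return assignments, overloaded

# Invented example: 10 transcripts shared among a three-person team.
transcripts = [f"interview_{i:02d}.txt" for i in range(1, 11)]
team = ["youth_researcher", "service_provider", "grad_student"]
assignments, overloaded = assign_transcripts(transcripts, team)
```

The same round-robin idea works whether assignments are recorded in code, a spreadsheet or on paper; what matters for the model is that every transcript has a reader and no reader is overloaded.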

Engaged codebook development

The goal of this step is to develop a list of categories that can subsequently be used to organize, or code, the data. These categories will be based on ideas identified while reading the data (i.e. inductive), but can also include topics based on the research questions, literature or lived or professional experiences of team members (i.e. deductive) (Pope ). This step typically occurs in a meeting with team members in the room. However, in the HIV Intervention Evaluation, team members in Canada and South Africa collaborated on codebook development via Skype. We facilitate this process by distributing sticky notes and asking team members to write down specific ideas that emerged during their transcript review (one idea per sheet). Team members post notes on a wall and cluster them collaboratively into categories (and, if appropriate, sub-categories). Each category (and sub-category) is named and defined. This can be done in a plenary format or smaller groups. The result is a preliminary codebook. To pilot and refine the codebook, team members code several transcripts and then discuss how well the framework was able to capture the range of perspectives within the data set. The aim is to improve shared understanding, or stability, of each category. In the Sepo Study, transcripts were accessed online by team members in three countries who piloted the preliminary codebook independently. In comparing experiences, South African colleagues identified an important missing code (religion/faith) that was then added.
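For teams that keep an electronic codebook, the categories, sub-categories and definitions agreed at this step can be recorded in a simple data structure so that definitions travel with the category names. The sketch below is illustrative only; the category names and definitions are invented, not drawn from any of the studies described here.

```python
# Illustrative only: recording a preliminary codebook as a plain data
# structure. Category names and definitions are invented for the example.
codebook = {
    "barriers_to_care": {
        "definition": "Obstacles participants describe in accessing services.",
        "subcategories": {
            "stigma": "Fear of disclosure or judgement.",
            "cost": "Financial obstacles, including transport.",
        },
    },
    "sources_of_support": {
        "definition": "People or programmes participants name as helpful.",
        "subcategories": {},
    },
}

def all_codes(cb):
    """Flatten categories and sub-categories into a single coding checklist."""
    codes = []
    for category, entry in cb.items():
        codes.append(category)
        codes.extend(f"{category}/{sub}" for sub in entry["subcategories"])
    return codes

codes = all_codes(codebook)
```

Keeping the definition beside each name supports the pilot-testing question above ("Do we all understand what they mean and how to apply them?"), since every coder works from the same written definition.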

Participatory coding

In this step, the final codebook is circulated. Each team member is given responsibility for coding a subset of the transcripts, typically including those reviewed earlier (Ryan and Bernard, 2000). In lower literacy and/or lower-technology settings, pairs can review paper copies of transcripts, cut them up with scissors, and organize excerpts according to the codebook categories. This low-tech option, which is similar to Krueger's ‘long-table’ approach, worked well with Indigenous youth involved in Taking Action (Morgan and Krueger, 1997; Krueger and Casey, 2009). Alternatively, qualitative data management software (e.g. NVivo) can be used. The Positive Youth Project had trained team members work on a shared file on a project computer on their own time. Alternatively, systems like NVivo Server allow multiple users to code concurrently. In the Poz-Brain and Sepo Studies, coordinators were responsible for managing data in NVivo, but exported line-numbered transcripts into Word for coding by all team members using a coding worksheet. Coordinators then inputted worksheet data into NVivo. Decisions about whether, when and how to use software should be negotiated on a case-by-case basis, taking into account local technological and other literacy needs and resources. To improve rigour, we have every transcript coded by at least two team members. More inclusive coding safeguards against important data being omitted. Sharing the burden of coding means that large amounts of data can be managed efficiently. It also builds deeper familiarity, and a sense of ownership and engagement in the project, which can be challenging to foster in large research teams.
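Where coding assignments are managed in software, the double-coding rule described above (every transcript coded by at least two team members) can be sketched as follows. This is an assumption-laden illustration rather than part of the DEPICT model itself; the coder and transcript names are invented.

```python
# Hypothetical sketch of the double-coding rule: assign every transcript
# to at least two distinct coders, spreading the load evenly.
from itertools import cycle

def double_code_assignments(transcripts, coders, copies=2):
    """Assign each transcript to `copies` distinct coders (requires len(coders) >= copies)."""
    turn = cycle(coders)
    plan = {}
    for t in transcripts:
        assigned = []
        while len(assigned) < copies:
            c = next(turn)
            if c not in assigned:
                assigned.append(c)
        plan[t] = assigned
    return plan

# Invented example: six transcripts, three coders, two coders per transcript.
plan = double_code_assignments(
    [f"interview_{i:02d}" for i in range(1, 7)],
    ["coder_a", "coder_b", "coder_c"],
)
```

Cycling through the coder list keeps loads balanced while guaranteeing that no transcript depends on a single reader's judgement.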

Inclusive reviewing and summarizing of categories

In this step, the coded data in each category are re-reviewed side by side to explore the diverse ways that the idea has been taken up by participants. We encourage dyads to look for convergent and divergent viewpoints, note surprising silences and choose emblematic quotes. We recommend pairing those with more and less experience for this activity (e.g. youth–adult or community member–scholar). This helps to mitigate power imbalances and gives people time to reflect, question, clarify and champion alternative perspectives. Each duo develops a two-page summary for each code. This distillation renders a large quantity of data more accessible to the team. In the HIV Intervention Evaluation, this process illuminated competing viewpoints about whether the intervention was a ‘success’ or ‘failure.’ Drawing on the data, these disagreements were recorded and highlighted in the summaries.
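For teams whose coded excerpts live in a spreadsheet or exported worksheet, generating the per-category quote lists that feed these summaries can be sketched as below. The records shown are invented for illustration; real excerpts would come from the team's coding output.

```python
# Illustrative sketch: group coded excerpts by category so each summarizing
# pair receives the quotes for its assigned categories.
from collections import defaultdict

def quotes_by_category(coded_excerpts):
    """Turn (transcript, category, quote) records into per-category quote lists."""
    grouped = defaultdict(list)
    for transcript, category, quote in coded_excerpts:
        grouped[category].append((transcript, quote))
    return dict(grouped)

# Invented example records.
records = [
    ("interview_01", "stigma", "I never told anyone at school."),
    ("interview_02", "stigma", "People talk, so I keep quiet."),
    ("interview_02", "support", "My aunt came to every appointment."),
]
summaries_input = quotes_by_category(records)
```

Reading each category's quotes side by side is what lets a summarizing pair spot convergent and divergent viewpoints, and notice which transcripts are silent on a category.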

Collaborative analyzing

Next, we move from the descriptive to the analytical. It is rarely pragmatic for everyone on a research team to review all raw coded data. Summaries can, however, be amalgamated into a draft descriptive report that can springboard informed collaborative discussions. In Taking Action, a draft report was generated by a sub-group and then used by the larger team as the basis for a generative analytic discussion. In Poz-Brain, summaries were discussed one by one. This is the time for teams to convene and revisit the original project objectives, and ask: ‘What did we learn?’ This step requires adequate time for reflection (at least half a day) and the participation of as many team members as possible. We recommend posting the research questions to help ground deliberations. A helpful technique for developing conceptual clarity is creating a graphic that visually represents key findings (Jackson, 2008). A skilled facilitator and a note-taker are necessary. In Picture This, we convened a four-hour analysis meeting to identify our three most important sets of results, deliberate on their meanings, and contextualize our findings. Social scientists were particularly interested in findings related to HIV prevention; some of the service providers were concerned with the new ethical dimensions that the research uncovered. Both topics were the focus of discussion, with the ultimate result being separate academic outputs. This stage can sometimes be contentious. In our experience, interpretations were rarely polarized by social location (e.g. academic vs. community). Rather, we understood heterogeneous differences in interpretation as a natural by-product of engaging multiple and diverse stakeholders in the interpretive process. While we often strived to arrive at consensus, it is important to recognize and make room for competing interpretations.
Surfacing and exploring diverse viewpoints can lead to richer dialogue, and this added layer of complexity is often welcomed by journal editors and other readers.

Translating

The final DEPICT step is creating a dissemination plan. Different audiences might require unique products and messaging (Reardon). In Taking Action, youth researchers created a compelling comic for peer outreach that used humour to highlight key results. With Sepo, we wrote several journal articles and two plain-language reports. Leveraging the personal and professional contacts of service providers on the team, we also shared results directly with key Zambian policy-makers and disabled people's organizations. We also decided to pursue further funding to design an intervention based on the findings. It is important to set parameters regarding who will take the lead on various outputs. We usually try to generate two- to three-person teams to lead particular manuscript or report development (with different team members leading on different products).

BENEFITS ASSOCIATED WITH DEPICT

Table 3 summarizes benefits and challenges associated with DEPICT. The primary strength of DEPICT is that it democratizes and demystifies qualitative analysis. Articulating the process into concrete, feasible steps enables the participation of multiple stakeholders. DEPICT has worked well in diverse collaborations, including with colleagues and community partners in low-income countries as well as marginalized populations in Canada (e.g. youth living with HIV or one or more disabilities).
Table 3:

Benefits and challenges of DEPICT

Benefits:
- Democratizes analysis to allow for participation of diverse local and global stakeholders
- Flexible to allow differential engagement according to interest, availability and geography
- Shares the burden of heavy workload
- Greater potential for productivity among the team
- Rigour enhanced through the explicit articulation of analytic steps
- Deeper and more diverse readings of the data due to engagement of multiple forms of expertise
- Facilitates ‘knowledge to action’
- Applicable with a variety of theoretical orientations (e.g. interpretive, critical)

Challenges:
- Skilled coordination necessary for all steps
- Expert facilitation necessary for several steps
- Requires principal investigator to share power and control
- Collaborative process may require skilled conflict management
- Issues of authorship on academic outputs can be contentious if not managed early and transparently
- Requires capacity building, since it is a new approach
- Ungrounded in a particular theoretical tradition
- Finding appropriate resources to remunerate community stakeholders for their time
We attribute this success in part to the model's flexibility. The multiple steps in the analysis process allow individuals to be as engaged as they desire according to interest and availability. There is a place for all members of a large team (regardless of experience) to have meaningful engagement. The process works well in person or virtually, enabling participation from any geographic location with Internet access or teleconference capability. DEPICT more equitably shares the workload burden and offers greater potential for productivity. More people are involved, and the wider range of expertise (academic and experiential) can lead to more creative interpretations of the data. As an example, debates on the Taking Action team about how to frame experiences of ongoing colonization led us to focus instead on more productive conversations about processes of de-colonization. The team approach also requires teams to be explicit about the process of analysis: justifying each analytic step with a group tends to enhance transparency and rigour. The engagement of all team members in analysis typically improves investment in the findings and their dissemination. By promoting dialogue among potentially diverse project partners about the relevance of the results, DEPICT incorporates a knowledge-to-action approach, which is more likely to have real-world impact (Reardon). For instance, our results have been used to establish new community programming and refine existing programming, spur advocacy campaigns and inform health policy. Many of those engaged in the process have also found the experience to be transformative. Several talked about leveraging the skills they learned through their participation in DEPICT analysis to apply for other opportunities. Several community partners decided to return to school for undergraduate or graduate degrees.
Conversely, several of our graduate students went to work with the communities with whom they had partnered once they completed their programmes of study. Finally, DEPICT has utility within various theoretical paradigms. For example, this approach can be used for interpretive studies (e.g. exploring the experience of living with HIV) or for studies employing a particular theoretical lens (e.g. post-structural).

CHALLENGES ASSOCIATED WITH DEPICT

The success of DEPICT depends on excellent coordination and facilitation skills. Finding the ‘right people’ for these tasks is critical. Managing large, busy health research teams can be difficult, but is a prerequisite of this approach. Learning to share control can be difficult for study leads, particularly if they are used to being perceived as the expert. Participatory approaches ask that we not only allow but also promote alternative viewpoints. Commitment to a respectful, participatory process demands patience, investment in relationship- and capacity-building, and careful attention to process issues. When working in large groups in which everyone has invested heavily in data analysis, authorship for research outputs can be contentious. We recommend collaboratively crafting transparent authorship guidelines, including ground rules for resolving potential conflicts. Despite multiple opportunities for sharing leadership, success depends heavily on having a champion who is committed to guiding the overall process. Finally, in our experience, it is important to ensure that all members of a research team are adequately remunerated for their time. This means that budgets need to be carefully developed to allow for adequate compensation for community members engaged in these time-consuming activities.

CONCLUSION

Democratizing research processes is viewed as a requirement for more meaningfully addressing health disparities, yet few guidelines exist for engaging diverse team members in data analysis. As Dennis Raphael argues, ‘the idea that health promotion can be carried out without immediate community involvement is antithetical to basic principles of health promotion. Such involvement also makes it likely that results of such activities can be used to good effect’ (Raphael, 2000, p. 364). DEPICT responds to this challenge by describing an approach to collaborative, inclusive qualitative health research analysis. When considering implementation, teams should reflect on their size, skills, experience, literacy, access to technology and locations. Sharing power and interpretative control creates the potential for new risks, but also new possibilities. In our experience, the effort has been well worth the investment.

FUNDING

Funding to pay the Open Access publication charges for this article was provided by the Ontario HIV Treatment Network, the Canadian Institutes of Health Research, the Faculty of Environmental Studies at York University and the Department of Physical Therapy at the University of Toronto.
References (14 in total)

1.  Participatory research maximises community and lay involvement. North American Primary Care Research Group.

Authors:  A C Macaulay; L E Commanda; W L Freeman; N Gibson; M L McCabe; C M Robbins; P L Twohig
Journal:  BMJ       Date:  1999-09-18

2.  Qualitative research in health care. Analysing qualitative data.

Authors:  C Pope; S Ziebland; N Mays
Journal:  BMJ       Date:  2000-01-08

3.  Who benefits from community-based participatory research? A case study of the Positive Youth Project.

Authors:  Sarah Flicker
Journal:  Health Educ Behav       Date:  2006-05-31

4.  Strategies for academic and clinician engagement in community-participatory partnered research.

Authors:  Loretta Jones; Kenneth Wells
Journal:  JAMA       Date:  2007-01-24

5.  A snapshot of community-based research in Canada: Who? What? Why? How?

Authors:  Sarah Flicker; Beth Savan; Brian Kolenda; Matto Mildenberger
Journal:  Health Educ Res       Date:  2007-02-25

6.  The power and the promise: working with communities to analyze data, interpret findings, and get to outcomes.

Authors:  Suzanne B Cashman; Sarah Adeky; Alex J Allen; Jason Corburn; Barbara A Israel; Jaime Montaño; Alvin Rafelito; Scott D Rhodes; Samara Swanston; Nina Wallerstein; Eugenia Eng
Journal:  Am J Public Health       Date:  2008-06-12

7.  Is 80% a passing grade? Meanings attached to condom use in an abstinence-plus HIV prevention programme in South Africa.

Authors:  Stephanie A Nixon; Clara Rubincam; Marisa Casale; Sarah Flicker
Journal:  AIDS Care       Date:  2011-02

8.  A participatory group process to analyze qualitative data.

Authors:  Suzanne F Jackson
Journal:  Prog Community Health Partnersh       Date:  2008

9.  Using focus groups in community-based participatory research: challenges and resolutions.

Authors:  Christine Makosky Daley; Aimee S James; Ezekiel Ulrey; Stephanie Joseph; Angelia Talawyma; Won S Choi; K Allen Greiner; M Kathryn Coe
Journal:  Qual Health Res       Date:  2010-02-12

10.  HIV-positive youth's perspectives on the Internet and e-health.

Authors:  Sarah Flicker; Eudice Goldberg; Stanley Read; Tiffany Veinot; Alex McClelland; Paul Saulnier; Harvey Skinner
Journal:  J Med Internet Res       Date:  2004-09-29
