Guidance for publishing qualitative research in informatics.

Jessica S Ancker1, Natalie C Benda2, Madhu Reddy3, Kim M Unertl1, Tiffany Veinot4.   

Abstract

Qualitative research, the analysis of nonquantitative and nonquantifiable data through methods such as interviews and observation, is integral to the field of biomedical and health informatics. To demonstrate the integrity and quality of their qualitative research, authors should report important elements of their work. This perspective article offers guidance about reporting components of the research, including theory, the research question, sampling, data collection methods, data analysis, results, and discussion. Addressing these points in the paper assists peer reviewers and readers in assessing the rigor of the work and its contribution to the literature. Clearer and more detailed reporting will ensure that qualitative research will continue to be published in informatics, helping researchers disseminate their understanding of people, organizations, context, and sociotechnical relationships as they relate to biomedical and health data.
© The Author(s) 2021. Published by Oxford University Press on behalf of the American Medical Informatics Association.

Keywords:  biases; data analysis; data collection; qualitative methods; qualitative research; reliability

Year:  2021        PMID: 34537840      PMCID: PMC8633663          DOI: 10.1093/jamia/ocab195

Source DB:  PubMed          Journal:  J Am Med Inform Assoc        ISSN: 1067-5027            Impact factor:   4.497


INTRODUCTION

Many informatics researchers, including those with training and expertise in quantitative or computational methods, come to recognize the value of applying qualitative methods. Qualitative research is the collection and analysis of nonquantitative and nonquantifiable data through methods such as interviews and observation to understand perspectives, beliefs, and experiences. Qualitative research is invaluable for understanding context, explaining phenomena and processes, understanding the rationale underlying behavior and decisions, generating hypotheses, and developing or extending theory about sociotechnical phenomena. In health and biomedical informatics, qualitative research is often used to gain insights into the patients and professionals who use informatics innovations, the contexts in which they live and work, and the life experiences that are the sources of the medical, technology, and digital trace data that informaticists analyze. As a recent scoping review noted, qualitative research represents a small but important portion of published articles in JAMIA. In this perspective, we provide a brief outline of expectations for qualitative research published in informatics. Our goal is to explain how authors can demonstrate rigor and avoid common pitfalls by reporting how they have sought to reduce bias, improve reliability, and verify their findings (Box 1). Moreover, we aim to provide guidance that researchers can use at the early stages of studies to ensure that they are able to rigorously report methodological details in a subsequent manuscript.

Theoretical rigor

All research, whether quantitative or qualitative, is strengthened by a firm foundation in relevant theory or an explanation of why new theory is needed. Authors should use their literature review to cite the relevant theory or theories that have grounded their work. For example, a study of the adoption of a novel technology would typically begin with a discussion of existing theories of technology diffusion and adoption. In some cases, existing theories are not adequate to describe the phenomenon being studied. For example, theories of technology adoption developed in advantaged populations may not be entirely relevant for understudied groups. In these cases, qualitative work may be needed to develop new concepts or theories or to extend existing ones. Grounded theory research, in particular, is designed to develop new theory. Authors using grounded theory should consider discussing current theories and explaining where they fall short, which will strengthen the rationale for theory-building work.

Rigor in stating the research question and clarifying the study design and methodological perspective

Like other research papers, qualitative research papers should contain a clear statement of the research question. Authors should describe the gap in knowledge, discuss why the question needs to be answered, and explain why qualitative or mixed-methods research is appropriate to answer it. Qualitative research questions should not be phrased as hypotheses. Describing the study design and stating the methodological approach that guided the work is extremely helpful, especially in an inherently multidisciplinary field such as informatics. The methodological approach should be aligned with the research question. For example, phenomenological research is appropriate for developing a nuanced, sensitive understanding of the lived experience of a phenomenon and the meaning attributed to it by those who experience it. Grounded theory researchers seek to develop novel social theories emerging from data analysis, especially around social processes. Recently, grounded theory-based mixed-methods research has also been proposed. Ethnography is useful for describing groups and interpreting their cultures, contexts, and shared meanings; such studies may use a combination of qualitative and quantitative data. A case study approach can prove particularly useful for evaluating interventions or programs or investigating critical events; these studies may also use mixed methods. Participatory design, usability research, and user-centered design are examples of qualitative or mixed-methods approaches intended to produce new technologies adapted to the needs and capabilities of their users and stakeholders. When using mixed methods, researchers should state their selected study design. 
Examples include sequential exploratory (qualitative then quantitative) to discover a new phenomenon and then determine its generalizability; sequential explanatory (quantitative then qualitative) to explain unexpected findings from quantitative work; measure development (quantitative, qualitative, then quantitative) in which qualitative methods are used to develop instruments for quantitative surveys; and parallel mixed designs such as evaluating the implementation of a technology intervention through qualitative research while simultaneously collecting quantitative usage data. The informatics literature includes many excellent examples of studies conducted from the perspective of grounded theory, phenomenology, user-centered design, case study, ethnography, and other qualitative and mixed-methods frameworks.

Rigor in sampling and justification of sample size

Whether researchers work directly with participants or conduct secondary analyses of existing data, they should demonstrate rigor in their sampling approach. What was the population, community, culture, or phenomenon of interest to the research question, and how did the researchers obtain a sample from it? Researchers should explain their sampling approach. Purposive (or purposeful) sampling describes a group of methods for recruiting nonprobability samples of individuals likely to have perspectives or experiences of interest. With purposive sampling, researchers should report how they identified the groups or perspectives for targeted recruiting (eg, whether it was on the basis of theory or empirical observation). If purposive sampling strategies such as quota sampling and typical case sampling were used to increase representativeness or oversample subgroups of interest, these methods should be described. In grounded theory, in which the goal is theory development, the sampling should be shown to be theoretically justified. For mixed-methods studies, the relationship between quantitative and qualitative samples and analyses should also be specified. Throughout, researchers should explain why their sampling method is appropriate to the research question, discuss whether sampling may be subject to biases, and discuss how they sought to address such biases. Informatics journals have shown an increased interest in health equity, and authors are encouraged to describe any sampling approaches intended to maximize inclusion of historically marginalized and underserved populations. One informatics-relevant example is a study of men who have sex with men. Researchers must also provide a convincing rationale for their sample size.
A few papers in the qualitative literature have suggested that small sample sizes are sufficient for interviews and focus groups with human subjects. However, these citations are better suited to the planning stage of a qualitative project (as in a funding proposal justifying sample sizes) than to the execution and reporting stages of a study. A citation to previous research alone may not provide sufficient assurance that the sample size was adequate for a specific study. Instead, it is the responsibility of the researcher to demonstrate that the sample size is adequate to answer the research question. One well-accepted sample size criterion is saturation. Researchers invoking saturation should explain which of the several definitions of saturation they used and how they determined that saturation was achieved. Researchers must also show how their criteria for sampling and sample size determination are harmonized with the research question and purpose. (For example, inductive thematic saturation can be achieved with small sample sizes when the population of interest is homogeneous or the research question is simple, but larger samples are usually needed when the population of interest has multiple strata of interest.) Usability researchers are encouraged to note potential limitations of the Nielsen and the “10 ± 2” sample size heuristics and to recognize that larger samples tend to uncover more usability problems and allow input from more diverse users. If the researchers sampled units of analysis other than the individual (eg, events, communities, organizations, or social media posts), this should also be justified according to the goals of the study. For example, in a case study, bellwether or ideal case sampling of hospitals might be justified for examining the effects of a clinical informatics intervention. Authors should draw upon any published methodological literature relevant to their sampling approach.
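Assessing saturation is ultimately an interpretive judgment, but one simplistic operationalization sometimes used for inductive thematic saturation (a sketch for illustration only, not a method prescribed here; the interviews and codes below are hypothetical) is to track how many new codes each successive interview contributes to the codebook:

```python
def new_codes_per_interview(coded_interviews):
    """Count codes that appear for the first time in each successive interview."""
    seen, new_counts = set(), []
    for codes in coded_interviews:
        fresh = set(codes) - seen
        new_counts.append(len(fresh))
        seen |= fresh
    return new_counts

# Hypothetical code sets from six successive interviews; a sustained tail
# of zeros is one (simplistic) signal that no new codes are emerging.
interviews = [
    {"access", "cost"}, {"cost", "trust"}, {"trust", "workflow"},
    {"access", "trust"}, {"cost"}, {"workflow"},
]
print(new_codes_per_interview(interviews))  # → [2, 1, 1, 0, 0, 0]
```

A flat tail in such a count is only suggestive; researchers should still report which definition of saturation they used and how code meanings, not just code counts, stabilized.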

Rigor in data collection and the relationship between researcher and participants

Researchers must describe their data collection methods (eg, interviews, focus groups, observation) and justify why their choice of data collection method was appropriate for the research question. For mixed-methods studies, the approach to integrating qualitative and quantitative data collection should be explained. The selection or development process for any instruments used in the study should be described. For example, for semistructured interviews, how and why were topic areas selected and how were interview questions developed? For observation, if a template was used to guide data collection, how and why were template components selected? Any pilot testing of the instruments should be discussed, including any use of mixed methods to design the instrument. Instruments developed for the study should be included as appendices. Researchers should describe how data collection was carried out. It should be clear who conducted interviews, observations, and focus groups, whether they were conducted in person or by telephone or video conference, and how observers and interviewers were trained. When multiple persons collected data, as in team-based ethnographic research, researchers should describe how they ensured reliability across data collectors. Researchers who created field notes should describe their creation (eg, free text notes, observation template). The interpretivist nature of much qualitative research means that the relationship between the research participants and the researchers is of critical importance. In the research report, it can be helpful to provide a brief summary of the characteristics, training, and perspectives of the qualitative researchers to help readers assess the credibility of their work. Researchers should also be prepared to discuss how they addressed potential data collection biases.
These include power differentials that may reduce patient candor when patients are interviewed by physician-researchers, or concerns about employment security and professional repercussions that may make healthcare workers reluctant to answer questions about their work. They also include the known tendencies for people to provide overly positive assessments of an innovation when interviewed by its developer and to change their behavior when they know they are being observed. For focus groups, researchers should report how they addressed common limitations such as group composition issues (eg, role hierarchies that influence participant discussion) or impacts of dominant personalities. For publication, researchers should describe how they addressed and sought to mitigate these and other potential biases, for example, through reflexivity, prolonged engagement, or persistent observation. If reflexivity practices were used to help researchers understand their own relationship to the research question and the research participants, and their evolving understanding of the data over the course of the project, they should be described. Readers and reviewers value descriptions of reflexivity practices, especially in explaining relationships between researchers and marginalized participants, where the perpetuation of bias is likely. In some research traditions, positionality statements are increasingly used to make reflexivity practices in research with marginalized groups more explicit.

Rigor in data analysis methods

Informatics journals require qualitative researchers to report how they analyzed their qualitative data and provide methodological citations. Many approaches to data analysis are available. What they have in common is that each involves 4 stages: (1) a method for systematically identifying patterns or concepts in the data, (2) a method for reliably labeling these patterns or concepts across different transcripts, fieldnotes, or collected documents/images/artifacts, (3) a method of discovering or identifying relationships between these concepts to synthesize themes or groups of themes comprising theories, and (4) methods to verify and test developing analyses. These 4 steps are accomplished differently by different analysis approaches. If researchers used deductive approaches such as directed content analysis (which collect and code data on the basis of an existing theoretical framework that predefines the set of applicable concepts and relationships between them), they should report the theory or framework they used. By contrast, researchers who use inductive approaches such as thematic analysis (analyses conducted in the absence of an existing theoretical framework) should explain how they followed the inductive approach of immersing themselves in the data and allowing concepts and relationships to emerge from reading, review, theorizing, and discussion. Given the large number of first-cycle and second-cycle coding approaches available, researchers should cite which was used. Combined inductive-deductive analysis may use components of each of these analysis approaches. For mixed-methods studies, these 4 steps will typically be followed by at least one approach to integrating qualitative and quantitative data (see excellent texts on the variety of approaches). Although inductive data analysis is a component of grounded theory development, not all forms of inductive data analysis meet the definition of grounded theory.
Qualitative authors are encouraged to reserve the term “grounded theory” for projects that seek to develop novel theories about social phenomena and conform to one of the main approaches to doing so. Researchers should describe any approaches used to improve the dependability of coding, for example, whether multiple coders worked on the transcripts, and if so, how they worked together (eg, consensus meetings or establishment of inter-rater reliability). Audit trails (eg, memoing in grounded theory) can also be used for single-authored projects. Researchers should also describe any methods for improving credibility or verifying their interpretation of the data. If they applied triangulation, they should describe what data sources, researchers, or data types were compared. If they applied negative case analysis, seeking out and analyzing data that appear to disconfirm a developing concept or theory, they should describe how they identified the negative cases and how the analysis revealed patterns that did or did not hold true. If they exposed their interpretation for critique and reinterpretation by participants in the community being studied, methods for doing so should be described (eg, updating the interview guide to include emerging themes to be discussed with new participants). Alternately, if they conducted formal, terminal member checking by inviting participant discussion and review of findings, they should describe the method, the feedback provided, and how it was addressed in the analysis. The informatics literature contains many good examples of studies that apply inductive analysis, deductive analysis, mixed inductive–deductive analysis, and mixed qualitative–quantitative methods.
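When teams report establishing inter-rater reliability, a common chance-corrected statistic is Cohen's kappa, which compares two coders' observed agreement with the agreement expected by chance. As a minimal sketch (for illustration only; the article does not prescribe a particular statistic, and the codes and excerpts below are hypothetical):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: chance-corrected agreement between two coders."""
    n = len(coder_a)
    # Observed agreement: proportion of excerpts given identical codes.
    po = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement, from each coder's marginal code frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    pe = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical codes applied by two coders to ten transcript excerpts.
coder_1 = ["barrier", "barrier", "workflow", "trust", "trust",
           "workflow", "barrier", "trust", "workflow", "barrier"]
coder_2 = ["barrier", "workflow", "workflow", "trust", "trust",
           "workflow", "barrier", "trust", "barrier", "barrier"]
print(round(cohens_kappa(coder_1, coder_2), 2))  # → 0.7
```

A kappa value alone does not establish dependability; teams should also report how coding disagreements were discussed and resolved (eg, consensus meetings, codebook revision).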

Rigor in reporting results

The results section should contain a description of the sociodemographic and other relevant characteristics of any human subjects in the sample. When themes are reported as part of the results, researchers should recognize that a theme cannot be sufficiently described in a sentence or phrase. Themes must be supported with rich examples of actual extracts, quotes, or images gathered during data collection. Quotes not only give life and interest to the research report but also serve the critical function of connecting the source data to the researchers’ interpretation. In addition to the quotes, researchers should provide synthesis by explaining themes and categories, and the depth and range of findings represented by those categories. Labeling quotes and extracts with study-specific labels such as “Participant 1” is one helpful approach to demonstrate that representative quotes are drawn from the entire sample of participants. Word limits of biomedical research journals can introduce challenges with integrating rich and descriptive source quotations into the text of a results section. While quotes integrated into the results text are a powerful approach to building a high-quality results description, additional tools such as thoughtfully integrated tables of quotations, boxes for longer-form quotations, and summary visualizations such as timelines and network diagrams can also prove helpful. In mixed-methods studies, joint displays of qualitative and quantitative data can also help show linkages between data.

Rigor in discussion and conclusions

Qualitative researchers are invited to discuss the potential transferability of their findings to other settings or populations. To help readers determine whether the findings might be transferable to other settings and populations, authors should provide detail about the context and setting of the research and about their assumptions about the domain of interest. As with any publication, qualitative reports should also include a limitations section. Common limitations in qualitative work may include known and unavoidable lack of representation of certain participant perspectives, unavoidable researcher biases, or issues with the transferability of findings. Approaches used to address limitations, such as the use of bracketing to address potential researcher biases, should be discussed in the limitations section.

CONCLUSION

Qualitative research is integral to health and biomedical informatics. High-quality qualitative research is conducted by many informatics researchers whose original backgrounds are in quantitative methods. Examples of excellent qualitative and social science research honored with AMIA’s Diana Forsythe Award are available here (https://www.amia.org/amia-awards/working-group-awards). With this summary of publication expectations, we hope to encourage the submission of high-quality qualitative research that advances the field of informatics. Researchers might find it helpful to refer to some of the formal reporting checklists to learn more about reporting expectations in qualitative research, even when publishing in journals that do not require them. However, in recognition of restrictive manuscript lengths, the present guidelines are intended to be more selective than prior formal reporting checklists. They are also crafted to emphasize unique issues arising in health informatics, such as frequent use of usability testing methods. Addressing all the elements described here is likely to make the qualitative manuscript very long. Authors are encouraged to consider writing an online-only methodological appendix that would allow them to describe their approach in detail without violating length limits. Using tables, boxes, and figures to illustrate methods and results can also help authors stay within page limits. We hope that this summary will support more complete reporting of research studies, but we recognize that it is not a substitute for training in qualitative research—although this summary can be an instructional tool within broader training programs. We encourage interested researchers to consult the references shared below and the many excellent texts, courses, and mentors available in informatics and beyond.
Staying abreast of the qualitative literature can also alert researchers to innovations in theory and methods that can be applied to continue advancing our understanding of the patients and professionals who use informatics innovations, the contexts in which they live and work, and their beliefs, perspectives, and life experiences.

Box 1. Guidance for reporting qualitative research in informatics

Theory
- Cite theory appropriate to the topic being studied, if applicable

Research question and study design
- State the research question
- State the study design and methodological perspective of the research

Sampling
- Describe the sampling approach
- Describe any approaches to ensure the inclusion of people from marginalized or underserved groups
- Report and justify the sample size
- If using saturation to determine sample size, report what type of saturation was used and how saturation was assessed*

Data collection
- Report how data were collected
- Report any methods for reducing bias in data collection and analysis*

Data analysis
- Describe data analysis methods, with appropriate citations*
- For deductive analysis, report how the theory was used in the data collection and analysis*
- For inductive analysis, report how the steps of inductive analysis were done*
- For theory development, report how categories were developed*
- Describe any methods for improving the dependability of coding*
- Report any measures for improving the credibility of findings or verifying interpretations*

Results
- Report sample size and characteristics of participants
- Support thematic findings with extracts, quotes, images, or observations
- Provide synthesis and interpretation

Discussion
- Describe assumptions of the research and details of setting and context to illustrate transferability of findings
- Describe the relationship of findings, or new theory developed in the study, to existing theory
- Report limitations

*Elements marked with an asterisk may need to be elaborated in an appendix to avoid lengthening the manuscript.

FUNDING

This work received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

AUTHOR CONTRIBUTIONS

JSA, NCB, MR, KMU, and TV made substantial contributions to the conception and design of this work; drafted the work and revised it critically for important intellectual content; and gave final approval of the version to be published. All agree to be accountable for all aspects of the work.

CONFLICT OF INTEREST STATEMENT

None declared.

DATA AVAILABILITY

No data were generated in the course of this study.
REFERENCES (first 10 of 33 shown)

1.  Three approaches to qualitative content analysis.

Authors:  Hsiu-Fang Hsieh; Sarah E Shannon
Journal:  Qual Health Res       Date:  2005-11

2.  Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups.

Authors:  Allison Tong; Peter Sainsbury; Jonathan Craig
Journal:  Int J Qual Health Care       Date:  2007-09-14       Impact factor: 2.038

3.  Theoretical sampling and category development in grounded theory.

Authors:  Claire B Draucker; Donna S Martsolf; Ratchneewan Ross; Thomas B Rusk
Journal:  Qual Health Res       Date:  2007-10

4.  Process and Outcomes of a Recursive, Dialogic Member Checking Approach: A Project Ethnography.

Authors:  Michelle Brear
Journal:  Qual Health Res       Date:  2018-11-30

5.  Evaluation of medium-term consequences of implementing commercial computerized physician order entry and clinical decision support prescribing systems in two 'early adopter' hospitals.

Authors:  Kathrin M Cresswell; David W Bates; Robin Williams; Zoe Morrison; Ann Slee; Jamie Coleman; Ann Robertson; Aziz Sheikh
Journal:  J Am Med Inform Assoc       Date:  2014-01-15       Impact factor: 4.497

6.  The invisible work of personal health information management among people with multiple chronic conditions: qualitative interview study among patients and providers.

Authors:  Jessica S Ancker; Holly O Witteman; Baria Hafeez; Thierry Provencher; Mary Van de Graaf; Esther Wei
Journal:  J Med Internet Res       Date:  2015-06-04       Impact factor: 5.428

7.  Infrastructure Revisited: An Ethnographic Case Study of how Health Information Infrastructure Shapes and Constrains Technological Innovation.

Authors:  Trisha Greenhalgh; Joseph Wherton; Sara Shaw; Chrysanthi Papoutsi; Shanti Vijayaraghavan; Rob Stones
Journal:  J Med Internet Res       Date:  2019-12-19       Impact factor: 5.428

8.  Feeling better on hemodialysis: user-centered design requirements for promoting patient involvement in the prevention of treatment complications.

Authors:  Matthew A Willis; Leah Brand Hein; Zhaoxian Hu; Rajiv Saran; Marissa Argentina; Jennifer Bragg-Gresham; Sarah L Krein; Brenda Gillespie; Kai Zheng; Tiffany C Veinot
Journal:  J Am Med Inform Assoc       Date:  2021-07-30       Impact factor: 7.942

9.  Series: Practical guidance to qualitative research. Part 3: Sampling, data collection and analysis.

Authors:  Albine Moser; Irene Korstjens
Journal:  Eur J Gen Pract       Date:  2017-12-04       Impact factor: 1.904

10.  Series: Practical guidance to qualitative research. Part 2: Context, research questions and designs.

Authors:  Irene Korstjens; Albine Moser
Journal:  Eur J Gen Pract       Date:  2017-12       Impact factor: 1.904

