
Interactive or static reports to guide clinical interpretation of cancer genomics.

Stacy W Gray1,2, Jeffrey Gagan3,4, Ethan Cerami5, Angel M Cronin6, Hajime Uno4,6, Nelly Oliver6, Carol Lowenstein6, Ruth Lederman6, Anna Revette6, Aaron Suarez7, Charlotte Lee4, Jordan Bryan7, Lynette Sholl3,4, Eliezer M Van Allen4,6,7.   

Abstract

Objective: Misinterpretation of complex genomic data presents a major challenge in the implementation of precision oncology. We sought to determine whether interactive genomic reports with embedded clinician education and optimized data visualization improved genomic data interpretation.
Materials and Methods: We conducted a randomized, vignette-based survey study to determine whether exposure to interactive reports for a somatic gene panel, as compared to static reports, improves physicians' genomic comprehension and report-related satisfaction (overall scores calculated across 3 vignettes, range 0-18 and 1-4, respectively, with higher scores corresponding to improved endpoints).
Results: One hundred and five physicians at a tertiary cancer center participated (29% participation rate): 67% medical, 20% pediatric, 7% radiation, and 7% surgical oncology; 37% female. Prior to viewing the case-based vignettes, 34% of the physicians reported difficulty making treatment recommendations based on the standard static report. After vignette/report exposure, physicians' overall comprehension scores did not differ by report type (mean score: interactive 11.6 vs static 10.5, difference = 1.1, 95% CI, -0.3, 2.5, P = .13). However, physicians exposed to the interactive report were more likely to correctly assess sequencing quality (P < .001) and understand when reports needed to be interpreted with caution (eg, low tumor purity; P = .02). Overall satisfaction scores were higher in the interactive group (mean score 2.5 vs 2.1, difference = 0.4, 95% CI, 0.2-0.7, P = .001).
Discussion and Conclusion: Interactive genomic reports may improve physicians' ability to accurately assess genomic data and increase report-related satisfaction. Additional research in users' genomic needs and efforts to integrate interactive reports into electronic health records may facilitate the implementation of precision oncology.


Year:  2018        PMID: 29315417      PMCID: PMC6018970          DOI: 10.1093/jamia/ocx150

Source DB:  PubMed          Journal:  J Am Med Inform Assoc        ISSN: 1067-5027            Impact factor:   4.497


BACKGROUND AND SIGNIFICANCE

The increased availability and decreased cost of tumor genomic profiling promise to provide oncologists with a clear path to “precision medicine.” The goal of precision medicine is to leverage an understanding of alterations in somatic (tumor) and germline DNA to identify therapies matched to a patient’s molecular profile. Genomic data can also reveal prognostic, diagnostic, and cancer risk information that can shape care, and there are now many academic and large (>100) commercial somatic gene panels currently being used for clinical decision-making. Despite the promise of precision medicine, clinical utility studies of somatic panel testing have shown modest impact and highlight implementation obstacles. One major barrier to effective genetic testing implementation is provider knowledge gaps. Low levels of genetic knowledge/confidence have been associated with lower utilization of genetic testing. In order to address these gaps, professional organizations such as the American Society for Clinical Oncology (ASCO), the Association for Molecular Pathology (AMP), and the American Medical Association (AMA) have intensified efforts to train providers in genomics (ASCO tumor boards, ASCO pre–annual meeting genomic courses, AMP webinars, AMA somatic resource: https://cme.ama-assn.org/Activity/4652654/Detail.aspx). Moreover, many institutions have developed “molecular tumor boards” to review cases in real time. Others have developed genomic knowledge banks such as MyCancerGenome, OncoKB, IntOGen, and CIViC to aid in the curation and interpretation of data. Many solutions, however, are constrained by a lack of scalability, exist outside of providers’ clinical workflow, and fail to integrate into the electronic health record. One potential solution is to provide clinicians with education at the point of care through genomic testing laboratory reports.
Most lab reports already contain annotated information from molecular pathologists, and they include technical information, such as coverage and sequencing quality, which is essential for interpretation. Reporting of complex genomic data to date has typically required a significant reduction of these data into static documents that eliminate information inherent in genomic analyses. Furthermore, static genomic reports lack the decision support that may be needed to empower physicians to effectively utilize genomic data as part of evidence-based decision-making. To address some of these limitations, we developed a web-based interactive genomic report that includes modern information technology features (eg, improved data visualization and embedded clinician-directed education). We hypothesized that physicians who were exposed to the interactive report would demonstrate increased comprehension of patients’ genomic data and be more satisfied with the report than physicians exposed to a traditional static genomic report. We then conducted a randomized, vignette-based survey study, in the context of a longitudinal study, to determine whether exposure to the interactive genomic report, as compared to the static report, improves physicians’ comprehension of the reported data and their report-related satisfaction.

MATERIALS AND METHODS

Study population

We re-surveyed all faculty members who participated in the baseline survey and who provide clinical care to cancer patients at the Dana-Farber Cancer Institute (DFCI) and the Brigham and Women’s Hospital. In addition, we included pediatric oncologists, whose patients are now eligible for testing, and new faculty. Questions pertained to a representative somatic next-generation sequencing panel–based assay that queries exonic mutations in a set of cancer genes (OncoPanel). We recruited participants between May and November 2016. The study was approved by the DFCI Institutional Review Board (DFCI #16-101).

Survey instrument

The survey instrument contained questions related to the use of OncoPanel testing, case-based vignettes, genomic confidence, and sociodemographic/practice characteristics. We developed 3 patient vignettes in which the application of cancer genomics could be used to facilitate clinical decision-making (metastatic breast and lung cancer, metastatic melanoma). Each vignette included (1) a patient description, (2) a table on disease-relevant genomic alterations and their clinical significance, (3) a mock genomic report (static or interactive format), and (4) multiple-choice questions. We included the table on disease-relevant genomic alterations and clinical significance to ensure that participants did not need specific knowledge to answer questions. This approach allowed us to provide participants with the answers to the vignette-based questions within the vignette itself, as long as they could find the relevant data on the report. The vignette-based question format was similar across all cases. We queried participants about (1) sample quality, (2) sequencing quality, (3) treatment recommendations, (4) copy number interpretation, (5) the likelihood that a particular alteration was somatic or germline, (6) factors that could lead to a false negative report (eg, low tumor purity or low depth of coverage), and (7) confidence in report interpretation. To evaluate specific domains, we included information (1) consistent with a germline alteration in the lung case, (2) on low-quality sequencing in the melanoma case, and (3) on low tumor purity in the breast case. We measured report satisfaction with a modified Scheuner scale. We also included an open-ended question to obtain participants’ feedback on the reports. We conducted cognitive pretesting of a draft instrument with medical, radiation, and pediatric oncologists, and surgeons (n = 4). We then refined and finalized the survey. 
The survey took approximately 30 min and was administered online via DatStat Illume 6.1 (DatStat Inc., Seattle, WA, USA). The survey instrument is available as Supplementary Materials 1.

Study procedures

Participants were randomized into 2 groups following stratification by specialty: experimental (interactive report) vs traditional (static report), with a 1:1 allocation ratio. SWG and EMV sent potentially eligible physicians an electronic letter that contained study details and a survey link. Furthermore, we randomized the vignette order to minimize potential order effect bias. At the end of the survey, we offered physicians who were randomized to view the static report the option to view the interactive report and provide feedback. We sent electronic reminders to nonresponders 2 and 4 weeks after initial contact. SWG and EMV called/e-mailed all nonresponders. Physicians were offered a $100 gift card as incentive.
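The allocation and ordering steps above can be sketched as follows. This is an illustrative assumption, not the study's actual implementation (the paper does not describe the randomization software); the function and cohort names are hypothetical.

```python
import random
from collections import defaultdict

def stratified_randomize(physicians, seed=0):
    """Assign each physician to an arm with 1:1 allocation within each
    specialty stratum, and give every participant an independently
    shuffled vignette order (to minimize order-effect bias)."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for name, specialty in physicians:
        strata[specialty].append(name)

    vignettes = ["breast", "lung", "melanoma"]  # the 3 study vignettes
    assignments = {}
    for specialty, names in strata.items():
        rng.shuffle(names)
        # Alternate arms down the shuffled list, so within each stratum
        # the two arm counts differ by at most 1.
        for i, name in enumerate(names):
            arm = "interactive" if i % 2 == 0 else "static"
            order = vignettes[:]
            rng.shuffle(order)
            assignments[name] = {"arm": arm, "vignette_order": order}
    return assignments

cohort = [("dr_a", "medical"), ("dr_b", "medical"),
          ("dr_c", "radiation"), ("dr_d", "radiation")]
result = stratified_randomize(cohort)
```

Stratifying before allocation guarantees balance on specialty even in a small sample, which is consistent with the balanced arms reported in Table 1.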

Genomics report design and implementation

We used the OncoPanel assay report as our static report (Supplementary Materials 2). The OncoPanel report also served as the source of content for the interactive report; we presented the same genomic information and explanatory text in both formats. All reports were generated for the study and contained fictional accession numbers and specimen collection dates. We further modified the interactive report through an iterative feedback process with trainees to optimize information organization, add hover-over buttons with embedded education (eg, the potential importance of allelic fraction, interpretation of tumor purity), and add visual cues for rapid assessment of quality features (available in Supplementary Materials 3). The interactive report was developed in AngularJS 1.5.

Statistical analysis

We summarized physician characteristics and survey responses descriptively. We evaluated differences in physician characteristics between responders and nonresponders, and between randomization groups, by Fisher’s exact and Wilcoxon rank-sum tests. We analyzed the open-ended data to identify themes in content. The primary outcome was overall comprehension of genomic findings, defined as the sum of the correct responses to 18 items (6 items for each of the 3 vignettes, range 0–18). In the primary analysis, we assigned physician nonresponse as incorrect (ie, assuming that failure to select a response on a comprehension question meant that the physician was unsure of the correct response). We also conducted secondary analyses using a complete case analysis in the subset of physicians who completed all 18 comprehension items. Furthermore, we calculated domain-specific comprehension for the 6 domains included in each of the vignettes, defined as the sum of the correct responses to the domain-specific items (range 0–3). The secondary outcome was physician-reported satisfaction. Responses to 16 items were scored on a 1–4 point Likert scale, with higher scores representing greater satisfaction; we averaged responses to determine the overall satisfaction score. We evaluated the mean differences in overall comprehension and overall satisfaction by randomization group by t-test, while we evaluated differences in domain-specific comprehension by randomization group by Wilcoxon rank-sum test. Statistical analyses were performed using Stata version 13.2 (StataCorp, College Station, TX, USA). When designing the study, we estimated that we would recruit 320 eligible physicians, with an expected participation rate of 50%, for a target of 160 participating physicians. This study design had 80% power to detect an effect size of 0.446 with a 2-sided type I error rate of 5%. 
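The scoring rules above, with nonresponse counted as incorrect in the primary analysis and satisfaction averaged over 16 Likert items, can be sketched as follows; the item identifiers and answer key are hypothetical.

```python
def comprehension_score(responses, answer_key):
    """Overall comprehension: one point per correct answer across 18
    items (6 per vignette, range 0-18). A missing response scores 0,
    matching the primary analysis's nonresponse-as-incorrect rule."""
    return sum(1 for item, correct in answer_key.items()
               if responses.get(item) == correct)

def satisfaction_score(likert_items):
    """Overall satisfaction: mean of 16 items on a 1-4 Likert scale,
    higher scores representing greater satisfaction."""
    return sum(likert_items) / len(likert_items)

answer_key = {f"q{i}": "b" for i in range(1, 19)}  # 18 items, mock key
responses = {f"q{i}": "b" for i in range(1, 13)}   # 12 correct, 6 unanswered
score = comprehension_score(responses, answer_key)  # 12: unanswered items score 0
```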
We ultimately recruited 358 eligible physicians, of whom 103 completed the items specific to the primary outcome. Updating this power calculation with the actual number of participants included in the analysis of the primary outcome (56 static and 47 interactive), the observed sample had 62% power to detect an effect size of 0.45, and 80% power to detect an effect size of 0.56.
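As a rough check, the power figures quoted above can be approximately reproduced with a normal approximation to the two-sided, two-sample t-test. This sketch is our own arithmetic, not the authors' software, and it differs slightly from exact t-based calculations.

```python
from math import sqrt
from statistics import NormalDist

def two_sample_power(effect_size, n1, n2, alpha=0.05):
    """Approximate power of a two-sided two-sample t-test:
    power ~= Phi(d * sqrt(n1*n2/(n1+n2)) - z_{1-alpha/2})."""
    z = NormalDist().inv_cdf(1 - alpha / 2)
    ncp = effect_size * sqrt(n1 * n2 / (n1 + n2))
    return NormalDist().cdf(ncp - z)

def detectable_effect(power, n1, n2, alpha=0.05):
    """Smallest effect size detectable at the given power (same approximation)."""
    nd = NormalDist()
    return (nd.inv_cdf(1 - alpha / 2) + nd.inv_cdf(power)) / sqrt(n1 * n2 / (n1 + n2))

# Planned design: 160 participants, 80 per arm, effect size 0.446
planned = two_sample_power(0.446, 80, 80)      # ~0.81; the normal
# approximation runs slightly above the reported 80%
# Observed sample: 56 static vs 47 interactive
observed = two_sample_power(0.45, 56, 47)      # ~0.62, matching the paper
minimal = detectable_effect(0.80, 56, 47)      # ~0.55; paper reports 0.56,
# presumably from an exact t-based calculation
```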

RESULTS

We randomized 358 eligible physicians, of whom 105 participated (29%) (Supplementary Figure 4). The participation rate was slightly higher among physicians randomized to the static report (57/177 = 32%) than those randomized to the interactive report (48/181 = 27%). Medical oncologists had the highest participation rate (36%), followed by pediatric oncologists (33%), radiation oncologists (24%), and surgeons (10%). Participants were more likely to have graduated medical school more recently (P < .001), and among the subset of eligible physicians who participated in the baseline survey, we observed that those who participated in the current study reported higher levels of confidence in knowledge about genomics (P = .03) (Supplementary Table 1). For example, 42% of participants vs 23% of nonparticipants in the current study reported in 2011 that they were “very confident” in their knowledge about genomics. Among the 105 participants, there were no statistically significant differences in physician characteristics by study arm (Table 1).
Table 1. Characteristics of participants by study arm

Characteristics | Standard (N = 57) | Interactive (N = 48) | P-value**
Gender | | | .68
  Male | 34 (61) | 30 (65) |
  Female | 22 (39) | 16 (35) |
  Nonresponse | 1 | 2 |
Years since medical school graduation | | | .39
  0–5 | 1 (2) | 1 (2) |
  6–10 | 15 (26) | 13 (28) |
  11–15 | 12 (21) | 15 (33) |
  16–20 | 14 (25) | 7 (15) |
  21–25 | 4 (7) | 3 (7) |
  26–30 | 2 (4) | 0 (0) |
  31–35 | 2 (4) | 2 (4) |
  36–40 | 5 (9) | 3 (7) |
  >40 | 2 (4) | 2 (4) |
  Nonresponse | 0 | 2 |
Department | | | .72
  Medical oncology | 38 (67) | 32 (67) |
  Pediatric oncology | 13 (23) | 8 (17) |
  Radiation oncology | 3 (5) | 4 (8) |
  Surgery | 3 (5) | 4 (8) |
Confidence in knowledge about genomics | | | .12
  Not confident at all | 2 (4) | 1 (2) |
  Not very confident | 5 (9) | 8 (17) |
  Somewhat confident | 34 (61) | 19 (40) |
  Very confident | 15 (27) | 20 (42) |
  Nonresponse | 1 | 0 |
Confidence in ability to explain genomic concepts to patients | | | .10
  Not confident at all | 0 (0) | 0 (0) |
  Not very confident | 3 (5) | 7 (15) |
  Somewhat confident | 30 (54) | 17 (35) |
  Very confident | 23 (41) | 24 (50) |
  Nonresponse | 1 | 0 |
Confidence in ability to make treatment recommendations based on genomic information | | | .64
  Not confident at all | 1 (2) | 1 (2) |
  Not very confident | 10 (18) | 12 (25) |
  Somewhat confident | 28 (50) | 18 (38) |
  Very confident | 17 (30) | 17 (35) |
  Nonresponse | 1 | 0 |
Principal investigator in clinical trials research? | | | 1.00
  No | 29 (51) | 25 (52) |
  Yes | 28 (49) | 23 (48) |
Number of newly diagnosed patients seen for treatment or evaluation each month | | | .10
  Median (interquartile range) | 8 (3–15) | 10 (6–20) |

a Item nonresponse was 7% for new patient volume and <3% for all other items.
** P-values were determined by Wilcoxon rank-sum test (years since medical school graduation, new patient volume) and Fisher’s exact test (all other items).


Reported use of OncoPanel

There was wide variability in OncoPanel use, with physicians reporting that a median of 30% of their patients had testing (interquartile range, 10%–50%, range 1%–100%). The majority of physicians (>50%) reported that multiple factors “sometimes” or “often” played a role in a decision to not use Tier 1 (well-established clinical utility) or Tier 2 (clinical utility in some contexts; eg, clinical trial eligibility or Food and Drug Administration drug approved for a different tumor type) test results to inform treatment recommendations (Figure 1). Notably, 34% reported that they often or sometimes found it difficult to make treatment recommendations based on OncoPanel results.
Figure 1.

Physicians' reasons for not using genomic test results to inform treatment recommendations. Categories of responses are described in the figure.


Impact on genomics comprehension

Two physicians did not respond to any of these items and were excluded from the comprehension analyses. Nonresponse for the remaining 103 physicians was minimal (<7% for each of the 18 comprehension items, with 86 physicians responding to all 18 items). In the primary analysis, the physicians’ overall comprehension scores did not differ significantly by report type (mean score: interactive 11.6 vs static 10.5, difference = 1.1, 95% CI, −0.3, 2.5, P = .13, Figure 2A). Similar results were obtained when the analysis was limited to the physicians who responded to all items (mean score: interactive 12.2 vs static 11.2, difference = 1.0, 95% CI, −0.4, 2.4, P = .13).
Figure 2.

Physicians’ comprehension scores. (A) Overall: This score is defined as the sum of the correct responses to 18 items (6 items for each of the 3 vignettes; range 0–18). (B) Domain-specific: This score is defined as the sum of the correct responses to the domain-specific items from each of the 3 vignettes (range 0–3). Std: standard report; Int: interactive report. Higher scores correspond to better comprehension.

In secondary analyses for domain-specific comprehension, however, physicians who viewed the interactive report were more likely to correctly assess tumor purity (P = .02) and sequencing quality (P < .001) and understand when reports needed to be interpreted with caution (P = .02) (Figure 2B).

Physician satisfaction with reports

Overall satisfaction scores were significantly higher in the interactive group than the static group (mean score 2.5 vs 2.1, difference = 0.4, 95% CI, 0.2, 0.7, P = .001; Figure 3). However, when asked about ease of understanding the test results presented in the genomic report, one-quarter of physicians in both groups responded “not at all easy” (across all vignettes: 27% interactive and 28% static) and nearly one-half responded “somewhat easy” (42% interactive and 53% static). In open-ended comments, providers had differing opinions about the type of genomic information that they found usable (eg, some wanted technical/raw data while others wanted simplified reports and a summary) or suggested additional report functionality (eg, links to external databases, clinical trials information, and institutional pathways), and many reported a need for additional provider-directed genomic education (Supplementary Table 2).
Figure 3.

Physicians’ average satisfaction score. Responses to 16 items were scored on a 1–4 point Likert scale, with higher scores representing greater satisfaction. The overall satisfaction score was defined as the average of the 16 items.

In aggregate, 85% of physicians reported that existing resources are inadequate to support genomic testing in clinical practice. The need for additional support for providers and patients was endorsed by 88% and 68% of physicians, respectively. Among physicians who desired additional provider support, electronic reports with decision support (66%) and a genomic consult service (53%) were most highly endorsed (Figure 4). Among physicians who desired additional patient support, “patient-friendly” versions of the report (81%) and increased availability of genetic counselors for individual patient sessions (76%) were most highly endorsed (Supplementary Figure 5).
Figure 4.

Physicians’ attitudes about provider genomic support that would be helpful. Categories of responses are described in the figure.


DISCUSSION

In this study, physicians participated in a vignette-based survey study in which they were randomized to view either a novel interactive genomic report or a traditional static report. In our primary analyses, we found that mean comprehension scores did not significantly differ between groups. However, in our exploratory analyses, we found that physicians’ ability to correctly assess sequencing quality and tumor purity and understand when reports needed to be interpreted with caution was significantly higher for those who were exposed to the interactive reports. Furthermore, report-related satisfaction was higher among physicians who viewed the interactive report. Our findings suggest that an interactive interface may be beneficial in guiding physicians in genomic interpretation and confirm that there remains a major need to improve current genomic care.

There are several possible explanations for the lack of an observed difference in overall comprehension scores between groups. One explanation is that the overall comprehension score did not measure a single construct and that clinically active physicians have greater knowledge in some areas (eg, interpreting copy number variation) than others (eg, assessing sequencing quality). Another explanation is that there may be an actual difference in comprehension by group but our sample size was too small to detect it. The fixed physician sample size at DFCI, combined with a relatively low participation rate, limits our ability to find small to moderate differences in overall comprehension. Alternatively, it is also possible that the proposed interactive report is not, on its own, sufficient to improve comprehension. Indeed, only one-third of physicians endorsed the interactive report as easy to understand, and the average score was only 10–12 out of 18. In contrast to overall comprehension, we found that the average report-related satisfaction score was higher among physicians who were exposed to the interactive report.
Given that one of our goals for the interactive report was to provide data to physicians in a clear and minimalist interface, and to allow providers to tailor their viewing experience by using the hover-over functionality, the satisfaction findings are promising. Our open-ended data suggest that providers may have different needs, with some preferring a simplified report and others preferring raw or technical data. Future web-based reports may need to have an increased number of interactive features in order to allow clinicians to further tailor their use depending on their needs. Our findings add to a growing body of literature that investigates the effect of new strategies for genomic reporting. Williams and colleagues have demonstrated that patient and provider reports for Mendelian genetic disorders can affect clinician satisfaction and facilitate provider-patient communication. Other investigators have developed reports for whole-exome or whole-genome sequencing that include features such as a succinct summary of genomics findings, written for a nongenetic specialist audience, and information about the technical limitations of whole-exome or whole-genome sequencing. Although large panel testing is increasingly being incorporated into care, prior research has demonstrated that genomic results can be difficult to interpret. For example, generalists and nongenetic specialists may under- or overinterpret genomic information, which can lead to inaccurate diagnoses and misinformed counseling. We found that 84% of surveyed physicians endorsed the need for additional provider support in the form of electronic reports with embedded decision support, genomic consult services, and the ability to obtain physician-to-physician “curbsides.” Physicians also endorsed additional genomic support for patients, including a patient-friendly genomics report.
In service to these ideas, we have incorporated a version of the interactive physician-directed report described here into our MatchMiner clinical trial interface (http://matchminer.org). Furthermore, our group is developing patient-facing genomic reports that will be integrated into our information technology platform. Despite this progress, the benefits and limitations of a variety of genomic interventions need further study. Strengths of our study include the creation of a web-based interactive format for genomic reporting and the evaluation of reports through a randomized experiment. Our study has limitations that are also worth noting. First, we had a relatively low participation rate. Although nonparticipation is a known issue with physician surveys, the rate was lower than that of our baseline survey (61%). This finding is striking, because we did not offer incentives at baseline and did offer an incentive for this work. One reason for the lower response rate is that physicians might not appreciate being “tested.” This hypothesis is supported by the formative work for our baseline survey, in which physicians suggested eliminating knowledge items because of concerns that “testing” would lower response rates. Furthermore, physicians who had lower genomic confidence at baseline were less likely to respond to the current survey. Given that assessments of physicians’ knowledge are needed to develop robust physician education, enhanced provider participation in future studies is needed. However, despite the relatively low participation rate, randomization was successful and we achieved balance across the 2 arms (interactive vs static), thus strengthening confidence in our findings. Second, the standard report was based on reports from our institution and may not be representative of other genomic reports. Third, we studied physicians at a single institution, and our findings might not be generalizable.

CONCLUSION

In summary, we found that physicians’ overall comprehension scores did not differ by report type. However, in exploratory analyses, we found that interactive genomic reports facilitated cancer physicians’ ability to correctly interpret technical aspects of the reports, such as sequencing quality and tumor purity. Furthermore, physicians who were exposed to the interactive reports had higher report-related satisfaction. Taken together, our findings suggest that innovations in genomic reporting hold the potential to decrease providers’ sequencing-related knowledge gaps and improve accurate genomic interpretation. Additional work is needed to determine whether dynamic reports are helpful for broader provider populations and to integrate interactive reports into the electronic health record. Ultimately, in order to fulfill the promise of precision medicine, point-of-care interventions will be needed to increase providers’ confidence in their ability to use genomic data.

COMPETING INTERESTS

SWG, JG, EC, AMC, HU, NO, CL, RL, AR, AS, CL, JB: none to report. LS: consultant for Research to Practice. EMV is a consultant/advisor for Genome Medical, Tango Therapeutics, and Novartis; receives research funding from Novartis and Bristol Myers Squibb; and has equity in Genome Medical, Tango Therapeutics, and Syapse.

FUNDING

This work was supported by a DFCI Medical Oncology grant (EMV), National Institutes of Health U01HG006492, K08CA188615 (EMV), the American Cancer Society 120529-MRSG-11-006-01-CPPB (SWG), and the Agency for Healthcare Research and Quality NIH R21HS024984 (SWG).

CONTRIBUTORS

SWG, JG, EC, AMC, HU, NO, CL, RL, AR, AS, CL, JB, LS, and EMV contributed to the conception and design of the project, survey execution, survey data collection, and interpretation of results. SWG, JG, EC, AS, CL, JB, and EMV contributed to the design and coding of the web-based report. SWG, NO, CL, RL, AR, and EMV contributed to operations related to the survey instrument and deployment. AMC and HU performed statistical analyses. All authors contributed to drafting the manuscript and approved the manuscript.

SUPPLEMENTARY MATERIAL

Supplementary material is available at Journal of the American Medical Informatics Association online.
  39 in total

1.  Computational approaches to identify functional genetic variants in cancer genomes.

Authors:  Abel Gonzalez-Perez; Ville Mustonen; Boris Reva; Graham R S Ritchie; Pau Creixell; Rachel Karchin; Miguel Vazquez; J Lynn Fink; Karin S Kassahn; John V Pearson; Gary D Bader; Paul C Boutros; Lakshmi Muthuswamy; B F Francis Ouellette; Jüri Reimand; Rune Linding; Tatsuhiro Shibata; Alfonso Valencia; Adam Butler; Serge Dronov; Paul Flicek; Nick B Shannon; Hannah Carter; Li Ding; Chris Sander; Josh M Stuart; Lincoln D Stein; Nuria Lopez-Bigas
Journal:  Nat Methods       Date:  2013-08       Impact factor: 28.547

2.  OncoKB: A Precision Oncology Knowledge Base.

Authors:  Debyani Chakravarty; Jianjiong Gao; Sarah M Phillips; Ritika Kundra; Hongxin Zhang; Jiaojiao Wang; Julia E Rudolph; Rona Yaeger; Tara Soumerai; Moriah H Nissan; Matthew T Chang; Sarat Chandarlapaty; Tiffany A Traina; Paul K Paik; Alan L Ho; Feras M Hantash; Andrew Grupe; Shrujal S Baxi; Margaret K Callahan; Alexandra Snyder; Ping Chi; Daniel Danila; Mrinal Gounder; James J Harding; Matthew D Hellmann; Gopa Iyer; Yelena Janjigian; Thomas Kaley; Douglas A Levine; Maeve Lowery; Antonio Omuro; Michael A Postow; Dana Rathkopf; Alexander N Shoushtari; Neerav Shukla; Martin Voss; Ederlinda Paraiso; Ahmet Zehir; Michael F Berger; Barry S Taylor; Leonard B Saltz; Gregory J Riely; Marc Ladanyi; David M Hyman; José Baselga; Paul Sabbatini; David B Solit; Nikolaus Schultz
Journal:  JCO Precis Oncol       Date:  2017-05-16

3.  Breast Cancer Experience of the Molecular Tumor Board at the University of California, San Diego Moores Cancer Center.

Authors:  Barbara A Parker; Maria Schwaederlé; Michael D Scur; Sarah G Boles; Teresa Helsten; Rupa Subramanian; Richard B Schwab; Razelle Kurzrock
Journal:  J Oncol Pract       Date:  2015-08-04       Impact factor: 3.840

4.  Assigning clinical meaning to somatic and germ-line whole-exome sequencing data in a prospective cancer precision medicine study.

Authors:  Arezou A Ghazani; Nelly M Oliver; Joseph P St Pierre; Andrea Garofalo; Irene R Rainville; Elaine Hiller; Daniel J Treacy; Vanesa Rojas-Rudilla; Sam Wood; Elizabeth Bair; Michael Parello; Franklin Huang; Marios Giannakis; Frederick H Wilson; Elizabeth H Stover; Steven M Corsello; Tom Nguyen; Huma Q Rana; Alanna J Church; Carol Lowenstein; Carrie Cibulskis; Ali Amin-Mansour; Jennifer Heng; Lauren Brais; Abigail Santos; Patrick Bauer; Amanda Waldron; Peter Lo; Megan Gorman; Christine A Lydon; Marisa Welch; Philip McNamara; Stacey Gabriel; Lynette M Sholl; Neal I Lindeman; Judy E Garber; Steven Joffe; Eliezer M Van Allen; Stacy W Gray; Pasi A Ja Nne; Levi A Garraway; Nikhil Wagle
Journal:  Genet Med       Date:  2017-01-26       Impact factor: 8.822

5.  Effective communication of molecular genetic test results to primary care providers.

Authors:  Maren T Scheuner; Maria Orlando Edelen; Lee H Hilborne; Ira M Lubin
Journal:  Genet Med       Date:  2012-12-06       Impact factor: 8.822

Review 6.  Increasing response rates from physicians in oncology research: a structured literature review and data from a recent physician survey.

Authors:  Y Martins; R I Lederman; C L Lowenstein; S Joffe; B A Neville; B T Hastings; G A Abel
Journal:  Br J Cancer       Date:  2012-02-28       Impact factor: 7.640

7.  Oncologists' and cancer patients' views on whole-exome sequencing and incidental findings: results from the CanSeq study.

Authors:  Stacy W Gray; Elyse R Park; Julie Najita; Yolanda Martins; Lara Traeger; Elizabeth Bair; Joshua Gagne; Judy Garber; Pasi A Jänne; Neal Lindeman; Carol Lowenstein; Nelly Oliver; Lynette Sholl; Eliezer M Van Allen; Nikhil Wagle; Sam Wood; Levi Garraway; Steven Joffe
Journal:  Genet Med       Date:  2016-02-11       Impact factor: 8.822

8.  The fuzzy world of precision medicine: deliberations of a precision medicine tumor board.

Authors:  Sarah A McGraw; Judy Garber; Pasi A Jänne; Neal Lindeman; Nelly Oliver; Lynette M Sholl; Eliezer M Van Allen; Nikhil Wagle; Levi A Garraway; Steven Joffe; Stacy W Gray
Journal:  Per Med       Date:  2016-12-15       Impact factor: 2.512

Review 9.  Next-generation sequencing to guide cancer therapy.

Authors:  Jeffrey Gagan; Eliezer M Van Allen
Journal:  Genome Med       Date:  2015-07-29       Impact factor: 11.117

Review 10.  Genomic sequencing in clinical practice: applications, challenges, and opportunities.

Authors:  Joel B Krier; Sarah S Kalia; Robert C Green
Journal:  Dialogues Clin Neurosci       Date:  2016-09       Impact factor: 5.986

  7 in total

1.  A Framework for Promoting Diversity, Equity, and Inclusion in Genetics and Genomics Research.

Authors:  Timothy R Rebbeck; John F P Bridges; Jennifer W Mack; Stacy W Gray; Jeffrey M Trent; Suzanne George; Norah L Crossnohere; Electra D Paskett; Corrie A Painter; Nikhil Wagle; Miria Kano; Patricia Nez Henderson; Jeffrey A Henderson; Shiraz I Mishra; Cheryl L Willman; Andrew L Sussman
Journal:  JAMA Health Forum       Date:  2022-04-15

2.  Prospective Decision Analysis Study of Clinical Genomic Testing in Metastatic Breast Cancer: Impact on Outcomes and Patient Perceptions.

Authors:  Daniel G Stover; Raquel E Reinbolt; Elizabeth J Adams; Sarah Asad; Katlyn Tolliver; Mahmoud Abdel-Rasoul; Cynthia D Timmers; Susan Gillespie; James L Chen; Siraj Mahamed Ali; Katharine A Collier; Mathew A Cherian; Anne M Noonan; Sagar Sardesai; Jeffrey VanDeusen; Robert Wesolowski; Nicole Williams; Clara N Lee; Charles L Shapiro; Erin R Macrae; Bhuvaneswari Ramaswamy; Maryam B Lustberg
Journal:  JCO Precis Oncol       Date:  2019-11-18

Review 3.  Collaborative, Multidisciplinary Evaluation of Cancer Variants Through Virtual Molecular Tumor Boards Informs Local Clinical Practices.

Authors:  Shruti Rao; Beth Pitel; Alex H Wagner; Simina M Boca; Matthew McCoy; Ian King; Samir Gupta; Ben Ho Park; Jeremy L Warner; James Chen; Peter K Rogan; Debyani Chakravarty; Malachi Griffith; Obi L Griffith; Subha Madhavan
Journal:  JCO Clin Cancer Inform       Date:  2020-07

4.  Precision oncology: separating the wheat from the chaff.

Authors:  Jordi Remon; Rodrigo Dienstmann
Journal:  ESMO Open       Date:  2018-10-30

5.  Generation and Implementation of a Patient-Centered and Patient-Facing Genomic Test Report in the EHR.

Authors:  Jessica M Goehringer; Michele A Bonhag; Laney K Jones; Tara Schmidlen; Marci Schwartz; Alanna Kulchak Rahm; Janet L Williams; Marc S Williams
Journal:  EGEMS (Wash DC)       Date:  2018-06-26

6.  A platform for oncogenomic reporting and interpretation.

Authors:  Caralyn Reisle; Laura M Williamson; Erin Pleasance; Anna Davies; Brayden Pellegrini; Dustin W Bleile; Karen L Mungall; Eric Chuah; Martin R Jones; Yussanne Ma; Eleanor Lewis; Isaac Beckie; David Pham; Raphael Matiello Pletz; Amir Muhammadzadeh; Brandon M Pierce; Jacky Li; Ross Stevenson; Hansen Wong; Lance Bailey; Abbey Reisle; Matthew Douglas; Melika Bonakdar; Jessica M T Nelson; Cameron J Grisdale; Martin Krzywinski; Ana Fisic; Teresa Mitchell; Daniel J Renouf; Stephen Yip; Janessa Laskin; Marco A Marra; Steven J M Jones
Journal:  Nat Commun       Date:  2022-02-09       Impact factor: 14.919

7.  Characterizing the relationships between tertiary and community cancer providers: Results from a survey of medical oncologists in Southern California.

Authors:  Nicholas J Salgia; Alexander Chehrazi-Raffle; JoAnn Hsu; Zeynep Zengin; Sabrina Salgia; Neal S Chawla; Luis Meza; Jasnoor Malhotra; Nazli Dizman; Ramya Muddasani; Nora Ruel; Mary Cianfrocca; Jun Gong; Sidharth Anand; Victor Chiu; James Yeh; Sumanta K Pal
Journal:  Cancer Med       Date:  2021-07-31       Impact factor: 4.452

