Michaela Kiernan1, Marily A Oppezzo1, Kenneth Resnicow2, Gwen L Alexander3. 1. Stanford Prevention Research Center. 2. Department of Health Behavior & Health Education, School of Public Health, University of Michigan. 3. Department of Public Health Sciences, Henry Ford Health System.
Abstract
OBJECTIVE: Given that participants' research literacy is essential for clinical trial participation, evidence-based strategies are needed that improve literacy and are easily accessed online. We tested whether an infographic letter, which illustrated how dropouts can distort study conclusions, improved participant knowledge about the impact of dropouts relative to a control letter. METHOD: In three distinct online samples purposely recruited to assess reproducibility, young, ethnically diverse adults were randomized to read an infographic letter or a control letter in a hypothetical scenario. Secondary outcomes included participants' perceived transparency of the research organization, perceived value of retention, and perceived trust in the organization. We purposely included two discriminant items, perceived value of the trial outcome and keeping commitments in general, both hypothesized not to change. RESULTS: Across samples, ∼20% more infographic participants than control participants correctly answered how dropouts affected study conclusions. For example (Experiment 3), nearly 90% of infographic participants answered correctly versus only two-thirds of controls (88.7% vs. 66.7%, absolute percentage difference 22.0%, p < .0001). Infographic participants reported substantially higher perceived transparency, value of retention, and trust (Cohen's ds = 0.4-1.0, ps < .0001), yet importantly did not value the study outcome or report keeping commitments more than control participants did (Cohen's ds = 0.0-0.1, ps > .10). CONCLUSIONS: Promisingly, this transparent, visually powerful methodological infographic improved knowledge and trust. Future trials could embed and experimentally test whether such low-cost online infographics improve not only research literacy but also trial retention, especially among populations with less initial trust in research. (PsycINFO Database Record (c) 2018 APA, all rights reserved).