Suzanne C Hughes1, Carol J Hogue2, Melissa A Clark3, Jessica E Graber4,5, Elaine D Eaker6,7, Amy H Herring8,9. 1. Graduate School of Public Health, San Diego State University, 9245 Sky Park Court, Suite 102, San Diego, CA, 92123, USA. shughes@cbeachsdsu.org. 2. Rollins School of Public Health, Emory University, Atlanta, GA, 30322, USA. 3. School of Public Health, Brown University, Providence, RI, 02912, USA. 4. Eunice Kennedy Shriver National Institute of Child Health and Human Development, National Institutes of Health, Bethesda, MD, 20847, USA. 5. United States Census Bureau, Washington, DC, 20233, USA. 6. WESTAT, Rockville, MD, 20850, USA. 7. Eaker Epidemiology Enterprises, LLC, Walla Walla, WA, 99362, USA. 8. Department of Biostatistics, Gillings School of Global Public Health and Carolina Population Center, CB 7420, Chapel Hill, NC, 27599, USA. 9. Department of Statistical Science, Duke University, Durham, NC, 27705, USA.
Abstract
OBJECTIVES: Population-based recruitment of a cohort of women who are currently pregnant or who may become pregnant in a given timeframe presents challenges unique to identifying pregnancy status or the likelihood of future pregnancy. Little is known about the performance of individual eligibility items on pregnancy screeners, although they are critical to participant recruitment. This paper examined the response patterns and respondent characteristics of key pregnancy screener items used in a large national study. METHODS: Cross-sectional analyses were conducted. Descriptive statistics and multivariable logistic regression models were used to examine nonresponse patterns for three questions (currently pregnant, trying to get pregnant, and able to get pregnant). The questions were asked of 50,529 women in 17 locations across the US as part of eligibility screening for the National Children's Study Vanguard Study household-based recruitment. RESULTS: Most respondents were willing to provide information about current pregnancy, trying to get pregnant, and ability to get pregnant: 99.3% of respondents answered all three questions, and 97.4% provided meaningful answers. Nonresponse ranged from 0.3 to 2.5% for individual items. Multivariable logistic regression results identified small but statistically significant differences in nonresponse by respondent age, marital status, race/ethnicity-language, and household-based recruitment group. CONCLUSIONS FOR PRACTICE: The high levels of response to pregnancy-related items are impressive considering that the eligibility questions were fairly sensitive, were administered at households, and were not part of a respondent-initiated encounter.