Domino Determann1, Mattijs S Lambooij2, Ewout W Steyerberg3, Esther W de Bekker-Grob3, G Ardine de Wit4. 1. Centre for Nutrition, Prevention and Health Services Research, National Institute for Public Health and the Environment (RIVM), Bilthoven, The Netherlands; Department of Public Health, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands. 2. Centre for Nutrition, Prevention and Health Services Research, National Institute for Public Health and the Environment (RIVM), Bilthoven, The Netherlands. Electronic address: mattijs.lambooij@rivm.nl. 3. Department of Public Health, Erasmus MC, University Medical Center Rotterdam, Rotterdam, The Netherlands. 4. Centre for Nutrition, Prevention and Health Services Research, National Institute for Public Health and the Environment (RIVM), Bilthoven, The Netherlands; Julius Center for Health Sciences and Primary Care, University Medical Center Utrecht, Utrecht, The Netherlands.
Abstract
BACKGROUND: Electronic data collection is increasingly being used for discrete choice experiments (DCEs).

OBJECTIVES: To study whether paper or electronic administration results in measurement effects.

METHODS: Respondents were drawn from the same sample frame (an Internet panel) and completed a nearly identical DCE survey either online or on paper during the same period. A DCE on preferences for basic health insurance served as a case study. We used panel mixed logit models for the analysis.

RESULTS: In total, 898 respondents completed the survey: 533 respondents completed the survey online, whereas 365 respondents returned the paper survey. There were no significant differences with respect to sociodemographic characteristics between the respondents in both samples. The median response time was shorter for the online sample than for the paper sample, and a smaller proportion of respondents from the online sample were satisfied with the number of choice sets. Although some willingness-to-pay estimates were higher for the online sample, the elicited preferences for basic health insurance characteristics were similar between both modes of administration.

CONCLUSIONS: We find no indication that online surveys yield inferior results compared with paper-based surveys, whereas the price per respondent is lower for online surveys. Researchers might want to include fewer choice sets per respondent when collecting DCE data online. Because our findings are based on a nonrandomized DCE that covers one health domain only, research in other domains is needed to support our findings.
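The METHODS section names panel mixed logit models, in which each respondent's attribute coefficients are drawn from a distribution and choice probabilities are averaged over those draws. The sketch below illustrates that simulated-probability idea for a single choice set; the function name, the normal coefficient distribution, and the two illustrative attributes (premium, deductible) are assumptions for exposition, not the paper's actual specification or estimates.

```python
import numpy as np

def simulated_choice_probs(X, mu, sigma, n_draws=200, seed=0):
    """Simulated mixed logit choice probabilities for one choice set.

    X     : (n_alternatives, n_attributes) attribute levels of each alternative
    mu    : (n_attributes,) mean of the random coefficients
    sigma : (n_attributes,) standard deviation of the random coefficients
    Returns an (n_alternatives,) vector of choice probabilities, averaged
    over draws of the individual-level coefficients.
    """
    rng = np.random.default_rng(seed)
    # Draw coefficient vectors beta_r ~ N(mu, diag(sigma^2)), one per draw
    betas = mu + sigma * rng.standard_normal((n_draws, len(mu)))
    # Systematic utilities per draw: shape (n_draws, n_alternatives)
    v = betas @ X.T
    # Conditional logit probabilities per draw (shifted for numerical stability)
    v -= v.max(axis=1, keepdims=True)
    expv = np.exp(v)
    probs = expv / expv.sum(axis=1, keepdims=True)
    # Mixed logit probability = average of the conditional probabilities
    return probs.mean(axis=0)

# Two hypothetical insurance plans described by (premium, deductible) levels
X = np.array([[1.0, 0.0],
              [0.8, 1.0]])
p = simulated_choice_probs(X, mu=np.array([-1.0, -0.5]),
                           sigma=np.array([0.5, 0.3]))
print(p)  # the two probabilities sum to 1
```

In this framework a willingness-to-pay estimate, as compared between the two samples in RESULTS, is typically the ratio of an attribute coefficient to the (negative) price coefficient.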