Katherine L Milkman, Dena Gromet, Hung Ho, Joseph S Kay, Timothy W Lee, Pepi Pandiloski, Yeji Park, Aneesh Rai, Max Bazerman, John Beshears, Lauri Bonacorsi, Colin Camerer, Edward Chang, Gretchen Chapman, Robert Cialdini, Hengchen Dai, Lauren Eskreis-Winkler, Ayelet Fishbach, James J Gross, Samantha Horn, Alexa Hubbard, Steven J Jones, Dean Karlan, Tim Kautz, Erika Kirgios, Joowon Klusowski, Ariella Kristal, Rahul Ladhania, George Loewenstein, Jens Ludwig, Barbara Mellers, Sendhil Mullainathan, Silvia Saccardo, Jann Spiess, Gaurav Suri, Joachim H Talloen, Jamie Taxer, Yaacov Trope, Lyle Ungar, Kevin G Volpp, Ashley Whillans, Jonathan Zinman, Angela L Duckworth.
Abstract
Policy-makers are increasingly turning to behavioural science for insights about how to improve citizens' decisions and outcomes1. Typically, different scientists test different intervention ideas in different samples using different outcomes over different time intervals2. The lack of comparability of such individual investigations limits their potential to inform policy. Here, to address this limitation and accelerate the pace of discovery, we introduce the megastudy: a massive field experiment in which the effects of many different interventions are compared in the same population on the same objectively measured outcome for the same duration. In a megastudy targeting physical exercise among 61,293 members of an American fitness chain, 30 scientists from 15 different US universities worked in small independent teams to design a total of 54 different four-week digital programmes (or interventions) encouraging exercise. We show that 45% of these interventions significantly increased weekly gym visits by 9% to 27%; the top-performing intervention offered microrewards for returning to the gym after a missed workout. Only 8% of interventions induced behaviour change that was significant and measurable after the four-week intervention. Conditioning on the 45% of interventions that increased exercise during the intervention, we detected carry-over effects that were proportionally similar to those measured in previous research3-6. Forecasts by impartial judges failed to predict which interventions would be most effective, underscoring the value of testing many ideas at once and, therefore, the potential for megastudies to improve the evidentiary value of behavioural science.
Year: 2021 PMID: 34880497 PMCID: PMC8822539 DOI: 10.1038/s41586-021-04128-4
Source DB: PubMed Journal: Nature ISSN: 0028-0836 Impact factor: 49.962