Molly Offer-Westort

Selected projects

Adaptive experimentation tutorial

Developed with Vitor Hadad and Susan Athey.

Much of my recent work has been on developing tools for designing and analyzing data from adaptive experiments. This tutorial provides an overview of adaptive experimental design, describes some basic algorithms for treatment assignment, and discusses considerations for inference.

shiny tutorial
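One basic treatment-assignment algorithm of the kind the tutorial covers is Thompson sampling. A minimal sketch for binary outcomes, assuming Beta(1, 1) priors on each arm's success probability (the function names and simulation setup here are illustrative, not taken from the tutorial itself):

```python
import random

def thompson_assign(successes, failures, rng=random):
    """Draw one sample per arm from its Beta(1 + successes, 1 + failures)
    posterior and assign the next unit to the arm with the largest draw."""
    draws = [rng.betavariate(1 + s, 1 + f) for s, f in zip(successes, failures)]
    return max(range(len(draws)), key=lambda k: draws[k])

def run_experiment(true_probs, n_units, seed=0):
    """Simulate an adaptive experiment with binary outcomes: assignment
    probabilities shift toward better-performing arms as data accrue."""
    rng = random.Random(seed)
    k = len(true_probs)
    successes, failures = [0] * k, [0] * k
    for _ in range(n_units):
        arm = thompson_assign(successes, failures, rng)
        if rng.random() < true_probs[arm]:
            successes[arm] += 1
        else:
            failures[arm] += 1
    return successes, failures

# With one clearly better arm (0.6 vs. 0.3), Thompson sampling
# concentrates most assignments on it over time.
s, f = run_experiment([0.3, 0.6], n_units=1000)
```

Because assignment probabilities depend on past outcomes, the resulting data are not i.i.d., which is the source of the inference considerations the tutorial discusses.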

Optimal Policies to Battle the Coronavirus “Infodemic” Among Social Media Users in Sub-Saharan Africa

Joint work with Leah Rosenzweig and Susan Athey.

Alongside the outbreak of the novel coronavirus, an “infodemic” of myths and hoax cures is spreading over online media outlets and social media platforms. Building on the literature on combating fake news, we evaluate experimental interventions designed to decrease sharing of false COVID-19 cures. We use Facebook advertisements to recruit social media users in Kenya and Nigeria, and deliver our interventions with a Messenger chatbot, facilitating observation of treatment effects in a realistic setting. We use a contextual adaptive experimental design to target the most effective interventions, and learn and evaluate a contextual policy, improving our understanding of how to tackle harmful misinformation during an ongoing health crisis. Finally, we bring comparative data to a global problem for which the existing research has largely been limited to the U.S. and Europe.

preanalysis plan

Adaptive Experimental Design: Prospects and Applications in Political Science

Joint work with Alexander Coppock and Donald P. Green.

Experimental researchers in political science frequently face the problem of inferring which of several treatment arms is most effective. They may also seek to estimate mean outcomes under that arm, construct confidence intervals, and test hypotheses. Ordinarily, multi-arm trials conducted using static designs assign participants to each arm with fixed probabilities. However, a growing statistical literature suggests that adaptive experimental designs that dynamically allocate larger assignment probabilities to more promising treatments are better equipped to discover the best-performing arm. Using simulations and empirical applications, we explore the conditions under which such designs hasten the discovery of superior treatments and improve the precision with which their effects are estimated. Recognizing that many scholars seek to assess performance relative to a control condition, we also develop and implement a novel adaptive algorithm that seeks to maximize the precision with which the largest treatment effect is estimated.

paper, forthcoming at AJPS
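Because adaptive designs change assignment probabilities over time, estimating arm means from the resulting data typically requires reweighting. A hypothetical sketch of one standard ingredient, an inverse-probability-weighted (Horvitz–Thompson) arm-mean estimate (not the paper's specific algorithm):

```python
def ipw_arm_mean(arms, outcomes, probs, arm):
    """Inverse-probability-weighted estimate of the mean outcome under `arm`.

    arms[t]     -- the arm unit t was assigned to
    outcomes[t] -- unit t's observed outcome
    probs[t]    -- the probability with which unit t was assigned to arms[t]
                   at the time of assignment (recorded by the design)
    """
    n = len(arms)
    return sum(y / p for a, y, p in zip(arms, outcomes, probs) if a == arm) / n
```

Weighting each observation by the inverse of its (time-varying) assignment probability corrects for the design's tendency to oversample promising arms.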

Experimentation for homogeneous policy change

Joint work with Drew Dimmery.

When the Stable Unit Treatment Value Assumption (SUTVA) is violated and there is interference among units, there is not a uniquely defined Average Treatment Effect (ATE), and alternative estimands may be of interest, among them average unit-level differences in outcomes under different homogeneous treatment policies. We term this target the Homogeneous Assignment Average Treatment Effect (HAATE). We consider approaches to experimental design with multiple treatment conditions under partial interference and, given the estimand of interest, we show that difference-in-means estimators may perform better than correctly specified regression models in finite samples on root mean squared error (RMSE). With errors correlated at the cluster level, we demonstrate that two-stage randomization procedures with intra-cluster correlation of treatment strictly between zero and one may dominate one-stage randomization designs on the same metric. Simulations demonstrate performance of this approach; an application to online experiments at Facebook is discussed.
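The two-stage randomization described above can be sketched as follows: first randomize each cluster to a treatment saturation, then randomize units within the cluster at that saturation. This is a minimal illustration of the general design, not the paper's implementation; the function and variable names are hypothetical.

```python
import random

def two_stage_assign(clusters, saturations, rng=None):
    """Two-stage randomization under partial interference.

    Stage 1: each cluster draws a treatment saturation (share of its
             units to be treated) from `saturations`.
    Stage 2: that share of units is randomized to treatment within
             the cluster.
    Intermediate saturations (strictly between 0 and 1) yield
    intra-cluster correlation of treatment strictly between zero and one.
    """
    rng = rng or random.Random()
    assignment = {}
    for cluster, units in clusters.items():
        s = rng.choice(saturations)                 # stage 1
        treated = set(rng.sample(units, round(s * len(units))))  # stage 2
        for u in units:
            assignment[u] = 1 if u in treated else 0
    return assignment
```

Saturation sets like {0, 1} recover one-stage cluster randomization, while sets like {0.3, 0.7} produce the within-cluster treatment variation the design exploits.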