University of Limerick Institutional Repository

Crowdsourcing hypothesis tests: making transparent how design choices shape research results


dc.contributor.author Landy, Justin F.
dc.contributor.author Jia, Miaolei (Liam)
dc.contributor.author Ding, Isabel L.
dc.contributor.author Viganola, Domenico
dc.contributor.author Tierney, Warren
dc.contributor.author Dreber, Anna
dc.contributor.author Johannesson, Magnus
dc.contributor.author Pfeiffer, Thomas
dc.contributor.author Ebersole, Charles R.
dc.contributor.author Gronau, Quentin F.
dc.contributor.author Ly, Alexander
dc.contributor.author van den Bergh, Don
dc.contributor.author Marsman, Maarten
dc.contributor.author Derks, Koen
dc.contributor.author Wagenmakers, Eric-Jan
dc.contributor.author Proctor, Andrew
dc.contributor.author Bartels, Daniel M.
dc.contributor.author Bauman, Christopher W.
dc.contributor.author Brady, William J.
dc.contributor.author Cheung, Felix
dc.contributor.author Cimpian, Andrei
dc.contributor.author Dohle, Simone
dc.contributor.author Donnellan, Brent M.
dc.contributor.author Hahn, Adam
dc.contributor.author Hall, Michael P.
dc.contributor.author Jiménez-Leal, William
dc.contributor.author Johnson, David J.
dc.contributor.author Lucas, Richard E.
dc.contributor.author Monin, Benoit
dc.contributor.author Montealegre, Andres
dc.contributor.author Mullen, Elizabeth
dc.contributor.author Pang, Jun
dc.contributor.author Ray, Jennifer
dc.contributor.author Reinero, Diego A.
dc.contributor.author Reynolds, Jesse
dc.contributor.author Sowden, Walter
dc.contributor.author Storage, Daniel
dc.contributor.author Su, Runkun
dc.contributor.author Tworek, Christina M.
dc.contributor.author Van Bavel, Jay J.
dc.contributor.author Walco, Daniel
dc.contributor.author Wills, Julian
dc.contributor.author Xu, Xiaobing
dc.contributor.author Yam, Kai Chi
dc.contributor.author Yang, Xiaoyu
dc.contributor.author Cunningham, William A.
dc.contributor.author Schweinsberg, Martin
dc.contributor.author Urwitz, Molly
dc.contributor.author Uhlmann, Eric Luis
dc.date.accessioned 2020-02-04T19:48:48Z
dc.date.available 2020-02-04T19:48:48Z
dc.date.issued 2019
dc.identifier.uri http://hdl.handle.net/10344/8477
dc.description peer-reviewed en_US
dc.description.abstract To what extent are research results influenced by subjective decisions that scientists make as they design studies? Fifteen research teams independently designed studies to answer five original research questions related to moral judgments, negotiations, and implicit cognition. Participants from two separate large samples (total N > 15,000) were then randomly assigned to complete one version of each study. Effect sizes varied dramatically across different sets of materials designed to test the same hypothesis: materials from different teams rendered statistically significant effects in opposite directions for four out of five hypotheses, with the narrowest range in estimates being d = -0.37 to +0.26. Meta-analysis and a Bayesian perspective on the results revealed overall support for two hypotheses, and a lack of support for three hypotheses. Overall, practically none of the variability in effect sizes was attributable to the skill of the research team in designing materials, while considerable variability was attributable to the hypothesis being tested. In a forecasting survey, predictions of other scientists were significantly correlated with study results, both across and within hypotheses. Crowdsourced testing of research hypotheses helps reveal the true consistency of empirical support for a scientific claim. en_US
dc.language.iso eng en_US
dc.publisher American Psychological Association en_US
dc.relation.ispartofseries Psychological Bulletin; 146 (5), pp. 451-479
dc.relation.uri https://doi.org/10.1037/bul0000220
dc.rights © American Psychological Association, 2019. This paper is not the copy of record and may not exactly replicate the authoritative document published in the APA journal. Please do not copy or cite without the authors' permission. en_US
dc.subject crowdsourcing en_US
dc.subject scientific transparency en_US
dc.subject stimulus sampling en_US
dc.subject forecasting en_US
dc.subject conceptual replications en_US
dc.subject research robustness en_US
dc.title Crowdsourcing hypothesis tests: making transparent how design choices shape research results en_US
dc.type info:eu-repo/semantics/article en_US
dc.type.supercollection all_ul_research en_US
dc.type.supercollection ul_published_reviewed en_US
dc.identifier.doi 10.1037/bul0000220
dc.rights.accessrights info:eu-repo/semantics/openAccess en_US
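
The abstract notes that per-team effect sizes were pooled by meta-analysis to assess overall support for each hypothesis. As an illustration only, the sketch below shows a standard DerSimonian-Laird random-effects pooling of team-level standardized effect sizes (Cohen's d). The d values and sampling variances are hypothetical placeholders, not the paper's data, and the paper's actual analysis may differ in detail.

    import numpy as np

    def dersimonian_laird(d, v):
        """Random-effects meta-analysis via the DerSimonian-Laird estimator.

        d : per-study effect sizes (e.g., Cohen's d)
        v : their sampling variances
        Returns the pooled effect, its standard error, and tau^2
        (the estimated between-study variance).
        """
        d, v = np.asarray(d, float), np.asarray(v, float)
        w = 1.0 / v                                   # fixed-effect weights
        theta_fixed = np.sum(w * d) / np.sum(w)       # fixed-effect pooled estimate
        q = np.sum(w * (d - theta_fixed) ** 2)        # Cochran's Q heterogeneity statistic
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (len(d) - 1)) / c)       # between-study variance estimate
        w_star = 1.0 / (v + tau2)                     # random-effects weights
        theta = np.sum(w_star * d) / np.sum(w_star)   # random-effects pooled estimate
        se = np.sqrt(1.0 / np.sum(w_star))
        return theta, se, tau2

    # Hypothetical per-team estimates for one hypothesis (NOT the paper's data);
    # the sign flips mirror the abstract's report of significant effects in
    # opposite directions across teams' materials.
    d_team = [-0.37, -0.10, 0.05, 0.12, 0.26]
    v_team = [0.01] * 5   # placeholder sampling variances

    theta, se, tau2 = dersimonian_laird(d_team, v_team)
    print(f"pooled d = {theta:.3f} (SE {se:.3f}), tau^2 = {tau2:.3f}")

Under these assumptions, a pooled estimate near zero combined with a large tau^2 would illustrate the abstract's central point: conclusions about a hypothesis can depend heavily on which team's materials are used to test it.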

