Conference on Reproducibility and Replicability in Economics and Social Sciences (CRRESS) Final Outcomes Report

Authors:

  • Aleksandr Michuda (Swarthmore College)
  • Lars Vilhuber (Cornell University)

Published: September 7, 2025

This final outcomes report is for NSF Award #2217493.

1 Summary

The Conference on Reproducibility and Replicability in Economics and the Social Sciences (CRRESS) was a series of virtual and in-person panels on the topics of reproducibility, replicability, and transparency in the social sciences. The purpose of scientific publishing is the dissemination of robust research findings, exposing them to the scrutiny of peers and other interested parties. Scientific articles should accurately and completely provide information on the origin and provenance of data and on the analytical and computational methods used. Yet in recent years, doubts have been voiced about the adequacy of the information provided in scientific articles and their addenda. The conference series addressed several stages of this process: the initiation of research, the conduct of research, the preparation of research for publication, and the scrutiny after publication. Undergraduates, graduate students, and career researchers were able to learn about best practices for transparent, reproducible, and scientifically sound research in the social sciences. The materials produced during the conference series are permanently archived and freely available to all interested parties.

2 Introduction

The purpose of scientific publishing is the dissemination of robust research findings, exposing them to the scrutiny of peers. Key to this endeavor is documenting the provenance of those findings. Practices during the course of research, together with subsequent publication, peer review, and dissemination practices and tools, all interact to (ideally) enable a discourse about the veracity of scientific claims.

Whether or not one actually believes there is a “replication crisis” (see Fanelli (2018) for a discussion), recent years have seen an increased emphasis on various methods that support improved provenance documentation. These include pre-registration (Nosek et al. 2018, 2019), pre-analysis plans (Banerjee et al. 2020; Olken 2015), registered reports (Hardwicke and Ioannidis 2018; Chambers 2014; Journal of Development Economics 2019), greater availability of working papers and pre-prints in disciplines other than economics, statistics, and physics (Vilhuber 2020), and increasingly stringent journal policies surrounding data and code availability, including active review and verification of replication packages (Jacoby et al. 2017; Christian et al. 2018; Editors 2021; Vilhuber 2019).

In what follows, we adopt the NASEM definition of [computational] reproducibility as “obtaining consistent results using the same input data, computational steps, methods, and code, and conditions of analysis” and of replicability as “obtaining consistent results across studies aimed at answering the same scientific question, each of which has obtained its own data” (National Academies of Sciences, Engineering, and Medicine 2019, ch. 3). We use “replication packages” to refer to those materials (data, computer code, and instructions) linked to a specific publication that facilitate the replication of the manuscript’s results by others, and which should themselves be computationally reproducible. Note that the literature sometimes uses other definitions.

The verification of replication packages, which includes not just checks of the computational reproducibility of the provided materials but also verification of the documented data provenance and completeness of those materials, is not a magical solution that will solve the “replicability crisis.” Replication packages may be reproducible, but wrong (see, for instance, the recent discussion surrounding Simonsohn et al. 2021). Verification also faces educational and procedural barriers. Should journals, which act at the tail end of the scientific production process, be the verifiers of reproducibility, as some journals have been doing (Vilhuber 2021; Christian et al. 2018), or should verification be a natural part of the post-publication assessment by the scientific community, with non-reproducible articles being cited less (as claimed by Hamermesh 2007) or being retracted (Journal of Finance 2021)? Should scientists’ work be reproducible at every stage of the research process, even prior to submission to journals, and what does that imply for funding, for technical infrastructure, and for the training of undergraduate and graduate students? The consensus on answers to these questions is still emerging, and needs to be discussed by all researchers in the discipline, as such a consensus will guide how disciplinary research is conducted. Most discussions on these topics, however, occur in workshops and conferences that are not the core disciplinary conferences attended by the typical social scientist, other than those specifically interested in reproducibility as a research topic.

The goal of the webinar and conference series, which we called the “Conference on Reproducibility and Replicability in Economics and Social Sciences” (CRRESS), was to make these topics accessible to all researchers by pulling them out of specialized conferences and presenting them to a broad audience through a consistent and logical sequence of sessions. CRRESS also freed us from a common constraint: general conferences rarely accept multiple sessions focused on these topics.

The topics that CRRESS covered were selected to inform researchers about themes, tools, infrastructure, and approaches that are not typically taught or learned in current or past curricula. The topics and sessions are listed under Sessions below.

2.1 Intellectual Merit (as originally stated)

The audience of the webinar and conference series will gain insights into the full gamut of topics related to the initiation of research, the conduct of research, the preparation of research for publication, and possibly the scrutiny after publication related to reproducibility and replicability. The topics chosen for the series are not usually part of disciplinary seminars or conferences, and are brought to a broader audience here for the first time. The availability of permanent artifacts (presentations, recordings, manuscripts) after the conference will allow this to be a resource with persistent impacts.

2.2 Broader Impacts (as originally stated)

The webinar and conference series will be available to any non-participant through persistent artifacts: videos, presentations, and manuscripts. Undergraduates, graduate students, and career researchers will be able to learn about best practices for transparent, reproducible, and scientifically sound research in the social sciences, greatly expanding the impact of the series. In turn, this will allow research to be made more credible, and to be perceived and verified as such by policy makers and a public that wishes to implement evidence-based policymaking.

3 Methods

We put together an organizing committee composed of the PIs together with Ian Schmutte (University of Georgia) and Marie Connolly (Université du Québec à Montréal, Canada). The committee organized 10 webinars over the course of the 2022-2023 academic year. They were broadly advertised through social media, email lists, and personal networks. Each session was moderated by one of the organizers and featured two to three panelists who presented on the topic of the session. Each session lasted approximately 90 minutes, with presentations lasting approximately 45 minutes, followed by a moderated discussion and Q&A with the audience. All sessions were recorded, made available on YouTube, and archived in Cornell University’s eCommons repository.

3.1 Sessions

3.1.1 Session 1: Institutional support: Should journals verify reproducibility?

Date: 2022-09-27

Participants: Moderator: Lars Vilhuber; Panelists: Guido Imbens, Tim Salmon, Toni Whited

Recording: YouTube

Archive: eCommons

3.1.2 Session 2: Reproducibility and ethics - IRBs and beyond

Date: 2022-10-25

Participants: Moderator: Lars Vilhuber; Panelists: Michelle N. Meyer, Shea Swauger, Sarah Kopper

Recording: YouTube

Archive: eCommons

3.1.3 Session 3: Should teaching reproducibility be a part of undergraduate education or curriculum?

Date: 2022-11-20

Participants: Moderator: Ian Schmutte; Panelists: Diego Mendez-Carbajo, Richard Ball, Lars Vilhuber

Recording: YouTube

Archive: eCommons

3.1.4 Session 4: Reproducibility and confidential or proprietary data: can it be done?

Date: 2022-12-13

Participants: Moderator: Aleksandr Michuda; Panelists: John Horton, Paulo Guimarães, Lars Vilhuber

Recording: YouTube

Archive: eCommons

3.1.5 Session 5: Disciplinary support: why is reproducibility not uniformly required across disciplines?

Date: 2023-01-31

Participants: Moderator: Lars Vilhuber; Panelists: Kim Weeden, Betsy Sinclair, Hilary Hoynes

Recording: YouTube

Archive: eCommons

3.1.6 Session 6: Institutional support: How do journal reproducibility verification services work?

Date: 2023-02-28

Participants: Moderator: Marie Connolly; Panelists: Christophe Perignon, Ben Greiner, Thu-Mai Christian

Recording: YouTube

Archive: eCommons

3.1.7 Session 7: Why can or should research institutions publish replication packages?

Date: 2023-03-28

Participants: Moderator: Aleksandr Michuda; Panelists: Graham MacDonald, Limor Peer, Courtney Butler

Recording: YouTube

Archive: eCommons

3.1.8 Session 8: Should funders require reproducible archives?

Date: 2023-04-25

Participants: Moderator: Lars Vilhuber; Panelists: Martin Halbert, Sebastian Martinez, Stuart Buck

Recording: YouTube

Archive: eCommons

3.1.9 Session 9: Reproducibility, confidentiality, and open data mandates

Date: 2023-05-30

Participants: Moderator: Marie Connolly; Panelists: Kimberly McGrail, S. Martin Taylor, Matthew Lucas

Recording: YouTube

Archive: eCommons

3.1.10 Session 10: The integration of reproducibility into social science graduate education

Date: 2023-06-27

Participants: Moderator: Marie Connolly; Panelists: Julian Reif, Jeremy Freese, David Wasser

Recording: YouTube

Archive: eCommons

4 Audience

The audience consists of researchers, academic or otherwise, in the social sciences. Speakers were drawn from economics, sociology, and political science, with affiliations in academia, think tanks, government agencies, and non-governmental funding organizations. The materials produced during the conference have been used and referenced in tutorials provided to undergraduate and graduate students in various workshops, and in presentations to researchers at academic conferences.

5 Outputs

5.1 Web artifacts

The Conference on Reproducibility and Replicability in Economics and the Social Sciences website is found at https://labordynamicsinstitute.github.io/crress/ as of 2025-10-30 and will be maintained there indefinitely. The main page is preserved on the Wayback Machine. The underlying GitHub repository https://github.com/labordynamicsinstitute/crress/ was archived on Zenodo (Vilhuber and Michuda 2025) prior to the creation of this report; an updated deposit will include this report.

5.2 Online Book

Various authors contributed written texts, which are available at https://labordynamicsinstitute.github.io/crress-book/ as an online book. The book is preserved on the Wayback Machine. The underlying GitHub repository https://github.com/labordynamicsinstitute/crress-book has been preserved on Zenodo (Michuda et al. 2025).

5.3 Journal publications

A number of the author contributions were organized into a special section of the Harvard Data Science Review (HDSR) in Issue 5.3 (Summer 2023), edited by the moderators of the webinar series: Lars Vilhuber, Ian Schmutte, Aleksandr Michuda, and Marie Connolly (Vilhuber et al. 2023). Subsequently, PI Lars Vilhuber was designated as Column Editor for the (ongoing) HDSR column on Reinforcing Reproducibility and Replicability.

As of October 2025, the following articles have been published in the special section as well as the column, in addition to the aforementioned introduction: Ball (2023); Buck (2024); Butler (2023); Guimarães (2023); Hoynes (2023); Jones (2024); MacDonald (2023); Mendez-Carbajo and Dellachiesa (2023); Ottone and Peer (2025); Peer (2024); Pérignon (2024); Salmon (2023); Weeden (2023); Whited (2023).

Vilhuber continues, for now, to edit the column. The most cited article, Weeden’s “Crisis? What Crisis? Sociology’s Slow Progress Toward Scientific Transparency” (Weeden 2023), has so far been cited six times.

5.4 Sessions

Sessions were live-moderated by one of the organizers. A total of 149 people registered for at least one session. Unfortunately, not all per-session live-participant data were preserved. For the five sessions for which we do have information, between 19 and 112 people attended each session live. Presentation slides from most authors can be found on the MetaArXiv OSF preprint server, but are not otherwise preserved.

5.5 Recordings

All sessions were recorded.

5.5.1 YouTube

Session recordings are available for immediate viewing on YouTube. The three most-watched sessions were “Should journals verify reproducibility?”, “Reproducibility and ethics - IRBs and beyond”, and “Disciplinary support: why is reproducibility not uniformly required across disciplines?”. YouTube recordings are available under a Creative Commons Attribution (CC BY 4.0) license. Auto-generated captions are available.

Title | Views | Published
Should journals verify reproducibility? | 204 | Oct 1, 2022
Reproducibility and ethics - IRBs and beyond | 107 | Nov 1, 2022
Disciplinary support: why is reproducibility not uniformly required across disciplines? | 94 | Feb 5, 2023
Should teaching reproducibility be a part of undergraduate education or curriculum? | 79 | Dec 4, 2022
Reproducibility and confidential or proprietary data: can it be done? | 74 | Jan 22, 2023
Reproducibility, confidentiality, and open data mandates (at CEA) | 70 | Jul 30, 2023
Why can or should research institutions publish replication packages? | 56 | Apr 7, 2023
Institutional support: How do journal reproducibility verification services work? | 45 | Mar 6, 2023
Should funders require reproducible archives? | 35 | May 14, 2023
The integration of reproducibility into social science graduate education | 34 | Aug 23, 2023
Should funders require reproducible archives? (v1) | NA | May 8, 2023

5.5.2 Curated Recordings

The recordings are preserved at Cornell University Library’s eCommons repository.[1] eCommons archives are also under a Creative Commons Attribution (CC BY 4.0) license. The captions generated by YouTube are preserved as part of the eCommons deposit.

  • “The integration of reproducibility into social science graduate education.” (2023). Reif, Julian; Freese, Jeremy; Wasser, David; Connolly, Marie. https://doi.org/10.7298/bj42-e619
  • “Reproducibility, confidentiality, and open data mandates” (2023). McGrail, Kimberly; Taylor, S. Martin; Lucas, Matthew; Connolly, Marie. https://doi.org/10.7298/hcry-nz34
  • “Should funders require reproducible archives?” (2023). Halbert, Martin; Martinez, Sebastian; Buck, Stuart; Vilhuber, Lars. https://doi.org/10.7298/21b2-yt23
  • “Institutional support: How do journal reproducibility verification services work?” (2023). Perignon, Christophe; Greiner, Ben; Christian, Thu-Mai; Connolly, Marie. https://doi.org/10.7298/0g2q-d958
  • “Reproducibility and confidential or proprietary data: can it be done?” (2023). Horton, John; Guimarães, Paulo; Vilhuber, Lars; Michuda, Aleksandr. https://doi.org/10.7298/3mjp-3h26
  • “Disciplinary support: why is reproducibility not uniformly required across disciplines?” (2023). Weeden, Kim; Sinclair, Betsy; Hoynes, Hilary; Vilhuber, Lars. https://doi.org/10.7298/PKWJ-GM89
  • “Why can or should research institutions publish replication packages?” (2023). MacDonald, Graham; Peer, Limor; Butler, Courtney; Michuda, Aleksandr. https://doi.org/10.7298/pntg-rw59
  • “Should teaching reproducibility be a part of undergraduate education or curriculum?” (2022). Mendez-Carbajo, Diego; Ball, Richard; Vilhuber, Lars; Schmutte, Ian. https://doi.org/10.7298/KBZA-0K11
  • “Reproducibility and ethics - IRBs and beyond” (2022). Meyer, Michelle N.; Swauger, Shea; Kopper, Sarah; Vilhuber, Lars. https://doi.org/10.7298/cvqw-v588
  • “Institutional support: Should journals verify reproducibility?” (2022). Imbens, Guido; Salmon, Tim; Whited, Toni; Vilhuber, Lars. https://doi.org/10.7298/992J-RF71

6 References

Ball, Richard. 2023. “‘Yes We Can!’: A Practical Approach to Teaching Reproducibility to Undergraduates.” Harvard Data Science Review 5 (3). https://doi.org/10.1162/99608f92.9e002f7b.
Banerjee, Abhijit, Esther Duflo, Amy Finkelstein, Lawrence Katz, Benjamin Olken, and Anja Sautmann. 2020. In Praise of Moderation: Suggestions for the Scope and Use of Pre-Analysis Plans for RCTs in Economics. w26993. National Bureau of Economic Research. https://doi.org/10.3386/w26993.
Buck, Stuart. 2024. “We Should Do More Direct Replications in Science.” Harvard Data Science Review 6 (3). https://doi.org/10.1162/99608f92.4eccc443.
Butler, Courtney R. 2023. “Publishing Replication Packages: Insights From the Federal Reserve Bank of Kansas City.” Harvard Data Science Review 5 (3). https://doi.org/10.1162/99608f92.aba61304.
Chambers, Chris. 2014. “Registered Reports: A Step Change in Scientific Publishing.” In Reviewers’ Update. https://www.elsevier.com/reviewers-update/story/innovation-in-publishing/registered-reports-a-step-change-in-scientific-publishing.
Christian, Thu-Mai, Sophia Lafferty-Hess, William Jacoby, and Thomas Carsey. 2018. “Operationalizing the Replication Standard: A Case Study of the Data Curation and Verification Workflow for Scholarly Journals.” International Journal of Digital Curation 13 (1). https://doi.org/10.2218/ijdc.v13i1.555.
Editors. 2021. “Supporting Computational Reproducibility Through Code Review.” Nature Human Behaviour 5 (8): 965–66. https://doi.org/10.1038/s41562-021-01190-w.
Fanelli, Daniele. 2018. “Opinion: Is Science Really Facing a Reproducibility Crisis, and Do We Need It To?” Proceedings of the National Academy of Sciences 115 (March): 2628–31. https://doi.org/10.1073/pnas.1708272114.
Guimarães, Paulo. 2023. “Reproducibility With Confidential Data: The Experience of BPLIM.” Harvard Data Science Review 5 (3). https://doi.org/10.1162/99608f92.54a00239.
Hamermesh, Daniel S. 2007. “Viewpoint: Replication in Economics.” Canadian Journal of Economics 40 (3): 715–33. https://doi.org/10.1111/j.1365-2966.2007.00428.x.
Hardwicke, Tom E., and John P. A. Ioannidis. 2018. “Mapping the Universe of Registered Reports.” Nature Human Behaviour 2 (11): 793–96. https://doi.org/10.1038/s41562-018-0444-y.
Hoynes, Hilary. 2023. “Reproducibility in Economics: Status and Update.” Harvard Data Science Review 5 (3). https://doi.org/10.1162/99608f92.80a1b88b.
Jacoby, William G., Sophia Lafferty-Hess, and Thu-Mai Christian. 2017. “Should Journals Be Responsible for Reproducibility?” In Inside Higher Ed. https://www.insidehighered.com/blogs/rethinking-research/should-journals-be-responsible-reproducibility.
Jones, Maria. 2024. “Introducing Reproducible Research Standards at the World Bank.” Harvard Data Science Review 6 (4). https://doi.org/10.1162/99608f92.21328ce3.
Journal of Development Economics. 2019. Registered Reports at JDE: Lessons Learned so Far. https://www.journals.elsevier.com/journal-of-development-economics/announcements/registered-reports-at-jde.
Journal of Finance. 2021. “Retracted: Risk Management in Financial Institutions.” The Journal of Finance n/a (n/a). https://doi.org/10.1111/jofi.13064.
MacDonald, Graham. 2023. “Open Data and Code at the Urban Institute.” Harvard Data Science Review 5 (3). https://doi.org/10.1162/99608f92.a631dfc5.
Mendez-Carbajo, Diego, and Alejandro Dellachiesa. 2023. “Data Citations and Reproducibility in the Undergraduate Curriculum.” Harvard Data Science Review 5 (3). https://doi.org/10.1162/99608f92.c2835391.
Michuda, Aleksandr, Lars Vilhuber, Tim Salmon, et al. 2025. Conference on Reproducibility and Replicability in Economics and Social Sciences (CRRESS) Book. Zenodo. https://doi.org/10.5281/zenodo.17477929.
National Academies of Sciences, Engineering, and Medicine. 2019. Reproducibility and Replicability in Science. National Academies Press. https://doi.org/10.17226/25303.
Nosek, Brian A., Emorie D. Beck, Lorne Campbell, et al. 2019. “Preregistration Is Hard, And Worthwhile.” Trends in Cognitive Sciences 23 (10): 815–18. https://doi.org/10.1016/j.tics.2019.07.009.
Nosek, Brian A., Charles R. Ebersole, Alexander C. DeHaven, and David T. Mellor. 2018. “The Preregistration Revolution.” Proceedings of the National Academy of Sciences 115 (11): 2600–2606. https://doi.org/10.1073/pnas.1708274114.
Olken, Benjamin A. 2015. “Promises and Perils of Pre-analysis Plans.” Journal of Economic Perspectives 29 (3): 61–80. https://doi.org/10.1257/jep.29.3.61.
Ottone, Nicholas, and Limor Peer. 2025. “Code Review, Reproducibility, and Improving the Scholarly Record.” Harvard Data Science Review 7 (3). https://doi.org/10.1162/99608f92.f9d748d4.
Peer, Limor. 2024. “Why and How We Share Reproducible Research at Yale University’s Institution for Social and Policy Studies.” Harvard Data Science Review 6 (1). https://doi.org/10.1162/99608f92.dca148ba.
Pérignon, Christophe. 2024. “The Role of Third-Party Verification in Research Reproducibility.” Harvard Data Science Review 6 (2). https://doi.org/10.1162/99608f92.6d4bf9eb.
Salmon, Timothy C. 2023. “The Case for Data Archives at Journals.” Harvard Data Science Review 5 (3). https://doi.org/10.1162/99608f92.db2a2554.
Vilhuber, Lars. 2019. “Report by the AEA Data Editor.” AEA Papers and Proceedings 109 (May): 718–29. https://doi.org/10.1257/pandp.109.718.
Vilhuber, Lars. 2020. “Reproducibility and Replicability in Economics.” Harvard Data Science Review 2 (4). https://doi.org/10.1162/99608f92.4f6b9e67.
Vilhuber, Lars. 2021. “Report by the AEA Data Editor.” AEA Papers and Proceedings 111 (May): 808–17. https://doi.org/10.1257/pandp.111.808.
Vilhuber, Lars, and Aleksandr Michuda. 2025. Labordynamicsinstitute/Crress: CRRESS Github Repository. Zenodo. https://doi.org/10.5281/zenodo.17477769.
Vilhuber, Lars, Ian Schmutte, Aleksandr Michuda, and Marie Connolly. 2023. “Reinforcing Reproducibility and Replicability: An Introduction.” Harvard Data Science Review 5 (3). https://doi.org/10.1162/99608f92.9ba2bd43.
Weeden, Kim A. 2023. “Crisis? What Crisis? Sociology’s Slow Progress Toward Scientific Transparency.” Harvard Data Science Review 5 (4). https://doi.org/10.1162/99608f92.151c41e3.
Whited, Toni. 2023. “Costs and Benefits of Reproducibility in Finance and Economics.” Harvard Data Science Review 5 (3). https://doi.org/10.1162/99608f92.63de8e58.

Footnotes

  1. From the eCommons Preservation Support page: “Cornell University Library is committed to responsible and sustainable management of works deposited in eCommons and to ensuring long-term access to those works. […] Current long-term preservation strategies and technologies employed by eCommons are shaped by the Open Archival Information System (OAIS) reference model (ISO 14721:2012) and informed by relevant international standards and emerging best practices.”↩︎

Copyright 2025, Aleksandr Michuda, Lars Vilhuber