Guidelines for Computational Reproducibility in Economics
Introduction
  Beyond binary judgments
  Stages of the exercise
  Recording the results of the exercise
  Reproduction strategies
1 Scoping
  1.1 From candidate to declared paper
    1.1.1 Candidate paper entries in the ACRE Database
  1.2 Scoping your declared paper
    1.2.1 Read and summarize the paper
    1.2.2 Record the scope of the exercise
  1.3 Identify your relevant timeline
2 Assessment
  2.1 Describe the inputs
    2.1.1 Describe the data sources and raw data
    2.1.2 Describe the analytic data sets
    2.1.3 Describe the code scripts
  2.2 Connect each output to all its inputs
    2.2.1 Complete workflow information
    2.2.2 Incomplete workflow information
    2.2.3 Unused data sources
  2.3 Assign a reproducibility score
    2.3.1 Levels of computational reproducibility for a specific output
    2.3.2 Reproducibility dimensions at the paper level
3 Improvements
  3.1 Types of output-level improvements
    3.1.1 Adding raw data: missing files or metadata
    3.1.2 Adding missing analytic data files
    3.1.3 Adding missing analysis code
    3.1.4 Adding missing data-cleaning code
    3.1.5 Debugging analysis code
    3.1.6 Debugging cleaning code
    3.1.7 Reporting results
  3.2 Types of paper-level improvements
    3.2.1 Reporting improvements
4 Checking for Robustness
  4.1 Mapping the universe of robustness checks
  4.2 Proposing a specific robustness check
5 Concluding the Reproduction
  5.1 Outputs
  5.2 Possible anonymity and data sharing
6 Guidance for a Constructive Exchange Between Reproducers and Original Authors
  6.1 For reproducers contacting the authors of the original study
    6.1.1 Contacting the original author(s) when there is no reproduction package
    6.1.2 Contacting the original author(s) to request specific missing items of a reproduction package
    6.1.3 Asking for additional guidance when some materials have been shared
    6.1.4 Responding when the original author has refused to share for undisclosed reasons
    6.1.5 Responding when the original author has refused to share due to legal or ethical restrictions on the data
    6.1.6 Contacting the original author to share the results of your reproduction exercise
    6.1.7 Responding to hostile responses from original authors
  6.2 For original authors responding to requests from reproducers
    6.2.1 Responding to a repeated request
    6.2.2 Acknowledging that some information is missing
    6.2.3 Acknowledging that some material is still embargoed for future research
    6.2.4 Responding to incomplete or aggressive requests from reproducers
7 Reproduction Diagrams
  7.1 Different scenarios
    7.1.1 Complete
    7.1.2 Raw data and analytic data present, but cleaning code missing
    7.1.3 Possibly missing code to produce a display item (reproduction attempt by an anonymous student, conducted as part of the Graduate Development Economics course, UC Berkeley)
    7.1.4 Long, complicated tree (reproduction attempt by an anonymous student, conducted as part of the Graduate Development Economics course, UC Berkeley)
8 Tips and Resources for a Reproducible Workflow
  8.1 Summary of reproducible workflow (Chapter 11) from Christensen, Freese, and Miguel (2019)
  8.2 Links
9 Contributions
  9.1 Contributing feedback on these guidelines
  9.2 List of contributors: guidelines content and source code
10 Acknowledgments
11 Definitions
  11.1 Concepts in reproducibility
  11.2 Concepts in the ACRE exercise and the platform
References
Chapter 10: Acknowledgments

Support for the development of these guidelines was provided by Arnold Ventures.