We organize the course around randomized design methods, a widely used identification strategy in both development research and industry A/B testing. The main purpose of this course is to provide a deep dive into the design of randomized controlled trials (RCTs) and the analysis of the data they produce.
We cover development topics as examples of this strategy (e.g. microfinance, secondary schooling, and women’s labor force participation) and discuss other methods as we compare RCT results to quasi-experimental results within topical case studies. This course provides students with tangible, transferable skills.
The participants in this course typically work in policy-oriented institutions and hubs and/or are conducting graduate studies at the Master’s or PhD level.
Faculty
Are you passionate about RCTs and want to deepen your knowledge of the topic? This could be the course for you!
Deepen your knowledge and understanding of RCTs
By the end of the course, participants will have:
Gained a deep understanding of randomized design methods used in development research and A/B testing
Learned to design and analyze data from randomized controlled trials (RCTs)
Explored development topics such as microfinance, secondary schooling, and women’s labor force participation through RCT examples
Compared RCT results with quasi-experimental methods in topical case studies
Acquired practical, transferable skills for applying RCTs and other evaluation methods in real-world contexts
Program Syllabus for Randomized Control Trials (RCTs) in Development Economics
The course will cover the following topics:
The Randomization Revolution in Development Economics
Topics covered: potential outcomes, selection bias, counterfactuals, the experimental ideal, the history of randomized experiments in the social sciences
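As a preview of the potential-outcomes framing, a small simulation can show why self-selected comparisons are biased while randomized ones recover the true effect. This is an illustrative sketch with simulated data, written in Python for convenience (the course exercises themselves are in R and Stata); all names and numbers are made up.

```python
import random

random.seed(0)

# Simulated potential outcomes: y0 (untreated), y1 (treated); true effect = 2.
n = 10_000
y0 = [random.gauss(10, 2) for _ in range(n)]
y1 = [y + 2 for y in y0]

# Self-selection: units with high y0 opt into treatment -> biased comparison.
chose = [y > 10 for y in y0]
naive = (sum(y1[i] for i in range(n) if chose[i]) / sum(chose)
         - sum(y0[i] for i in range(n) if not chose[i]) / (n - sum(chose)))

# Random assignment: treatment is independent of potential outcomes.
assign = [random.random() < 0.5 for _ in range(n)]
rct = (sum(y1[i] for i in range(n) if assign[i]) / sum(assign)
       - sum(y0[i] for i in range(n) if not assign[i]) / (n - sum(assign)))

print(round(naive, 2))  # well above 2: selection bias
print(round(rct, 2))    # close to the true effect of 2
```

The naive comparison confounds the treatment effect with who selects into treatment; randomization removes that selection term, which is the "experimental ideal" the first unit develops.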
Research Design for Randomistas
Topics covered: statistical tests, power calculations, stratification, clustering
Analyzing Data From Randomized Experiments, Part 1
Analyzing Data From Randomized Experiments, Part 2
Topics covered: attrition, permutation tests, randomization inference, multiple test corrections
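The randomization-inference idea covered here (permutation tests in the spirit of Fisher, and of Young 2019 below) can be illustrated with a short sketch: re-randomize the treatment labels many times and ask how often a placebo assignment produces a difference as large as the observed one. This is a toy Python illustration with made-up data, not course material (exercises are in R and Stata).

```python
import random

random.seed(1)

def permutation_pvalue(treated, control, reps=5000):
    """Randomization-inference p-value: shuffle treatment labels and count
    how often the placebo difference in means is at least as large (in
    absolute value) as the observed difference."""
    obs = sum(treated) / len(treated) - sum(control) / len(control)
    pooled = treated + control
    n_t = len(treated)
    extreme = 0
    for _ in range(reps):
        random.shuffle(pooled)
        diff = (sum(pooled[:n_t]) / n_t
                - sum(pooled[n_t:]) / len(control))
        if abs(diff) >= abs(obs):
            extreme += 1
    return extreme / reps

treated = [12.1, 11.4, 13.0, 12.7, 11.9, 12.5]
control = [10.2, 10.9, 11.1, 10.5, 11.3, 10.8]
print(permutation_pvalue(treated, control))  # small p: effect unlikely by chance
```

Because the p-value comes from the actual assignment mechanism rather than an asymptotic approximation, it remains valid in small samples — the central argument of the Young (2019) reading.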
Replication and Pre-Analysis Plans
Topics covered: the replication crisis in the social sciences, pre-analysis plans, practical issues in research design and data collection, covariate selection
List of References
The following texts will help participants prepare for the course. Texts are split by topics covered.
The Randomization Revolution in Development Economics
Glennerster and Takavarasha (2013): Running Randomized Evaluations, chapters 1 to 3 (available on JSTOR).
Fisher (1935): Design of Experiments, chapter II (available online).
Additional Related Readings:
Angrist and Pischke (2015): Mastering Metrics, chapter 1 (available online).
Gerber and Green (2012): Field Experiments, chapters 1 and 2.
Jamison (2019): “The Entry of Randomized Assignment into the Social Sciences,” Journal of Causal Inference, 7(1).
Parker (2010): “The Poverty Lab,” The New Yorker.
Research Design for Randomistas
Bruhn and McKenzie (2009): “In Pursuit of Balance: Randomization in Practice in Development Field Experiments,” American Economic Journal: Applied Economics, 1(4): 200–232.
Duflo, Glennerster, and Kremer (2007): “Using Randomization in Development Economics Research: A Toolkit,” Handbook of Development Economics, Volume 4, Chapter 61: 3895–3962 (available from Elsevier or MIT/CEPR).
McKenzie (2012): “Beyond baseline and follow-up: The case for more T in experiments,” Journal of Development Economics, 99(2): 210–221.
Young (2019): “Channeling Fisher: Randomization Tests and the Statistical Insignificance of Seemingly Significant Experimental Results,” Quarterly Journal of Economics, 134(2): 557–598.
Additional Related Readings:
Gerber and Green (2012): Field Experiments, chapters 3 and 4.
Glennerster and Takavarasha (2013): Running Randomized Evaluations, chapters 4 to 7.
Analyzing Data from Randomized Experiments, Part 1
Bruhn and McKenzie (2009): “In Pursuit of Balance: Randomization in Practice in Development Field Experiments,” American Economic Journal: Applied Economics, 1(4): 200–232.
Athey and Imbens (2016): “Recursive partitioning for heterogeneous causal effects,” Proceedings of the National Academy of Sciences, 113(27): 7353–7360.
Wager and Athey (2018): “Estimation and inference of heterogeneous treatment effects using random forests,” Journal of the American Statistical Association, 113(523): 1228–1242.
Additional Related Readings:
Glennerster and Takavarasha (2013): Running Randomized Evaluations, chapter 8.
James, Witten, Hastie, and Tibshirani (2021): “Tree-Based Methods.” In An Introduction to Statistical Learning, second edition.
Chernozhukov, Fernandez-Val, and Melly (2013): “Inference on Counterfactual Distributions,” Econometrica, 81(6): 2205–2268.
Analyzing Data from Randomized Experiments, Part 2
Young (2019): “Channeling Fisher: Randomization Tests and the Statistical Insignificance of Seemingly Significant Experimental Results,” Quarterly Journal of Economics, 134(2): 557–598.
Lee (2009): “Training, Wages, and Sample Selection: Estimating Sharp Bounds on Treatment Effects,” Review of Economic Studies, 76(3): 1071–1102.
Anderson (2008): “Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects,” Journal of the American Statistical Association, 103(484): 1481–1495.
Replication and Pre-Analysis Plans
Ozier (2021): “Replication Redux: The Reproducibility Crisis and the Case of Deworming,” World Bank Research Observer, 36(1): 101–130.
Jakiela, Ozier, Fernald, and Knauer (2020): “Evaluating the Effects of an Early Literacy Intervention,” Journal of Development Economics, registered report conditionally accepted based on Stage 1 Pre-Results Review.
James, Witten, Hastie, and Tibshirani (2021): “Linear Model Selection and Regularization.” In An Introduction to Statistical Learning, second edition.
Leaver, Ozier, Serneels, and Zeitlin (2021): “Recruitment, effort, and retention effects of performance contracts for civil servants: Experimental evidence from Rwandan primary schools,” American Economic Review, forthcoming.
Leaver, Ozier, Serneels, and Zeitlin (2018): “Power to the Plan” (available online on the World Bank’s Development Impact blog).
Additional Related Readings:
Brodeur, Lé, Sangnier, and Zylberberg (2016): “Star Wars: The Empirics Strike Back,” American Economic Journal: Applied Economics, 8(1): 1–32.
Christensen and Miguel (2018): “Transparency, Reproducibility, and the Credibility of Economics Research,” Journal of Economic Literature, 56(3): 920–980.
Coffman and Niederle (2015): “Pre-analysis Plans Have Limited Upside, Especially Where Replications Are Feasible,” Journal of Economic Perspectives, 29(3): 81–98.
Olken (2015): “Promises and Perils of Pre-Analysis Plans,” Journal of Economic Perspectives, 29(3): 61–80.
Software / Hardware
Every participant taking this course will receive a free, time-limited personal Stata license several days before the start of the Summer School.
Empirical exercises will be offered in parallel in R and Stata.
Participants should either install Stata on their laptops for use during the practical sessions, or be comfortable working in R and have RStudio installed.
Why join our Summer School?
All BSE Summer courses are taught to the same high standard as our Master’s programs. Join us to:
1. Network with like-minded peers
2. Study in vibrant Barcelona
3. Learn from world-renowned faculty
Admissions and Requirements
Thinking of applying? Please check the admissions requirements below.
Program date: July 7–11, 2025
Application deadline: June 30, 2025
Requirements
Summer School applicants normally demonstrate one or more of the following:
A strong background in Economics or a field closely related to the course topic (Statistics, Law, etc.)
Postgraduate degree or current Master’s/PhD studies related to the course topic
Relevant professional experience
Schedule
Here is the schedule for this edition of the BSE Randomized Control Trials (RCTs) in Development Economics course.
Daily, Monday July 7 through Friday July 11:
11:30 - 13:30  Lecture
16:15 - 17:45  Practical
Credit transfers (ECTS)
To be eligible for credit transfer, students will be assessed through problem sets assigned during the course.