Please note the dates for this course:
- Jun 3 – Jun 30, 2024 for self-paced study
- Jul 1 – 3, 2024 for the 3 live days
- Jun 3 – Jul 24, 2024 for online access to the video materials
Those registering early will have access to the lecture materials from May 20.
Prices include VAT.
TRAINER’S BIO
Please click here for CV.
COURSE OVERVIEW
This comprehensive course delves into the principles of research design and explores all main quantitative evaluation methods: randomised experiments, instrumental variables, sharp and fuzzy regression discontinuity designs, regression methods, matching methods and longitudinal approaches such as before-after, difference-in-differences and synthetic controls.
The lectures are thoughtfully structured to accommodate participants with diverse backgrounds. Each topic is presented at various levels, ensuring accessibility through intuitive explanations, optional formal derivations, in-depth discussions, and practical examples. The live classes offer a summary of each topic and provide an interactive platform for participants to ask questions and actively engage in guided practical sessions where they will apply each evaluation method to real-world data.
By the end of this course, participants will have acquired a solid foundation in research design principles and possess practical skills to implement quantitative evaluation methods effectively.
FORMAT OF THE COURSE
The Remote PEM Course offers a flexible learning experience through a combination of self-paced study and live online meetings. The course is structured into two parts:
Part 1) Pre-recorded Lectures (21.5 hours): The pre-recorded lecture videos will be made available four weeks (six weeks for early registrants) prior to the course start date, allowing participants to study the material at their own pace. The videos remain accessible for a further three weeks after the live sessions, but it is recommended that participants complete the lectures before the live meetings, which are designed to focus on questions and practical exercises.
Part 2) Live Sessions (19.5 hours): The live classes are spread over three days and consist of the following:
- Summary and Recap: Barbara, the instructor, will provide a brief summary and recap of each topic covered in the pre-recorded lectures. Participants will have the opportunity to ask questions related to the specific topic being discussed.
- Guided Practical Work: Participants will be given dedicated time to work on guided practical exercises, either individually or in breakout rooms. These exercises involve using Stata or R to apply each evaluation method to real data.
- Live Demonstration: Barbara will conduct a live demonstration of the practical exercise using Stata. She will guide participants through the process, explaining each step and discussing the key aspects of the exercise.
Breakdown of Live Sessions:
- Day 1: Evaluation Problem; Naive Estimator; Randomised Experiments; Instrumental Variables 1/2
- Day 2: Instrumental Variables 2/2; RDD-sharp; RDD-fuzzy; Regression Methods
- Day 3: Matching Methods; Before-After; Difference-in-Differences; Synthetic Control Method
Indicative Class Timings:
- 10:00 – Class starts
- 11:30 – 15-minute break
- 13:15 – 40-minute break
- 15:00 – 15-minute break
- 16:30 – Class concludes
The live classes will be recorded (barring technical glitches) and remain available for three weeks after the end of the course.
The course structure ensures a balanced approach between theoretical knowledge gained from pre-recorded lectures and practical implementation during live sessions, promoting an interactive and hands-on learning experience.
IMPORTANT NOTE:
- Please make sure you have access to Zoom (via the app or simply through a web browser), as the live sessions will be conducted on this platform.
- To actively engage in the practical exercises, both during the live sessions and for self-guided practice later on, it is necessary to have either Stata or R software.
COURSE CONTENT
This comprehensive course focuses on the application of econometric and statistical tools to estimate the causal impact of various treatments on one or more outcomes of interest. The treatments can range from labour-market policy interventions, health reforms and educational programmes to the returns to education or the effects of smoking on one’s own and one’s children’s health.
To conduct a successful evaluation, it is not enough to draw on a wide range of approaches and implement them impeccably. What is needed first and foremost is a sound study design. For this reason, design elements, issues and principles are emphasised, exemplified and discussed throughout the learning journey.
The course begins by carefully setting up the Evaluation Framework and highlighting the “Evaluation Problem” and the challenges it presents. After providing a comprehensive understanding of why simple methods like the “naive estimator” (for cross-sectional designs) or the “before-after” approach (for longitudinal designs) are likely to fall short in adequately addressing these problems, the course extensively explores the main empirical methods for estimating causal effects:
- Randomised experiments
- Natural experiments or instrumental variables
- Regression Discontinuity Design (sharp and fuzzy)
- Regression analysis
- Matching methods
- Difference-in-differences
- Synthetic control method
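To make the “Evaluation Problem” and the shortfall of the naive estimator mentioned above concrete, here is a minimal sketch in standard potential-outcomes notation (illustrative only; the notation used in the lectures may differ). With a binary treatment D and potential outcomes Y(1) and Y(0), only one of the two is ever observed for each unit, so the average treatment effect on the treated,

    \mathrm{ATT} = E[Y(1) - Y(0) \mid D = 1],

cannot be computed directly. The naive estimator simply contrasts the observed means of the treated and the untreated, and decomposes as

    E[Y \mid D = 1] - E[Y \mid D = 0] = \mathrm{ATT} + \big( E[Y(0) \mid D = 1] - E[Y(0) \mid D = 0] \big),

where the second term is selection bias: it vanishes under random assignment, and each of the methods listed above can be read as a different strategy, resting on different assumptions, for eliminating or controlling it.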
Each of these approaches is thoroughly explained at different levels, following a consistent structure:
- provide the basic intuition behind each method by drawing from real-world applications in the literature,
- discuss the assumptions required for the validity of each approach,
- highlight the specific causal question each method answers,
- formally demonstrate how the underlying assumptions enable the identification of the parameter of interest (an instrumental-variables example is sketched after this list),
- discuss in-depth the strengths and weaknesses associated with each method,
- provide a practical “take-away” checklist, and
- facilitate hands-on implementation of each approach on real data during practical sessions.
The lecture component of the course covers all points except the last one, which is addressed in the live classes.
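As a flavour of the formal identification step, here is a sketch for the instrumental-variables case (illustrative only, not reproduced from the course materials). With a binary instrument Z that is relevant, is independent of potential outcomes and affects the outcome only through treatment, and under monotonicity (the instrument moves treatment in one direction only), the Wald ratio identifies the local average treatment effect for compliers,

    \mathrm{LATE} = \frac{E[Y \mid Z = 1] - E[Y \mid Z = 0]}{E[D \mid Z = 1] - E[D \mid Z = 0]}.

The lectures develop the analogous identification argument for each of the other methods.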
LEARNING OUTCOMES
Upon completion of the course, participants will have acquired practical skills and in-depth knowledge in the field of quantitative evaluation research. The key learning outcomes include:
- Framing micro-econometric problems as causal questions: Participants will develop the ability to skillfully position a wide range of micro-econometric problems within the evaluation framework, while placing utmost importance on design considerations.
- Choice of evaluation methods: Participants will be able to select and employ appropriate evaluation strategies for estimating causal effects in diverse contexts. They will gain the expertise to identify the most suitable approach for specific research questions, institutional contexts and data characteristics.
- Implementation of evaluation methods: Participants will acquire the competence and hands-on experience to utilize simple statistical packages, such as Stata or R, to apply various evaluation methods to real-world data.
- Interpretation of evaluation work: Participants will become discerning users of econometric output, able to interpret and analyze the results of applied work in the evaluation literature, while critically assessing both the strengths and limitations of the findings.
- Further learning: Participants will be equipped with the frameworks, terminology and tools to access and navigate the evaluation literature independently. They will thus be able to further deepen their knowledge and understanding of evaluation research through self-directed learning.
LEVEL OF KNOWLEDGE REQUIRED
This is an intermediate-level course on quantitative empirical methods for policy evaluation. As such, familiarity with basic formal notation, basic statistical concepts (e.g. conditional expectation, significance testing) and basic econometric tools (like OLS and probit regression) is recommended. However, it is important to note that all formal parts of the course are optional and supplemented with intuitive, formulae-free “take-away points”.
While the course offers a comprehensive overview and in-depth discussion of various evaluation methods, it is not an advanced post-graduate level course. Most emphasis is placed on understanding the underlying issues, selecting appropriate methods for specific contexts, and implementing evaluation methods in practice.
The practical, hands-on portion of the course requires the use of statistical software. Stata is utilized to illustrate some concepts (notably related to the mechanics of matching methods) in the lectures, as well as for going through and discussing each exercise during the classes. At the end of the course participants will also receive a PDF document containing the full solutions to the exercises; these solutions will be based on Stata logged output and will include the trainer’s comments and all embedded graphs. However, the aim is to make this hands-on part widely accessible to participants with different backgrounds and interests:
- For those new to or unfamiliar with Stata, a short handout and a video tutorial cover everything needed to complete the guided practicals.
- The guided practicals include syntax tips for both Stata and R.
- Complete code for solving all the practical exercises is provided in both Stata and R. The R code additionally includes translated versions of Barbara’s pstest and film Stata ados. (Note that there may be slight differences in functionality between the two software packages.)
- Furthermore, these solutions, in the form of Stata do-files and R scripts, are shared in advance for participants who may need guidance or who prefer simply to run the full solution file and focus on interpreting the resulting output.
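To give a flavour of the mechanics of matching methods referred to above, here is a minimal, self-contained base-R sketch (simulated data; not taken from the course scripts or do-files) of one-to-one nearest-neighbour matching on an estimated propensity score:

    # Illustrative only: simulate data where treatment take-up depends on x
    set.seed(1)
    n  <- 1000
    x  <- rnorm(n)
    d  <- rbinom(n, 1, plogis(0.5 * x))   # binary treatment indicator
    y  <- 1 + 2 * d + x + rnorm(n)        # true effect of d is 2

    # Estimate the propensity score with a logit model
    ps <- glm(d ~ x, family = binomial)$fitted.values

    # One-to-one nearest-neighbour matching on the propensity score, with
    # replacement: for each treated unit, find the closest control unit
    treated  <- which(d == 1)
    controls <- which(d == 0)
    matches  <- sapply(treated, function(i)
      controls[which.min(abs(ps[controls] - ps[i]))])

    # Matching estimate of the average treatment effect on the treated (ATT)
    mean(y[treated] - y[matches])

The course exercises rely on dedicated routines (such as pstest for checking covariate balance) rather than a hand-rolled loop; the sketch above is only meant to show what nearest-neighbour matching does under the hood.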
It’s important to note that while participants from a range of backgrounds have enjoyed the course, the examples and discussions primarily stem from a (labour) economics perspective.
BACKGROUND READING
No preparatory reading is required for this course, as it is designed to be accessible to participants without prior knowledge of evaluation methods and issues.
However, some past participants have found it beneficial to scan the following paper before the course, as it provides an overview of the key issues, notation and general material covered:
Blundell, R., Dearden, L. and Sianesi, B. (2005), “Evaluating the Effect of Education: Models, Methods and Results from the National Child Development Survey”, Journal of the Royal Statistical Society, Series A, 168, 473-512 (also available as an IFS Working Paper).
Alternatively, a gentle introduction to evaluation issues and methods is offered by Ravallion, M. (1999), “The Mystery of the Vanishing Benefits: Ms. Speedy Analyst’s Introduction to Evaluation”, World Bank Policy Research Working Paper No. 2153.
PAST PARTICIPANTS’ FEEDBACK
Please click here for latest feedback
Public sector (January 2021): Brilliant course. Lectures fantastically presented and explained. Barbara is clear in her explanations and explains things in multiple ways, as well as explaining the intuition well. Online pre-recorded lectures done very well – use of slides and annotations done well. Recording also done really well. The pre-recordings gave me time to really think about things deeply before the live sessions. I really like the split of pre-recorded sessions and live sessions. Overall fantastic! Barbara is super engaging and explains this so well. I thought there was a great level of detail while still being very accessible since Barbara starts with the basics and explains things informally as well as formally. Really good use of slides/annotations etc when presenting virtually despite this being the first time the course was run online! It doesn’t seem that way! Thanks a lot
Private sector or other (January 2021): The amount of material covered in this course is very impressive, the lectures are clear and well explained and making them compulsory prior to the live course is very important.
HE lecturer or other academic (January 2021): Barbara’s excellent – very patient with questions, very professional, and gives very helpful advice.
HE lecturer or other academic (January 2021): Just to say thank you Barbara, this was a great course and there was something in it for all levels of learner.
HE lecturer or other academic (March 2021): This is an excellent course! Extremely helpful for my own research. Watching the recorded lectures is particularly useful for me to have a good understanding of the lectures, and then the live sessions are ideal for getting my questions answered.
HE student (March 2021): I have no comments other than thank you for the very clear explanations and plenty of examples! This was a great course for participants of all levels!
HE student (March 2021): This is a great course – it was recommended to me by several of my colleagues and I’ll keep recommending it to people in turn.
HE lecturer/academic (May 2021): Barbara is an excellent teacher, explaining everything well with authentic examples. An excellent course! Many approaches, done thoroughly.
HE student (May 2021): Very interactive. Barbara did a great job with the format and content of the sessions. The content is intensive but she simplified the material in an easy-to-follow and understandable manner. Also the format to do a topic and then the practical exercise was good. I also like that the lecture slides and videos were shared in advance for the self-paced learning before the live session. It was also great to have flexibility whether to join a breakout room or not. Excellent course and excellently delivered by Barbara.
For more information please contact cemmapevent@ifs.org