Training Course

Remote Policy Evaluation Methods Course

Speaker

Barbara Sianesi

Date & Time

From: 30 January 2023
Until: 1 February 2023

Type

Training Course

Venue

Online only

Prices

HE Delegates: £504
Charity or Government: £924
Other Delegates: £1994.40

Please note the dates for this course:

  • January 2 – 29, 2023 for self-paced study
  • January 30, 31 and February 1, 2023 for the 3 live days
  • January 2 – February 22, 2023 for online access to the video materials

Prices include VAT.

COURSE OVERVIEW

The course covers research design principles and all main quantitative evaluation methods: randomised experiments, instrumental variables, sharp and fuzzy regression discontinuity designs, regression methods, matching methods and longitudinal methods (before-after, difference-in-differences and synthetic controls).

The lectures present the material at different levels to cater for different backgrounds (e.g. intuition, optional formal derivations, in-depth discussion, examples), while the live classes, after a quick summary of each topic, are devoted to answering questions, as well as to implementing each evaluation method hands-on on real data in guided practicals.

FORMAT OF THE COURSE

The Remote PEM Course consists of a mix of self-paced study and live online meetings:

PART 1) 21.5 hours of pre-recorded lectures. Participants will receive access to the lecture videos for the four weeks prior to the course, to go through them at their own pace. Note that while the videos remain accessible for a further three weeks from the start of the live classes, it is highly recommended that participants finish watching the lectures before the live meetings, which are devoted to questions and hands-on practical work.

PART 2) 16.5 hours of live sessions, split over 3 days and consisting of:

  • A summary/recap of each topic, taking questions relating to that topic
  • Time for participants to work on the corresponding guided practical on their own or in breakout rooms
  • Barbara going through and discussing that practical live using Stata

A rough breakdown of the live sessions:

Time   Day 1                     Day 2                           Day 3
10.00  Evaluation Problem;       Instrumental Variables cont.;   Matching Methods;
       Naive Estimator           RDD – sharp                     Before-After
11.30  Break                     Break                           Break
11.45  Randomised Experiments    RDD – fuzzy                     Difference-in-Differences
13.15  Break                     Break                           Break
13.45  Instrumental Variables    Regression Methods              Synthetic Control Method
15.30  End of day                End of day                      End of day

PLEASE NOTE

  • To participate in the remote course you will need Zoom.
  • To be able to work on the practical part on your own (during the live sessions and/or later in your own time) you will need Stata or R.

COURSE CONTENT

This course deals with the econometric and statistical tools that have been developed to estimate the causal impact on one or more outcomes of interest of any generic ‘treatment’ – from policy interventions in the labour market, health reforms and educational programs, to the returns to education, or the impact of smoking on own and children’s health.

For a successful evaluation it is however not enough to draw on a wide range of approaches and to implement them impeccably. What is first and foremost needed is a sound study design. For this reason, design elements, issues and principles are emphasised, exemplified and discussed throughout the course.

After highlighting the ‘Evaluation Problem’ and the challenges it poses, we first discuss how traditional statistical methods and in particular the ‘naive estimator’ may fail to deal with it appropriately, before considering in detail the main empirical methods for estimating causal effects:

  • Randomised experiments
  • Natural experiments or instrumental variables
  • Regression Discontinuity Design (sharp and fuzzy)
  • Regression analysis
  • Matching methods
  • Before-after
  • Difference-in-differences
  • Synthetic control method

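For illustration only (a minimal sketch in R, not part of the course materials or its guided practicals), the lines below simulate data in which an unobserved-to-the-analyst characteristic drives both treatment take-up and the outcome. The naive treated-versus-control comparison then overstates the true effect, while a regression that conditions on that characteristic recovers it; all variable names and numbers are hypothetical.

  # Illustrative sketch of the 'Evaluation Problem' and the naive estimator
  # (simulated, hypothetical data; the true treatment effect is set to 2).
  set.seed(1)
  n       <- 10000
  ability <- rnorm(n)                            # confounder: affects take-up and outcome
  treat   <- rbinom(n, 1, plogis(ability))       # take-up rises with 'ability'
  outcome <- 2 * treat + 3 * ability + rnorm(n)  # true treatment effect = 2

  naive    <- mean(outcome[treat == 1]) - mean(outcome[treat == 0])
  adjusted <- unname(coef(lm(outcome ~ treat + ability))["treat"])
  round(c(naive = naive, adjusted = adjusted), 2)
  # The naive difference in means is well above 2; the adjusted estimate is close to 2.
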
Each of these approaches is explained at different levels, following a common structure:

  (a) give the basic intuition, drawing from example applications in the literature,
  (b) discuss the assumptions needed for its validity,
  (c) highlight the question it answers,
  (d) formally show how the assumptions allow identification of the parameter of interest,
  (e) discuss in depth the strengths and weaknesses of the method,
  (f) provide a ‘take-away’ checklist, and
  (g) implement the approach ‘hands-on’ on real data in a practical session.

The lectures cover (a)-(f) and the live classes (g).

LEARNING OUTCOMES

By the end of the course, participants will have a practical understanding of the ideas underpinning the evaluation literature, as well as of the main quantitative evaluation methods. In particular, they will be able to:

  • frame a variety of micro-econometric problems into the evaluation framework, and be aware of the concomitant methodological and modelling issues;
  • be discerning users of econometric output – able to interpret the results of applied work in the evaluation literature and to assess its strengths and limitations;
  • access the evaluation literature to further deepen knowledge on their own;
  • choose the appropriate evaluation method and strategy to estimate causal effects in different contexts; and
  • use simple statistical packages (like Stata or R) to implement the different evaluation methods on real data.

LEVEL OF KNOWLEDGE REQUIRED

This is an intermediate-level course on quantitative empirical methods for policy evaluation. As such, familiarity with basic formal notation, basic statistical concepts (e.g. conditional expectation, significance testing) and basic econometric tools (like OLS and probit regression) is recommended. Note however that all formal parts are optional and followed by intuitive, formulae-free “take-away points”.

On the other hand, while offering an in-depth and thorough overview and discussion of the various evaluation methods, this is not an advanced course at the post-graduate level. Most emphasis is devoted to understanding the issues, to the choice of the most appropriate method for a given context and to the implementation of evaluation methods in practice.

The practical, hands-on part of the course requires the use of statistical software. The course relies mainly on Stata: the guided exercises provide Stata syntax and additional tips, and we will go through and discuss each exercise together using Stata. As with the lectures, however, the aim is to make this hands-on part widely accessible to participants with different backgrounds and interests:

  • For those new to or not familiar with Stata, a short handout and a video tutorial cover everything you need to know about the package in order to do the guided practicals.
  • Commands to solve the practicals are also provided in R, for those who prefer to use R when working on the exercises on their own. (Note that in some cases the R and Stata commands will not do exactly the same thing; what matters is understanding the basic principles underlying the implementation of a given evaluation method and what to look out for in terms of implementation details and output.) A purely illustrative sketch of this kind of command-based work appears after this list.
  • Finally, solutions in the form of Stata ‘do-files’ and R scripts are shared in advance for those who would like to take a peek if stuck at any point, or who prefer to just run the full solution file and focus on interpreting the resulting output.

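For a flavour of this kind of command-based work (a purely illustrative sketch, not one of the course’s actual exercises), a two-period difference-in-differences estimate can be obtained in R with a single interaction regression on simulated data; all names and numbers below are hypothetical.

  # Hypothetical two-period difference-in-differences on simulated data
  # (the true effect of the policy on the treated group in the post period is 1.5).
  set.seed(2)
  n       <- 4000
  treated <- rbinom(n, 1, 0.5)                  # treatment-group indicator
  post    <- rbinom(n, 1, 0.5)                  # post-period indicator
  y       <- 0.5 * treated + 1.0 * post +       # group and time effects
             1.5 * treated * post + rnorm(n)    # effect on the treated in the post period
  did_fit <- lm(y ~ treated * post)
  coef(summary(did_fit))["treated:post", ]      # the DiD estimate, close to 1.5

The equivalent Stata regression uses the same interaction logic; as noted above, the point is the estimating approach rather than any particular syntax.
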
Note also that while participants have often come from a range of backgrounds and have reported enjoying the course, the examples and discussions come from a mostly (labour) economics perspective.

BACKGROUND READING

No preparatory reading is required as the course does not assume prior knowledge of evaluation methods and issues.

Some past participants have however found it useful to scan the following paper in advance of the course to get an idea of the issues, notation and more generally some of the material:

Blundell, R., Dearden, L. and Sianesi, B. (2005), “Evaluating the Effect of Education: Models, Methods and Results from the National Child Development Survey”, Journal of the Royal Statistical Society, Series A, 168, 473-512 (also available as an IFS Working Paper).

Alternatively, a gentle introduction to evaluation issues and methods is offered by Ravallion, M. (1999), “The Mystery of the Vanishing Benefits: Ms. Speedy Analyst’s Introduction to Evaluation”, World Bank Policy Research Working Paper No. 2153.


PAST PARTICIPANTS’ FEEDBACK

Public sector (January 2021): Brilliant course. Lectures fantastically presented and explained. Barbara is clear in her explanations and explains things in multiple ways, as well as explaining the intuition well. Online pre-recorded lectures done very well – use of slides and annotations done well. Recording also done really well. The pre-recordings gave me time to really think about things deeply before the live sessions. I really like the split of pre-recorded sessions and live sessions. Overall fantastic! Barbara is super engaging and explains this so well. I thought there was a great level of detail while still being very accessible since Barbara starts with the basics and explains things informally as well as formally. Really good use of slides/annotations etc when presenting virtually despite this being the first time the course was run online! It doesn’t seem that way! Thanks a lot

Private sector or other (January 2021): The amount of material covered in this course is very impressive, the lectures are clear and well explained and making them compulsory prior to the live course is very important.

HE lecturer or other academic (January 2021): Barbara’s excellent – very patient with questions, very professional, and gives very helpful advice.

HE lecturer or other academic (January 2021): Just to say thank you Barbara, this was a great course and there was something in it for all levels of learner

HE lecturer or other academic (March 2021): This is an excellent course! Extremely helpful for my own research. Watching the recorded lectures is particularly useful for me to have a good understanding of the lectures, and then the live sessions are ideal for getting my questions answered.

HE student (March 2021): I have no comments other than thank you for the very clear explanations and plenty of examples! This was a great course for participants of all levels!

HE student (March 2021): This is a great course – it was recommended to me by several of my colleagues and I’ll keep recommending it to people in turn.

HE lecturer/academic (May 2021): Barbara is an excellent teacher, explaining everything well with authentic examples. An excellent course! Many approaches, done thoroughly.

HE student (May 2021): Very interactive. Barbara did a great job with the format and content of the sessions. The content is intensive but she simplified the material in an easy to follow and understand manner. Also the format to do a topic and then the practical exercise was good. I also like that the lecture slides and videos were shared in advance for the self-paced learning before the live session. It was also great to have flexibility whether to join a breakout room or not. Excellent course and excellently delivered by Barbara.

For more information please contact cemmapevent@ifs.org.uk