K. Muralidharan & V. Sundararaman

NBER Working Paper No. 15323 (2009) 

Principal Research Question and Key Result Does performance-based pay for teachers improve student performance? In an experiment in India, students whose teachers were subject to performance incentives scored between 0.16 and 0.28 standard deviations higher than students in comparison schools.


Theory It is not clear that monetary incentives will always align the preferences of the principal and the agent; in some cases they may crowd out intrinsic motivation, leading to inferior outcomes. The psychological literature indicates that if workers perceive incentives as a means of exercising control, they will tend to reduce motivation, whereas if incentives are seen as reinforcing the norms of professional behaviour, they can enhance intrinsic motivation.

Additionally, whether incentives operate at the class or the school level matters. When rewards depend on aggregate school results, teachers have an incentive to free ride; this is not the case when incentives operate at the individual teacher level. The problem may be reduced in small schools, where teachers can monitor each other's effort at relatively low cost.


Motivation There are broadly two lines of thought on how to improve school quality. The first argues that increased inputs are needed: textbooks, extra teachers, better facilities, and so on. The other is to implement incentive-based policies that improve the use of existing infrastructure, and perhaps improve selection into the teaching profession.


Experiment/Data The experiment took place in Andhra Pradesh, which has been part of the Education for All campaign in India but sees teacher absence rates of around 25% and low student outcomes. There were 100 control schools, 100 group-bonus schools (all teachers received the same bonus, based on the average performance of the school), and 100 individual-bonus schools (the bonus was based on the performance of a particular teacher's students). Basing the bonus on average scores ensures that teachers do not simply focus on pupils near a threshold while neglecting less able children; no student is likely to be wholly excluded when averages are the target. Additionally, there was no incentive to game the test pool: children who took the baseline test but not the end-of-year test were assigned a score of 0, which would lower the class average.
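The logic of the averaging rule can be made concrete with a small sketch. This is only an illustration of the mechanism described above: the bonus rate and threshold used here are placeholder values, not the paper's actual parameters.

```python
# Illustrative sketch of the averaging rule: the bonus depends on the class's
# *average* gain, and a student who took the baseline but missed the
# end-of-year test counts as 0, so dropping weak students lowers the average.

def class_average_gain(baseline_scores, endline_scores):
    """Average score gain over all baseline takers; a missing endline score counts as 0."""
    gains = []
    for student, base in baseline_scores.items():
        end = endline_scores.get(student, 0.0)  # absent at endline -> scored 0
        gains.append(end - base)
    return sum(gains) / len(gains)

def teacher_bonus(avg_gain, threshold=5.0, rate_per_point=500.0):
    """Bonus paid only on average gains above a threshold (placeholder parameters)."""
    return max(0.0, avg_gain - threshold) * rate_per_point

baseline = {"A": 40.0, "B": 55.0, "C": 30.0}
endline = {"A": 52.0, "B": 63.0}               # student C missed the endline test
gain = class_average_gain(baseline, endline)   # (12 + 8 - 30) / 3 = -10/3
```

Here student C's absence drags the class average below zero, so excluding a weak student from testing earns no bonus; with C tested and improving, the average (and hence the bonus) would rise.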

A test was administered at the start of the programme/school year covering material from the previous school year. At the end of the programme a similar test was given with similar content, followed by a further test covering the material of the year just completed. The same procedure was repeated at the end of the second year. The overlap between exams reduces day-specific measurement error. The tests included both mechanical and conceptual questions.



T_ijkm(Y_n) = α + β·T_ijkm(Y_0) + γ·(Incentives) + δ·(Z_m) + ε_k + ε_jk + ε_ijk

T is the test score, where i, j, k, and m index student, grade, school, and mandal (region) respectively. Y_0 denotes the baseline tests and Y_n the end-of-year tests. The baseline score is included to improve efficiency, since test scores are strongly correlated across years. Z_m is a vector of mandal dummies (fixed effects), and standard errors are clustered at the school level. γ, the coefficient on the incentive dummy, is the coefficient of interest.
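The estimating equation can be sketched on simulated data. The numbers below (school counts, effect sizes, noise) are made up for illustration, not taken from the paper; the point is the structure: regress end-of-year scores on baseline scores, an incentive dummy, and mandal fixed effects, then cluster standard errors at the school level.

```python
# Minimal numpy-only sketch of the estimating equation on simulated data.
import numpy as np

rng = np.random.default_rng(0)
n_schools, pupils = 60, 20
school = np.repeat(np.arange(n_schools), pupils)   # school id per pupil
mandal = school % 5                                # 5 mandals (regions), illustrative
treat = rng.random(n_schools) < 0.5                # school-level randomization
incentive = treat[school].astype(float)

baseline = rng.normal(0.0, 1.0, n_schools * pupils)
school_shock = rng.normal(0.0, 0.3, n_schools)[school]   # epsilon_k analogue
y = 0.5 * baseline + 0.2 * incentive + school_shock + rng.normal(0.0, 1.0, len(school))

# Design matrix: intercept, baseline score, incentive dummy, mandal dummies (base = mandal 0)
X = np.column_stack([np.ones_like(y), baseline, incentive] +
                    [(mandal == m).astype(float) for m in range(1, 5)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Cluster-robust (school-level) covariance, sandwich form without
# finite-sample corrections: (X'X)^-1 (sum_g X_g' u_g u_g' X_g) (X'X)^-1
XtX_inv = np.linalg.inv(X.T @ X)
meat = np.zeros((X.shape[1], X.shape[1]))
for g in range(n_schools):
    idx = school == g
    v = X[idx].T @ resid[idx]
    meat += np.outer(v, v)
cov = XtX_inv @ meat @ XtX_inv

gamma, se_gamma = beta[2], np.sqrt(cov[2, 2])  # coefficient on the incentive dummy
```

Because treatment varies only at the school level, ignoring the clustering would badly understate the standard error on γ; that is why the paper clusters at the school level.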


Results Students in incentive schools scored 0.15 standard deviations higher than comparison schools at the end of the first year and 0.22 at the end of the second. These figures average across maths and language (disaggregated, the effect for maths was larger). NB: whilst the year 1 vs. year 0 and year 2 vs. year 0 comparisons are valid, the comparison of year 2 to year 1 (column three of Table II) is technically not an experimental estimate, as the year 1 results are themselves post-treatment outcomes.

They examine heterogeneous treatment effects by interacting relevant variables with the INCENTIVE dummy, and find that none of them (number of students, school proximity, school infrastructure, parental literacy, caste, sex) sees a differential effect from the programme, indicating that the benefits are broad-based and not conditional on a set of predetermined characteristics. The only interaction with a small effect is household affluence. That said, the variance of test scores within individual schools did rise, which may indicate that teachers responded differently to the incentives, even though there were apparently no barriers preventing any type of child or school from benefiting (no heterogeneous effects).

When they include teacher characteristics such as education and training, they see no significant effect; but when these measures are interacted with the INCENTIVES dummy, the interactions are positive and significant, indicating that high-quality teachers alone may not be sufficient if they are not incentivized to use their skills to maximum effect.

Teachers who were paid more responded less, presumably because they were more experienced (and so less amenable to change) and because the bonus represented a smaller fraction of their total income.

Happily, the results were similar for both the conceptual and the mechanical questions, indicating that real learning took place rather than mere rote reproduction. Additionally, students in incentive schools performed better in non-incentive subjects such as science. NB: it is possible that teachers diverted energy from teaching non-incentive subjects to teaching incentive subjects, for obvious reasons. This result does not disprove that, but it does show that, in the context studied, improved teaching in certain subjects can spill over into others.

Both group and individual incentives were effective. However, school size was typically between 3 and 5 teachers, probably too small to separate the two effects; group incentives may not work in larger schools.

Interestingly, there was no increase in teacher attendance. In interviews after the experiment, teachers said they gave extra classes and were more likely to have set and graded homework.

  • They tested the equality of observable characteristics across the control/treatment groups and could not reject the null of equality, indicating that randomization was successful. Additionally, all schools (including controls) received the same information and monitoring, to ensure that differences between treatments were not merely due to the Hawthorne effect.
  • There was no significant difference in attrition, and the average teacher turnover was the same across schools indicating that there was no sorting of teachers into the incentive schools.
  • Controlling for school and household characteristics does not change the estimated treatment coefficient, consistent with successful randomization.
  • A parallel study provided schools with money to purchase extra inputs, and the incentive levels were set such that they came to a similar amount of funds as the input schools. The input schools did see a positive effect, but to a much lesser degree. Additionally, the incentive programme actually ended up costing much less.


Interpretation Programme design is extremely important. In particular, how teachers feel about the incentives may affect performance, and school size may determine whether group incentives deliver benefits, given teachers' ability to free ride on the backs of their colleagues.

Given that the study was compared with an input study in the same region and found larger improvements, it would seem that funding should be allocated to incentive schemes rather than input schemes. Moreover, rather than raising pay by 3% each year, that 3% could be allocated to the bonus scheme, so the programme would cost virtually nothing to run (other than administering the tests). However, a mix of policies is probably a good idea, especially since the incentive scheme did not improve absence rates; other literature has shown that improving infrastructure can make teachers more likely to be present, so this could be one role for input schemes.


