Deaths in Iraq

Given its potential to attract hordes of nutcases to my comments section, I have been steadfastly ignoring the Lancet’s report on the increased mortality rate in Iraq since the invasion. Since Peter Cairns has asked me about it in the comments section below, and I have been tagged by Matthew Turner, I suppose I must reluctantly enter the fray. Thanks Peter & Matthew ;)

The study in Iraq was, for all intents and purposes, an opinion poll. The researchers from Johns Hopkins University sent interviewers to a sample of houses in Iraq and asked how many people in each household had died since the invasion (they also asked how many had died in a period before the invasion, to act as a baseline of “normal” mortality). This allowed them to estimate what proportion of people in Iraq had died over the period and, given the country’s total population, to conclude that the “excess” deaths amounted to about 655,000 people.
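As a very rough sketch of that extrapolation – using round, illustrative numbers rather than the study’s exact, adjusted inputs – the arithmetic looks something like this:

```python
# Back-of-envelope sketch of the extrapolation. The figures below are
# illustrative approximations, not the study's exact adjusted inputs.
pre_invasion_rate  = 5.5          # deaths per 1,000 people per year (baseline)
post_invasion_rate = 13.2         # deaths per 1,000 people per year (post-invasion)
population         = 26_000_000   # rough population covered by the estimate
years              = 3.3          # rough length of the post-invasion period surveyed

excess_per_year = (post_invasion_rate - pre_invasion_rate) / 1000 * population
print(f"roughly {excess_per_year * years:,.0f} excess deaths")
# prints roughly 660,660 -- the same ballpark as the study's 655,000
```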

Obviously such a suggestion is politically charged, especially since it conflicts strongly with the Iraqi government’s official figures and with figures from third parties like Iraq Body Count. In my own view this doesn’t matter – the Iraqi government has good reason to downplay casualties, and is governing a country in a state approaching civil war where much of the government infrastructure has broken down. Why Iraq Body Count was ever viewed as anything other than a minimum number of casualties is beyond me – I have not the slightest doubt that there are vast numbers of deaths in Iraq that never get reported in two separate English-language media sources, which is what Iraq Body Count requires before it counts a death. (Something I’ve often pondered doing is taking the UK murder statistics and seeing if I can track down newspaper reports for every single murder in a given year. My guess is that every killing in a media-heavy, highly developed state like the UK is reported at least in the local press, but it would be interesting to actually check.) I wouldn’t regard official figures from a country in a state of chaos, or a count of deaths mentioned in the media, as being in any sense reliable enough to gauge the accuracy of a study that at least attempts to be a rigorous scientific inquiry.

So, to the study itself. I am not going to explain the basic logic of extrapolating from a small sample to a larger population here, though some of the dismissals have been based on the fallacy that you simply cannot extrapolate from 629 recorded deaths to 655,000. The method of cluster sampling used is well established and has been used successfully in the past. In short, in the same way that we can accurately predict an election result to within a few percentage points by interviewing just 1,000 people, we can estimate how many people have died in the whole country by seeing how many people died in a smaller sample of residences and scaling up. If polls work – and they do – then so should this, despite the relatively small sample. Unless something has gone wrong with the methodology of the survey, the chance that the researchers just happened to knock on doors with a disproportionate number of deaths that do not reflect the country as a whole, such that there have actually been fewer than 392,979 excess deaths, is only 2.5% (the same as the chance that there have actually been more than 942,636 excess deaths).
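For anyone who wants to see the principle in action, here is a toy simulation – with made-up numbers and a much cruder design than the actual study – of surveying a few dozen clusters of households and seeing how the resulting estimates spread around the true death rate:

```python
# Toy simulation of the cluster-sampling idea. Each cluster has its own
# underlying death rate (neighbourhoods differ a lot in how violent they
# are); we survey a limited number of clusters and scale up. Illustrative
# only -- the real study's design and variance estimation are more involved.
import random

random.seed(1)
n_clusters = 47            # roughly the number of clusters in the study
people_per_cluster = 280   # roughly 40 households of ~7 people each

def one_survey():
    deaths = 0
    for _ in range(n_clusters):
        cluster_rate = random.uniform(0.002, 0.030)   # clusters vary widely
        deaths += sum(random.random() < cluster_rate
                      for _ in range(people_per_cluster))
    return deaths / (n_clusters * people_per_cluster)

estimates = sorted(one_survey() for _ in range(500))
print("middle 95% of estimates:", round(estimates[12], 4), "to", round(estimates[487], 4))
# A spread of estimates like this is what the study's 95% confidence
# interval (392,979 to 942,636 excess deaths) is quantifying.
```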

So, what could have gone wrong? The more excitable fringes of the US blogosphere have come out with some interesting stuff. Let’s look at criticisms that don’t hold water first.

Firstly, the response rate is unbelievably high. The report suggests that over 98% of the people contacted agreed to be interviewed. For anyone involved in market research in this country the figure just sounds stupid – phone polls here tend to get a response rate of something like 1 in 6. However, the truth is that, incredibly, response rates this high are the norm in Iraq. Earlier this year Johnny Heald of ORB gave a paper at the ESOMAR conference about his company’s experience of polling in Iraq – they’ve done over 150 polls since the invasion, and get response rates in the region of 95%. In November 2003 they did a poll that got a response rate of 100%. That isn’t rounding up: they contacted 1,067 people, and 1,067 agreed to be interviewed.

Secondly, people have been understandably confused by the mention of death certificates. Wherever possible, interviewers asked to see the death certificate of anyone reported as having died during the study period; where they asked (which was not always possible), a certificate was produced in 92% of cases – meaning certificates were actually seen for something over 80% of all the deaths reported. This presents an apparent discrepancy – if over 80% of the deaths had been officially recorded, how come official Iraqi estimates of the dead were so low? The explanation given by the report – which seems perfectly reasonable – is that hospitals have continued to issue death certificates, but the system of collating the figures centrally has largely broken down. In other words, a doctor in Iraq may still be handing out paper certificates, but the figures are not necessarily passed on to or registered with any higher authority.

Thirdly, some people have questioned whether Iraq’s pre-invasion mortality rate as determined by the study – 5.5 per 1,000 per year – is unfeasibly low. It compares to a mortality rate of 10.1 for the European Union, a group of far more developed countries with better nutrition and health care. If Iraq’s pre-invasion mortality figure were artificially low, it would wrongly inflate the number of excess deaths. However, the difference is actually because Iraq has a far younger population than the EU. Leaving aside countries in Southern Africa where AIDS is endemic, developed countries tend to have higher crude mortality rates because they have proportionally more elderly people, and an old person in a “safe” country is still more likely to die in any given year than a young, fit person in an “unsafe” one. A figure of 5.5 is perfectly reasonable when compared to the mortality rates of similar countries like Egypt (5.2), Iran (5.6), Tunisia (5.1), Syria (4.8), Qatar (4.7) and Bahrain (4.1).
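A quick worked example – with entirely made-up age bands and rates – shows how a younger population can have a lower crude mortality rate even if mortality at every age is worse:

```python
# Illustrative, made-up figures: age-specific death rates per 1,000 per year.
rates_developed = {"under_40": 1.0, "over_40": 18.0}
rates_younger   = {"under_40": 2.0, "over_40": 22.0}   # worse at every age

share_developed = {"under_40": 0.50, "over_40": 0.50}   # older population
share_younger   = {"under_40": 0.80, "over_40": 0.20}   # much younger population

def crude_rate(rates, shares):
    # Crude mortality rate = age-specific rates weighted by population share
    return sum(rates[g] * shares[g] for g in rates)

print(crude_rate(rates_developed, share_developed))   # 9.5 per 1,000
print(crude_rate(rates_younger, share_younger))       # 6.0 per 1,000 -- lower overall
```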

Moving on, some people have raised more substantial concerns. Firstly, two of Iraq’s governorates were not sampled because of errors; these areas are in the extreme north and south of the country, away from the Sunni-dominated centre where the violence has been worst. It seems reasonable to assume that these two areas would have reported a low casualty rate, and that the remaining figures are therefore artificially high. That may be so, but they contain only 5% of the population, so even if they had a very low casualty rate indeed the effect would be very small. The headline figure of 655,000 excess deaths was in any case based on the population excluding those areas, so it still stands.

Secondly, the populations of the governorates were estimated using 2004 figures; if there have been substantial population movements since then, that could skew the figures. Again, this is a legitimate concern, but population movements would have to be very large to make a truly significant difference.

Thirdly, several people have pondered the “word of mouth” effect. The researchers state in their report that, having explained their good intentions to the first house in a cluster, word of mouth travelled ahead of them and made it easier to persuade the rest of the cluster to take part. Some people have, quite reasonably, asked whether this could skew the result – could people with deaths in the family have become more or less likely to take part in the survey? In theory yes, they could, but given the response rate of 98% there is very little room for it to have made a difference. If it made people with deaths in the family more likely to take part, they were 98% likely to have done so anyway; if it made them less likely to take part, it obviously didn’t have much effect.
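One way of putting a rough bound on how little room a 98% response rate leaves – using an illustrative death rate, not a figure from the study – is to ask how far the estimate could move even if the missing 2% were wildly unrepresentative:

```python
# Crude bound on non-response bias, with illustrative numbers.
observed_rate = 13.0      # deaths per 1,000 among the 98% who responded (illustrative)
response_rate = 0.98

# Extreme cases: the missing 2% had no deaths at all, or three times the rate.
low  = observed_rate * response_rate + 0 * (1 - response_rate)
high = observed_rate * response_rate + 3 * observed_rate * (1 - response_rate)
print(round(low, 2), "to", round(high, 2))   # 12.74 to 13.52 -- a shift of only a few percent
```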

Fourthly, interviewers had some leeway to change the area they were interviewing if the designated area was too dangerous. This could introduce bias, but it seems more likely that it would lead to an underestimate of the number of deaths, as interviewers avoided the sort of areas where lots of people get killed. Of course, interviewers could also avoid a safer area because they had to travel through a dangerous one to reach it, but it still seems unlikely that a systematic bias of this sort would, overall, push the number of deaths upwards.

Finally, there have been questions over whether the sampling technique was biased towards urban areas. This is the most substantial problem in my view. The location of clusters was determined as follows: first the governorate was chosen, then a main road in it was randomly selected, then a road leading off that main road was randomly selected, and then a house on that road. It seems to me that this approach should skew the location of clusters towards urban areas, and make it less likely that rural areas and areas with informal housing like refugee camps would be selected. If violence is concentrated in urban areas, this could skew the sample and give an artificially high number of casualties.
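A toy illustration of the concern – with an entirely made-up layout and hypothetical area names – is that any area which does not sit on a road leading off a sampled main road has no chance of being picked at all:

```python
# Entirely made-up example of how "main road, then side road" selection can
# exclude some areas completely: households in areas not reachable from a
# main road (remote villages, informal settlements) can never be chosen.
import random

random.seed(0)
# (area name, reachable from a main road?, households)
areas = [("central district", True, 400),
         ("suburb",           True, 300),
         ("remote village",   False, 150),
         ("informal camp",    False, 150)]

reachable = [a for a in areas if a[1]]
picks = [random.choice(reachable)[0] for _ in range(10_000)]
for name, ok, households in areas:
    share = picks.count(name) / len(picks)
    print(f"{name}: selected {share:.0%} of the time")
# The two unreachable areas are selected 0% of the time, however many
# households they contain.
```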

Overall, the study seems sound. There are legitimate questions about the effect of the two missing governorates and of any large population movements, but at the end of the day these would have quite minor effects on the total: they are not suddenly going to bring the figures into line with the Iraqi government’s figures or the Iraq Body Count figures. The possible effect of an urban bias is more worrying, since it could potentially skew the figures upwards. That said, 77% of Iraq’s population live in urban areas, so even if there is a systematic bias here, in a worst-case scenario where non-urban areas were missed out entirely and the 23% of the country living there actually had a much lower mortality rate, it is not going to make a drastic difference. For example, the report found a post-invasion death rate of 13.2 per 1,000: if that actually applied only to urban areas, and the mortality rate in non-urban areas was still 5.5 (the pre-invasion figure), the overall rate would still be 11.4, which still equates to hundreds of thousands of excess deaths.
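The worst-case arithmetic in that last sentence is easy to check:

```python
# Worst-case urban-bias check: weight the post-invasion rate by the urban
# share and the pre-invasion rate by the non-urban share.
urban_share, rural_share = 0.77, 0.23
urban_rate, rural_rate = 13.2, 5.5        # deaths per 1,000 per year

overall = urban_share * urban_rate + rural_share * rural_rate
print(round(overall, 1))   # 11.4 per 1,000 -- still roughly double the pre-invasion rate
```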

The study is based on a survey carried out in a country in a state of near civil war and without accurate population estimates. It is not a perfect situation to be working in, and obviously there are going to be question marks here and there – surveys in developed countries aren’t perfect, let alone in war zones. The bottom line, though, is that the study suggests the increase in deaths since the invasion is far higher than other sources’ estimates would indicate.

