Aid work can be a dangerous occupation. Aid workers face a variety of hazards, including intentional violence, traffic accidents, disease and stress. Many humanitarian disasters involve conflict, and most conflicts occur against a backdrop of collapsed states, where treaties and the UN Charter are not observed. Under these conditions, civilians and aid workers are afforded little protection. Some level of exposure to risk is inevitable, and may even be necessary to get the job done. But the imperative that drives most aid workers to help people in need may encourage them to put themselves into excessively dangerous situations.
Mortality rates among aid workers
In an article in the British Medical Journal in 2000, Mani Sheik and colleagues at the Johns Hopkins School of Hygiene and Public Health reported 375 deaths among civilian UN and NGO aid workers and UN peacekeepers in the 14 years from 1985. Their conclusions are presented in the table below.
The number of deaths rose from 1985, and peaked in 1994 at the time of the Rwandan crisis. The high number of intentional deaths compared to road accident fatalities reflects the increasingly violent conditions that aid workers face.
Against this background, this article describes research that looks at the risks aid workers are typically exposed to, and how they perceive them. It is based on interviews with experienced aid workers carried out in June last year.
What concerns aid workers?
Most of the aid workers interviewed were mainly concerned with two matters: the quality of the aid they were able to give, and their security. Their driving motive was to make real contact with people and to help them. Security was a particular problem when it was provided by the military, since this could compromise their impartiality (and make them more of a target). Another problem sometimes encountered was hostility from the beneficiaries themselves. This hostility was particularly difficult for aid workers to deal with because it was due, in part, to their inability to provide beneficiaries with adequate relief supplies.
Among the first people to look into our perceptions of risk were the psychologists Amos Tversky and Daniel Kahneman, in the 1970s. They suggested that, in order to manage information and simplify decision-making, we tend to use three basic rules of thumb when we analyse risk.
Vivid memories and images
Our assessment of risk in a given situation may be greatly affected by personal memories. High-profile, dramatic events that we identify with are more easily available to us, and so we judge the associated risk to be higher. This effect was evident in all the interviews I conducted with aid workers. Events that they or their colleagues had experienced conjured up powerful, easily retrievable images. One interviewee explained how the memory of the death of his brothers had affected his behaviour towards the risk of HIV: 'I know the consequences. I come from a big family. We were 11 of us and three of my brothers died of AIDS.'
A vivid image can also affect the level of perceived risk. One woman explained why she would not wear a red T-shirt when on a field trip: 'I don't wear a red T-shirt if I go into the field because, if I have to run from the car and there's nothing in the field, if I lie down I'm still a target.' She has never been involved in such an incident, but imagines it happening, and so has made contingency plans. Security briefings and training may have contributed to the vividness of such thinking.
One interviewee classed herself as being at low risk of HIV infection: 'With HIV/AIDS I'm not in any high-risk behaviour group.' This stereotyped image of herself leads her to ignore the fact that it is not the risk group she is in, but what she does to avoid HIV, that is important. Another interviewee, working in a refugee camp, saw the refugees as former combatants, and therefore felt that they represented a greater threat. So, even though the refugees were unarmed, they were still seen as more dangerous than ordinary citizens. This may be a sensible precaution, or it may be an over-estimate of the threat they pose.
Anchoring and adjustment
Once we make an initial judgement about a risk, we become anchored to it and may fail to make adequate adjustments later. Anchoring also involves an over-reliance on others' judgements. One interviewee described how they relied on the decisions of a security officer about which areas were safe to visit: 'He makes sure that he knows what is going on, where we can go, what time we can go in and what time we have to get out of that area. So that's been taken care of.' Security officers may be in the best position to judge risks, but becoming too reliant on their judgements may discourage people from questioning their decisions.
Unrealistic optimism and over-confidence
Unrealistic optimism is a common bias: we tend to believe that positive things will happen to us, and negative things to other people. Another bias is over-confidence in our own judgements. One interviewee commented that, 'now that I've lived there for two years, nothing has ever happened on my compound or in my neighbourhood. It's pretty safe'. This judgement may be over-confident and over-optimistic, because the implication is that things only happen in other people's neighbourhoods.
Escalation is when we plough on with a particular course of action despite poor results. We may do this because we feel we must justify our previous decisions and investment. For instance, my research unearthed one incident where aid workers decided to continue with a visit to a dangerous region despite the fact that they could not find the bullet-proof vests and helmets they would normally wear when visiting the area. This may be because they had worked hard to prepare for the visit, and had invested significant time and energy. Further exploration of the incident revealed another reason for not turning back: 'we do recognise that if a person does not feel comfortable in going, that person should not have to be forced into going. Now obviously, it's a bit of a funny judgement to make because you don't want to be seen by the rest of your colleagues as the wimp.' So increased risk-taking may be an attempt to preserve one's reputation, or to save the investment made in building friendships and developing trust and respect. Many aid workers operate in small teams, where strong friendships and rivalries may be formed. This may be the ideal breeding-ground for escalation within the group.
Relying on incomplete data
In making a judgement about risk, an individual may have incomplete data, or may not consider all the data that is available. One interviewee knew only two people who had died of AIDS, and so doubted official mortality figures. This judgement is based on knowledge of only a small number of people; moreover, many more of the interviewee's acquaintances, while not having died of AIDS, may well be HIV-positive. Because of this error, the perceived risk of HIV is reduced.
Becoming inured to risk can sometimes desensitise us to potential dangers. Thus, our perception of risks changes over time as we become more familiar with them. For instance, if aid workers arrive safely at their destination despite passing through a dangerous area a number of times, they may become blasé.
Desensitisation can be used as a defence mechanism against worrying too much. In the interviews, many risks were referred to as 'one of those things', 'something you accept' or 'something you have to live with'; risk becomes 'an integral part of life'. The danger of getting used to risks, and of becoming 'a bit too relaxed about things', was also noted by interviewees. One person commented about being interrogated by the army: 'when it happens several times it becomes routine to you. It's like, I know what is going to happen, but I think it shouldn't be treated like that because at any stage they can decide otherwise. It can get more dangerous.' Another recognised the problem, and sometimes deliberately stopped his team from visiting areas, just to try to keep them aware of the risks.
We can only speculate on the degree of error in these judgements about risks. Biases may cause people to be either over-cautious or reckless. In reality, decisions have to be made, and the question is: how can people make better decisions about the risks they face? Is it possible to arrive at a concept of optimal risk, where people's actions are neither over-cautious nor reckless?
Although assessing risk is a complex and personal matter, it may be possible to improve peoples decision-making skills. One problem is that, in the field, the feedback we receive is usually either wrong, non-existent, delayed or open to interpretation. We need to examine the decision-making process to become aware of the biases and errors we are subject to.
Statistical models have been developed that can be used to analyse risks and predict outcomes. These models out-perform experts in their consistency of results. This is because people are better at collecting information than they are at processing it. Most models have been used in static situations, such as the decision to offer someone a bank loan; it remains to be seen if they can be adapted for field conditions.
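A model of the kind described is easy to sketch. The following is purely illustrative: the risk factors, weights and logistic form are assumptions made for the sake of the example, not a validated field tool.

```python
import math

# Illustrative only: a toy linear scoring model of the kind used for
# static decisions such as loan offers. All factors and weights are
# hypothetical; a real model would be fitted to incident data.
WEIGHTS = {
    "recent_incidents_nearby": 1.2,   # hypothetical weight
    "travel_after_dark": 0.8,
    "armed_escort_required": 1.5,
    "route_travelled_before": -0.4,   # familiarity slightly lowers the score
}
BIAS = -2.0

def predicted_risk(factors):
    """Return a probability-like risk score in (0, 1) from yes/no factors."""
    score = BIAS + sum(WEIGHTS[name] * int(present)
                       for name, present in factors.items())
    return 1.0 / (1.0 + math.exp(-score))  # logistic link

trip = {
    "recent_incidents_nearby": True,
    "travel_after_dark": False,
    "armed_escort_required": True,
    "route_travelled_before": True,
}
print(round(predicted_risk(trip), 2))
```

The point such models make is the one in the text: the weights are applied identically every time, so the model is consistent where human processing of the same information is not.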
While the individual perception of risk is important, an aid organisation and its risk-management strategy may also affect the amount of risk its personnel expose themselves to. Organisations should reward good decisions even if the outcome was not what was intended. This may encourage better decision-making, a greater appreciation of risks and a reduction in risk exposure.
Risks can never be eliminated, but these are some things to think about when we are weighing up the risk:
1) Do I have the facts? a) Do I know enough to make a judgement? b) Do I need to rethink earlier decisions and adjust my estimate of the risk?
2) Am I exaggerating the risk because of past experiences (mine or others')?
3) Am I under-estimating the risk? a) Because I've been lucky so far? b) Because I've become too used to the situation? c) Because I don't want to be seen as a wimp? d) Because I think bad things won't happen to me?
4) Do I need to pull back despite the hard work that I've put in?
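For teams that keep written security procedures, the checklist above could even be captured as a simple structured review. The sketch below is hypothetical; the keys and condensed wording are mine, not a standard tool.

```python
# Hypothetical sketch: the four checklist questions condensed into a
# structured pre-departure review. Wording and keys are illustrative.
CHECKLIST = [
    ("facts", "Do I have the facts, and have I adjusted earlier estimates?"),
    ("vivid_memories", "Am I exaggerating the risk because of past experiences?"),
    ("optimism", "Am I under-estimating the risk through luck, habituation, "
                 "reputation or unrealistic optimism?"),
    ("escalation", "Do I need to pull back despite the work I have put in?"),
]

def flag_for_review(answers):
    """Return the questions whose 'yes' answer suggests a second look."""
    return [question for key, question in CHECKLIST if answers.get(key, False)]

flags = flag_for_review({"vivid_memories": True, "escalation": True})
for question in flags:
    print("-", question)
```

Even on paper, the value of such a form is that it forces each bias to be considered explicitly rather than left to memory.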
Aid workers should be made more aware of the biases that may affect decision-making, and encouraged to analyse their judgements. More information is needed about the risks aid workers face. Incidents could then be analysed to identify possible errors of judgement, and the lessons to be learned. Further research is necessary to establish whether training aid workers in risk perception leads to better decision-making, and whether statistical models could be developed to help decision-making in field situations.
References and further reading
'Thought for the Safety of Aid Workers in Dangerous Places', The Lancet, vol. 354, no. 9,179, 1999.
Mani Sheik et al., 'Deaths Among Humanitarian Workers', British Medical Journal, vol. 321, 15 July 2000.
Amos Tversky and Daniel Kahneman, 'Judgement Under Uncertainty: Heuristics and Biases', Science, vol. 185, 1974, pp. 1,124–31.
Max Bazerman, Judgement in Managerial Decision-Making (New York: John Wiley, 1986).
Koenraad Van Brabant, Operational Security Management in Violent Environments, HPN Good Practice Review 8 (London: ODI, 2000).