Why is the response to drought almost always too little, too late? Evaluations find the same failures and make the same recommendations again and again, and the response to the Horn crisis is no exception. The draft Disasters Emergency Committee (DEC) evaluation classified it as a qualified success, and highlighted the general failure of preventive action from late 2010. Much the same was said in evaluations from the Sahel in 2005 and 2010, and in Kenya in 2005/6 and 2008/9.
Whilst humanitarian response is improving in many areas, drought is not one of them. Paradoxically, we are better at responding to fast-onset crises. This means that lives, livelihoods and dignity are lost, with greater impacts on women, who generally eat last and least. Drought can also permanently retard children's development, and thus damage future generations. This is a failure of the international system, both humanitarian and development. Late response also appears to contravene the principles of Good Humanitarian Donorship, which commit donors to prevent and strengthen preparedness for disasters, and the Sphere Standards, which commit us to preventing the significant loss of livelihood assets. So where is our accountability?
Late response is also costly financially. One estimate from a previous drought in West Africa put the cost of preventing a child from suffering malnutrition at $1 per day, compared to $80 per day for treating acute malnutrition and saving that child's life.
The UN's appeal for the Horn crisis was $2.4 billion. That such a large sum was needed is not in doubt, but what is also not in doubt is that at least part of this cost was incurred because the international response to the crisis was so late. Figure 1 shows that major funding was only received from July onwards, after major media coverage of the suffering and when the UN had declared a famine in two areas of South Central Somalia.
What went wrong?
Did the early warning system (EWS) fail? The simple answer is no. The early warning systems in the Horn of Africa are now highly sophisticated. FEWSNET was born out of the Ethiopia famine of 1984, but it has come a long way since then; FSNAU is one of the most respected systems in the region, with a huge amount of information and analysis, producing high-quality output. And the Integrated Food Security Phase Classification (IPC) system has been a major step forward in regional early warning, developing standardised criteria and boosting understanding of the food security situation.
Ultimately, the early warning systems performed, but decision-makers chose not to respond. The scale (numbers of people) and depth (severity) of the crisis still caught many by surprise. There is perhaps some fine-tuning to be done to the EWS: experience suggests that the system is more likely to be used appropriately if decision-makers have a stake in it. But the fundamental problem is not the early warning system; it is the lack of response from decision-makers. They need to be challenged to develop early warning systems which they will respond to, or perhaps the money is better spent elsewhere.
So it was only when the crisis reached a tipping-point, when the March to May rains had definitively failed and the only possible trajectory was down, that the humanitarian system began to respond at scale. Arguably, the system then responded adequately, but how can we do better next time? Clearly, it is ultimately national governments that bear the responsibility for food security, and there is much work to be done in developing institutions, policies and practices to respond better to impending crises and to build resilience for the long term. In Somalia, more support needs to be provided to traditional leadership in the communities to bear this responsibility.
From an international perspective, we need to move away from standalone, quick in-and-out humanitarian interventions, which keep people alive but do little to protect livelihoods. We need to change our long-term programmes, and ensure that our humanitarian work is more preventive.
Long-term programmes must be flexible to crises and reduce risk
It is clear that, where agencies already have long-term programming, where they are already working with communities and understand their vulnerabilities, their emergency response is better; this was one of the findings of the DEC evaluation. So is it not better to explicitly combine our development and humanitarian work? Can we work to one programme with both development and emergency elements, dealing with the acute/transitory food crisis phase whilst also reducing risk and building resilience?
Drought cycle management is one practical tool that can be used to prompt a different suite of interventions in the different phases of the cycle: normal, alert/alarm, emergency and recovery. However, donors rarely fund in this holistic way, and often prefer to support work in just one of these phases. This inevitably means that work is less well connected, and also requires greater administration. There is a clear need for more advocacy with donors to break down this divide, and to encourage the use of crisis modifiers, pioneered by USAID/OFDA in Ethiopia, thus enabling a more integrated, agile and flexible approach.
Self-evidently and empirically, prevention is better than cure. However, in practice, too often long-term programmes are not disaster-proofed, and their monitoring and evaluation do not consider risk reduction. Disaster risk reduction is abysmally funded: according to the Global Humanitarian Assistance Report, DRR represents a mere 0.5% of total ODA. We do not have figures for government expenditure, but there are no indications that spending is much higher. No one argues with the principle of insurance or vaccination (paying upfront to prevent high losses), but for some reason there is less support for disaster prophylaxis.
Humanitarian work must be preventive
Currently, the humanitarian system is not finely tuned for preparedness and early response. This is partly due to overstretch (the competing demands of crises happening today will be given more weight than any crisis in the future, no matter how robust the prediction), and partly due to a lack of prioritisation and funding. This must change.
A major shift is required to manage food security risk responsibly through disaster risk reduction and early response, rather than transferring this burden to vulnerable people who are least able to cope. An organisational stance of risk management rather than risk aversion is essential in order to stimulate early response to the crisis and thereby save livelihoods as well as lives. Key is the recognition amongst practitioners, governments and donors that sometimes the predictions may be wrong, but that overall this is better risk management, and that governments and the international community, rather than poor people, should absorb this risk.
Practically, we need to work together to develop triggers for early action and an associated suite of measures that can be undertaken on the basis of forecasts, rather than certainty. Developing this together will improve donor confidence, just as the IPC has improved confidence in food security information. We need to develop 'no regrets' measures that build capacity and disaster preparedness but have no negative effect even if the worst forecasts are not realised, either because the cost is very low or because they build resilience. This would include activities such as putting human resource systems in place, developing proposals and talking to donors, building links with private sector partners and a range of practical measures such as assessing borehole operations, prepositioning stocks, conducting market assessments and mapping the capacity and coverage of traders.
Most of these ideas have been around before, and certainly the problem is well known, so why are we still struggling with these issues? Perhaps previous attempts to address the problem have only looked at certain aspects, when what we actually need to do is look at the whole system: we need to take an organisational development approach.
Figure 2 shows Oxfam's approach to organisational development; all six aspects must be addressed in order to achieve sustainable change. Currently, we are not systematically implementing integrated programming, disaster risk reduction and early response, and we need to consider what changes need to be made, in all these spheres, to make this happen.
Certainly people's skills are key. In terms of capability, do we have staff and partners who are able to build risk analysis into their work, and who are thus able to adapt what they do, and how they do it, as the situation and needs change? Have our teams developed a state of readiness or preparedness, so that they can be more dynamic in their approach to risk management and adaptable to whatever crisis occurs? Just as managing security risks is a key element of day-to-day work in insecure environments, so too should be discussing and managing other types of risk.
Leadership is of course important. Very few senior managers have strong experience in both emergency and development contexts. Have we even developed a clear understanding of the competences required of senior managers in contexts with recurrent disasters? They and others may need significant ongoing training and mentoring to maximise their skills and understanding, as well as appropriate systems to support them.
Structures can also be a major block: typically, organisations separate development and humanitarian work. What can be done to overcome the silo approach? Humanitarian and development strategies are often developed separately, whereas a risk management approach requires common thinking and planning. Practically, physical proximity (yes, it does actually make a difference where people sit!) and being part of one team matter. An effective coordination and integration approach with various mechanisms for direct cooperation, joint programming and implementation, in combination with shared learning cycles, can help to merge development and response.
There seems to be significant momentum on these issues now. Already we are seeing a much more timely response to the possible crisis in West Africa from national governments, the UN, NGOs and some donors, although funding has only just topped $100,000 and needs to increase significantly if a crisis is really to be averted.
Three issues then remain:
- How can we make the most of this momentum and embed significant changes in our organisations? Whilst West Africa is indeed showing us what early response might look like, we should not be complacent: there is still much to do to institutionalise this learning, adapt our structures and systems and invest in our staff.
- How can we get political commitment that the Horn of Africa will be the world's last famine? The Charter to End Extreme Hunger offers an opportunity to garner political and financial support.
- And finally, what happens if we are successful in West Africa? If this early action does indeed avert a crisis, will we be accused of 'crying wolf'? Aid detractors will say that we exaggerated the problem and suggest that we are not to be trusted, and thus funding for the next potential crisis will not be so forthcoming. We need to do more work on the counterfactual: we need to be able to show clearly to funders and decision-makers that the early response did prevent a crisis, otherwise we risk losing our moral standing and financial support. This is perhaps the greatest danger of getting it right in West Africa.
Debbie Hillier is Humanitarian Policy Adviser at Oxfam. The Oxfam/Save the Children report A Dangerous Delay: The Cost of Late Response to Early Warnings in the 2011 Drought in the Horn of Africa is available at http://policy-practice.oxfam.org.uk/publications.