A damaged building in Chechnya. Photo credit: ECHO

Data are not dangerous: A response to recent MSF CRASH critiques

by Abby Stoddard, Adele Harmer, and Katherine Haver
11 May 2016

In a May 4 post, Michaël Neuman of MSF CRASH warns that misleading data suggest humanitarian aid work has become more dangerous, taking particular aim at the Aid Worker Security Database (AWSD) for helping perpetuate this myth. (The AWSD, www.aidworkersecurity.org, is a long-running project of Humanitarian Outcomes that tracks major incidents of violence against humanitarian personnel – killings, kidnappings, and attacks resulting in serious injury – with data going back to 1997.) As the creators of the AWSD and longtime researchers of operational security, we welcome the chance to respond to this latest piece, as well as the book it draws from, Saving Lives and Staying Alive, co-edited by Neuman with Fabrice Weissman.

Starting with the points on which we agree:

Aid work is not becoming more dangerous overall. Correct. This is a point we have made in virtually every report released on AWSD figures. The rise in the total number of major attacks is driven by a small number of extreme cases, which for a long time has included Afghanistan and Somalia, and also currently includes South Sudan and Syria. On balance, the humanitarian presence in these countries has shrunk over successive years, as our multi-year field research study has demonstrated. Yet kidnappings, killings, and other attacks on aid workers in those countries have all increased. If we remove those cases, however, the overall attack rate remains fairly stable, in some years even declining, suggesting that in most places in the world providing humanitarian aid has become safer.

It is unfortunate that the media pays the most attention to our numbers when they are on the upswing. It is also frustrating that the nuance of our analysis is often lost in press coverage. However, it is good for political actors to be publicly reminded that their failure to resolve conflicts not only kills civilians, but also those seeking to help them. Yes, there always has been – and always will be – danger involved in aid work, but that doesn’t mean anyone should stop loudly lamenting aid worker attacks.

Context is everything. Amen. A global dataset can never be truly useful for field-level decision-making, a point we ourselves have repeatedly made. In our field research we have never met a practitioner who treated the AWSD as anything but a single reference point, with only limited relevance to their particular situation.

In this regard, we have been encouraged by the recent development of field-level platforms to track and share information on security incidents. These platforms are more comprehensive in their scope than the AWSD; for example, they record incidents of threatening letters and harassment, which may be bellwethers of changing security conditions. At the same time, the AWSD offers practitioners the ability to compare differing contexts, which can help inform the allocation of scarce resources.

Now let’s move on to the areas of the Neuman/Weissman analysis with which we fundamentally disagree.

Methodological issues. The authors’ critique is weakest in its discussion of methodology, revealing only a superficial familiarity with how the AWSD incident data are gathered, coded, and verified with the affected agencies and field security consortia. It further misstates how the aid worker population denominator is estimated to calculate attack rates. What is most disappointing about the book and associated articles, however, is that they do not attempt to provide a rigorous technical critique of the methodology or suggest more robust alternatives. Rather, the authors take a curiously ideological stance against global data-gathering and the formalised practice of risk management.

The article correctly observes that ‘there is no consensus on what defines humanitarian work. It is often very hard to know on what basis the employee of a humanitarian organization was targeted by violence. Was it as a private individual, as someone from a particular country, as a member of a relief organization, or something else?’ This conflates the separate questions of unit of analysis and motive, but it is certainly true that humanitarian work and aid workers can be defined very differently depending on whom you speak to. The AWSD uses a very specific definition, but for the purpose of measuring trends accurately, what matters most is that definitions be clear and internally consistent.

The criticism is that this definition encompasses ‘a very wide range of people’, leading to ambiguous findings. But this ignores the fact that the AWSD data are sorted – and hence can be disaggregated – by type of agency affected (UN, Red Cross/Crescent, INGO, NNGO), the gender and type of staff (national or international), as well as the means and context of the attacks and presumed motives (where possible to state). All of these data are verified with the agency concerned on an annual basis.
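To make the point concrete, disaggregation of this kind is simply filtering on recorded fields. The sketch below is a minimal illustration; the field names, values, and the `disaggregate` helper are invented for this example and are not the AWSD's actual schema or tooling.

```python
# Hypothetical incident records with fields of the kind the AWSD publishes.
# These names and values are illustrative only.
incidents = [
    {"country": "Somalia",     "agency": "INGO", "staff": "national",      "means": "kidnapping"},
    {"country": "Afghanistan", "agency": "UN",   "staff": "international", "means": "shooting"},
    {"country": "Somalia",     "agency": "NNGO", "staff": "national",      "means": "shooting"},
]

def disaggregate(records, **criteria):
    """Return only the incidents matching every given field/value pair."""
    return [r for r in records
            if all(r.get(field) == value for field, value in criteria.items())]

# e.g. incidents affecting national staff, or NNGO incidents in Somalia
national_staff_cases = disaggregate(incidents, staff="national")
somalia_nngo_cases = disaggregate(incidents, country="Somalia", agency="NNGO")
```

Because every incident carries these fields, findings need not be ambiguous: any combination of agency type, staff type, means, or context can be isolated and counted.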

‘Studies that attempt to calculate attack and victim rates… are hampered by the lack of a reliable denominator.’ Indeed, this is the most difficult component of all our data initiatives. For this reason, each year we endeavor to make a more rigorous estimate of the aid worker population. In pursuit of this aim, we have developed a methodology that combines human research and a technique called conditional mean imputation, which divides NGOs into similarly sized tiers; tier averages are then used to extrapolate missing data. Rather than making the imputation algorithm proprietary, we have published it in studies and online, and shared it extensively among colleagues in the sector, to encourage feedback and constructive critique to help guide its continual development. To date, the methodology has been peer-reviewed by three different statisticians who have deemed it sound, but we still feel it could be stronger. Our invitation to colleagues to improve upon these methods remains open.
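The tier-based imputation described above can be sketched in a few lines. This is a simplified illustration of the general technique, not the published AWSD algorithm: the tier labels, staff figures, and the `impute_denominator` function are invented for the example, and it assumes each tier contains at least one organisation with a known staff count.

```python
from statistics import mean

# Hypothetical NGO records: each has a size tier; some staff counts are unknown.
ngos = [
    {"name": "NGO A", "tier": "large",  "staff": 5200},
    {"name": "NGO B", "tier": "large",  "staff": None},  # unknown -> impute
    {"name": "NGO C", "tier": "medium", "staff": 800},
    {"name": "NGO D", "tier": "medium", "staff": 600},
    {"name": "NGO E", "tier": "small",  "staff": 40},
    {"name": "NGO F", "tier": "small",  "staff": None},  # unknown -> impute
]

def impute_denominator(records):
    """Fill missing staff counts with their tier's mean, then total them."""
    tier_means = {}
    for tier in {r["tier"] for r in records}:
        known = [r["staff"] for r in records
                 if r["tier"] == tier and r["staff"] is not None]
        tier_means[tier] = mean(known)  # assumes at least one known value per tier
    return sum(r["staff"] if r["staff"] is not None else tier_means[r["tier"]]
               for r in records)

population = impute_denominator(ngos)
# Attack rate per 10,000 aid workers, for an illustrative count of 3 incidents
attack_rate_per_10k = 10_000 * 3 / population
```

The design choice is that missing values are conditioned on tier rather than replaced with a single global mean, so a small NGO with unknown staffing is not inflated by the figures of the largest organisations.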

Give aid workers a bit more credit. Rather puzzlingly, the authors seem to implicate the practices of incident tracking and risk management in what they see as increased risk aversion in the field. Apart from not presenting empirical evidence to support their assertion, they make some illogical assumptions about the influence and consequences of these practices and ‘global figures’ like AWSD. According to the book, ‘global figures convey the misleading notion that the violence is a global phenomenon obeying general laws.’ Do they? We would argue that this is no more the case than knowing the number of war casualties would lead one to think that all wars are caused by the same thing. Numbers are neutral, and the fact that they can be misinterpreted or misused is no reason to dispense with them.

Similarly, the book implies that formalised systems of risk management, increasingly used by aid agencies, do more harm than good. Such systems, the authors maintain, bureaucratise decision-making and rob aid workers of their independence and ingenuity. This runs counter to the widespread opinion of both national and international humanitarian field workers, as evidenced in surveys and hundreds of interviews across multiple studies. Practitioners understand perfectly well that guidelines, such as the widely used Good Practice Review 8, are simply tools. It is ungenerous, at best, to suggest that the typical aid worker will abandon their own judgment and personal agency to blindly follow a manual, or will allow their situational awareness to be solely determined by numbers in a global dataset. And it betrays a lack of in-depth inquiry on how these measures are actually applied in the field.

Humanitarian values and risk management are not mutually exclusive. Fears that humanitarian action can be compromised by risk aversion and creeping corporatism are not unjustified. But fretting, without persuasive evidence, that risk management ‘has led to disenchantment with humanitarian action, whose chivalrous spirit has been drowned in the icy waters of actuarial calculation and remote control’ just indulges romantic notions. The qualities of altruism, empathy, and compassion, which underpin humanitarian action, are not threatened by a clear-eyed appraisal of risks, and the means to manage them.

Making the problem about datasets misses the mark. The authors are right to point out that the humanitarian sector needs to get better at contextual analysis. But placing the blame on global databases and risk assessment templates is off base. The more direct causes – as over two years of sustained field research in our forthcoming study will show – are a lack of investment in negotiations with armed actors, inappropriate staff profiles, and the under-development of necessary skillsets.

Some final words on the construction and maintenance of datasets and denominators. It’s boring. It’s painstaking, tedious and time-consuming. On the whole, it is far more satisfying to write opinion pieces and to follow the ‘research by chatting’ model that is standard in much of the humanitarian literature. But before the AWSD existed there was much rhetoric and speculation about the ‘shrinking of humanitarian space’ and the increasing targeting of aid workers, with virtually nothing in the way of supporting evidence. We would argue that the AWSD has contributed to the sector knowing a bit better what it is talking about. The answer is never less information, but more and better information, and more thoughtful interpretations of it.


Abby Stoddard, Adele Harmer, and Katherine Haver are Partners in Humanitarian Outcomes (@HumOutcomes), an independent team of specialist consultants providing research and policy advice for humanitarian aid agencies and donor governments.


    The numbness of numbers

    Michaël Neuman & Fabrice Weissman

    We welcome Abby Stoddard, Katherine Haver and Adele Harmer’s response to our critical article on the production and use of security data in the humanitarian sector, and to our book in general (accessible online at http://msf-crash.org/livres/en/saving-lives-and-staying-alive/humanitarian-security-in-the-age-of-risk-management). In a field that has been very much lacking in debate, if not controversy, we are extremely glad to see such a varied range of readers engaging in the discussion. After reading their comments, we might have to acknowledge that what we have here are two very distinct visions of what humanitarian security should look like. These two positions might not be irreconcilable, but for now they are quite far from one another. On the one hand, the Humanitarian Outcomes consultants behind the development of the Aid Worker Security Database believe in the intrinsic value of numbers in making humanitarian workers, policy makers and the general public aware of the ‘occupational’ risks, just as they believe in contemporary risk management as the right way to minimize those risks. On the other hand, we, as the authors of “Saving Lives and Staying Alive”, are wary of both.

    In their response, the three authors take issue with our way of analyzing their work. Rather than engaging in an endless quote-based exchange, we would advise readers interested in the issue to read the book – especially the chapters dedicated to numbers and guidelines (http://msf-crash.org/livres/en/saving-lives-and-staying-alive/2-theories) – and make up their own minds.

    Central to their argument, however, is the idea that “numbers are neutral” and guidelines “simply tools”. Using analytical frameworks developed by sociologists such as Alain Desrosières (http://www.hup.harvard.edu/catalog.php?isbn=9780674009691), Theodore M. Porter (http://press.princeton.edu/titles/5653.html#reviews) and Jean-Pierre Olivier de Sardan (http://press.uchicago.edu/ucp/books/book/distributed/A/bo20847647.html), we argue the contrary. We show that datasets are not neutral from a methodological, ethical or political point of view. The definition and encoding of the measured variables are the result of conventions, negotiations, interpretations, power struggles and influence strategies that are historically situated. In this regard, we demonstrate that the AWSD, like other exercises in quantifying insecurity, is primarily driven by the desire to “support with evidence” three preconceived ideas: that the “humanitarian space is shrinking”, that aid actors are increasingly targeted for political reasons, and that humanitarian workers should entrust the management of their security to professional experts.

    Having deconstructed the “shrinking humanitarian space” and the “blurring of lines” narratives in our previous book (Humanitarian Negotiations Revealed: The MSF Experience, http://msf-crash.org/livres/en/humanitarian-negotiations-revealed), we dedicate our most recent opus mainly to the neoliberal security-management ideology supported by manuals such as the GPR8. In this regard, Abby Stoddard, Katherine Haver and Adele Harmer are right to say that most aid personnel do not blindly follow such security manuals and still use their own judgment and personal agency. But, as we illustrate through three empirical studies (http://msf-crash.org/livres/en/saving-lives-and-staying-alive/3-practices) and two historical accounts (http://msf-crash.org/livres/en/saving-lives-and-staying-alive/1-history), they do so despite the tremendous pressure toward centralization and authoritarian control of their behavior and public expression promoted by security manuals.

    Once more, we would invite all those interested, especially aid volunteers working at headquarters and field level, to read the book and tell us whether we are so far from their experience.

    Michaël Neuman and Fabrice Weissman are directors of studies at MSF-Crash. They can respectively be found on Twitter as @mikafromparis and @fabweissman. They co-edited “Saving Lives and Staying Alive. Humanitarian Security in the Age of Risk Management” (Hurst & Co, 2016) and, with Claire Magone, “Humanitarian Negotiations Revealed. The MSF experience” (Hurst & Co, 2016). MSF-Crash tweets as @MSF_Crash.