Accountability: the DEC’s experience
The core function of the Disasters Emergency Committee (DEC) is to raise funds from the public on behalf of our members following a major emergency. The DEC has 14 members: ActionAid, Age UK, the British Red Cross, CAFOD, CARE International UK, Christian Aid, Concern, Islamic Relief, Merlin, Oxfam, Save the Children, Tearfund, World Vision and Plan International UK. These funds are allocated according to a formula, recalculated annually, that takes account of the relative size of each agency, based on their overseas expenditure. The DEC is, therefore, not a donor but a funder, and has a responsibility to account for how funds are distributed and used down to community level.
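The sketch below illustrates how a pro-rata split of this kind might work; the figures, agency names and the simple proportional rule are hypothetical, since the DEC's actual formula and its weightings are recalculated annually and not detailed here.

```python
# Illustrative sketch only: the DEC's real allocation formula is recalculated
# annually and is not described in this article. This simply splits an appeal
# total in proportion to each member's overseas expenditure.

def allocate_appeal_funds(appeal_total, overseas_expenditure):
    """Share out an appeal total in proportion to members' overseas expenditure."""
    total_expenditure = sum(overseas_expenditure.values())
    return {
        member: appeal_total * spend / total_expenditure
        for member, spend in overseas_expenditure.items()
    }

# Hypothetical figures in GBP millions.
shares = allocate_appeal_funds(
    appeal_total=50.0,
    overseas_expenditure={"Agency A": 120.0, "Agency B": 60.0, "Agency C": 20.0},
)
print(shares)  # Agency A: 30.0, Agency B: 15.0, Agency C: 5.0
```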
To demonstrate accountability, the DEC used to commission external joint evaluations. These were generally popular within the sector, but their generalised findings did little to effect change. Nor did they help build trust with the public, as journalists tended to focus on negative conclusions. This led the DEC's Trustees to consult on an alternative, and a new model for accountability was developed. The last DEC external evaluation, of the Niger crisis response, was completed in 2007.
The DEC Accountability Framework, known as DECAF, has now been in use for four years. Over those four years, many DEC members have strengthened their approaches to programme management. This in turn has increased their ability to deliver programme objectives and given them a lever for improving accountability to beneficiaries. This is not to say that there is nothing more to be done: embedding learning and making the participation of disaster-affected communities meaningful are just two of the challenges agencies face in fast-moving, complex humanitarian emergencies.
Developing the DECAF
Led by the DEC Secretariat in consultation with member agencies, a set of principles was agreed to guide the development of the framework:
- Encourage learning and accountability, whilst recognising the tensions between them.
- Build on members' own performance management and reporting systems.
- Maintain independent scrutiny by engaging external consultants.
- Ensure that the investment made in the framework delivers benefits for member agencies and the DEC.
A set of objectives for the framework was also formulated to take account of the need to consider accountability in the round and to incorporate the prevailing thinking on the subject:
- Strengthen the Secretariat's accountability to donors and member agencies' accountability to the intended beneficiaries of emergency responses.
- Enable the Board to hold DEC members to account for the effective use of DEC funds and for adherence to DEC membership criteria.
- Establish mechanisms to measure and improve the performance of the DEC Secretariat.
- Maintain an independent overview of collective DEC performance.
- Provide a trusting environment for lesson learning/sharing between members in order to improve humanitarian responses by separating this from accountability mechanisms.
- Improve the quality of reporting by members and increase the volume of reporting the DEC puts into the public domain.
In a series of workshops, DEC members identified their key priorities for ensuring a quality emergency response. These were described through 35 ways of working under five priority areas. As far as possible, harmony was sought with other accountability initiatives such as the Humanitarian Accountability Partnership (HAP), with which the DEC shares several members, and Groupe URD's Quality Compass, with elements of each included. The original five priorities were:
- We use funds as stated.
- We achieve intended programme objectives and outcomes.
- We are committed to agreed humanitarian principles, standards and behaviours.
- We are accountable to beneficiaries.
- We learn from our experiences.
Since 2007, agencies have assessed themselves against the ways of working. Assessments are rated Red, Amber or Green: Red indicates the absence of any policy or procedure; Amber confirms that a policy or procedure exists, with some evidence of application; and Green shows that an assurance mechanism is in place to ensure that the way of working is systematically implemented.
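As a rough illustration, the following sketch encodes the Red/Amber/Green logic described above; the field names, and the treatment of a policy that shows no evidence of application, are assumptions rather than part of the DECAF documentation.

```python
# A minimal sketch of the RAG logic described above, assuming each way of
# working is assessed on three questions. Field names are illustrative.

from dataclasses import dataclass

@dataclass
class WayOfWorkingAssessment:
    has_policy_or_procedure: bool   # a relevant policy or procedure exists
    evidence_of_application: bool   # some evidence it is being applied
    assurance_mechanism: bool       # a mechanism ensures systematic implementation

def rag_rating(assessment: WayOfWorkingAssessment) -> str:
    """Translate the DECAF criteria into a Red/Amber/Green rating."""
    if not assessment.has_policy_or_procedure:
        return "Red"
    if assessment.assurance_mechanism:
        return "Green"
    if assessment.evidence_of_application:
        return "Amber"
    # A policy with no evidence of application is treated here as Red;
    # this edge case is an assumption, as the article does not spell it out.
    return "Red"

print(rag_rating(WayOfWorkingAssessment(True, True, False)))  # Amber
```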
To justify their ratings, members are required to submit evidence of policies or procedures, of application and of assurance, drawn from recent DEC-funded emergency responses. Central to the process is the requirement for each member to set out their improvement commitments, in line with their own strategic objectives. Evidence is scrutinised and delivery against improvement commitments tracked by an external validator, a role performed by Ernst & Young for the first three years and now by One World Trust.
In addition to these annual assessments, DECAF encompasses a number of activities designed to promote best practice, learning and accountability. The agreed priorities span the four main pillars of the framework (as in Figure 1).
- Annual self-assessments against the DEC accountability priorities and the ways of working.
- Appeal-specific reporting on plans and progress against these.
- A rotating system of independent external evaluation.
- Collective learning activities designed to ensure that DEC members share learning and experiences.
Taking stock
Whilst the core of the assessments did not change over the first three years, the DEC Secretariat continually sought feedback from members. The key messages were that, at headquarters, evidence collection was resource-heavy, while for field offices the process was perceived as largely extractive and therefore offered no obvious benefits at programme level.
During 2010, the Secretariat facilitated an in-depth review of DECAF, drawing on workshops and a survey of member agency staff as well as discussions with Trustees and external stakeholders. Four clear conclusions emerged from this process:
- The DECAF assessments had led to improvements, particularly around systems for learning and accountability to beneficiaries. However, the priorities needed updating to reflect the new frontiers of best practice and to harmonise better with other accountability initiatives.
- The process of the annual DECAF assessments needed adjusting in order to reduce the administrative burden and make the ratings more consistent.
- The DEC's learning activities (including workshops and collective initiatives) had been successful forums for sharing learning, and there was scope for future joint learning.
- The DEC could do more to share information with and be accountable to the public.
DECAF 2: refinement and reformulation
In response to this feedback, changes have been made to both the DECAF assessment process and to the ways of working themselves.
Two refinements were made to the DECAF process:
- Sampling. The validators now select a sample of five to seven ways of working, pulling out those they feel would benefit from particular scrutiny, with evidence now provided to support this subset rather than for all of the ways of working. This reduces the burden and allows for more in-depth analysis of the issues, which the validators can then turn into clear feedback for each agency.
- Peer challenge. In workshops facilitated by the validators, two or three agencies challenge each other over the strength of their systems and the consistency of their ratings. Trustees also take part in separate peer challenge sessions, which bring together two member trustees (i.e. agency chief executives) with an independent trustee to discuss the key issues arising from that year's assessment.
While generating consensus over changes to the process was relatively straightforward, formulating a revised list of ways of working, which all members agreed represented current good practice, was more challenging.
A recurring theme in these discussions was the diversity of the DEC membership. Our agencies are united by their commitment to humanitarian work, but can be divided along various axes when it comes to how they deliver this work. For example, some agencies operate directly in the field while others work only through local partners; some have limited influence over their global families while others sit at the top of international structures; and some are the giants of the sector while others are smaller, with fewer resources and perhaps a narrower focus in terms of location or sector. All of these factors affect what the UK-based agencies see themselves as accountable for and, critically, by whom they can be held to account.
To address these issues we took a further step towards the other initiatives in which our members are engaged: mirroring the HAP benchmarks within our 'accountability to beneficiaries' ways of working; incorporating ideas from the Sphere Core Standards around coordination, collaboration and local capacity; and making explicit reference to People in Aid. To identify the ways of working which sit beyond the scope of these initiatives, we concentrated on areas where agencies had common aspirations (e.g. value for money) and where the first three years of DECAF showed there was still progress to be made (e.g. learning). While this process led to a very different set of ways of working, at the priority level there was only one significant change: a move away from 'using funds as stated' towards a more challenging commitment to using resources efficiently and effectively. This new priority incorporates the themes of value for money, utilising local capacity and ensuring that staff can work effectively (see Table 1).
In December 2010, the DEC Board agreed the new list of 21 ways of working, reduced from the initial 35. Members will self-assess against these for the first time in 2011-12.
Next steps
After a year of reflection and review, we are now looking forward to the first year of DECAF Mark 2 assessments. We are also thinking about two broader challenges which need to be addressed.
First, how can we be more accountable to the public? The DEC is designed to offer a unified appeal for the public at times of emergency, and we recognise that with this comes a responsibility to tell the public the full story: to give positive feedback on what their generosity has delivered, but also to explain the challenges. The DEC has just launched a new website (www.dec.org.uk), and we plan to use it to put more information into the public domain in an accessible way, so that our work on quality and accountability can build the public's trust and educate our donors about the complexity of humanitarian work.
The more abstract challenge, which the DECAF assessment process throws into the spotlight, is the complexity of accountability within large organisations. The DECAF assessments demand that agencies explain how they know that the high standards they strive for are being reached on the ground. With increasingly global structures, overseas sister agencies and myriad partners, for many agencies lines of accountability now lead to points outside their direct control. Given this, the question of what effective global accountability looks like, and how change can be achieved within global organisations, is critical to our improvement agenda. While we don't have all the answers here at the DEC, we are looking forward to tackling these questions in conjunction with our members.
Annie Devonport is Humanitarian Programme Advisor at the DEC. Cait Turvey Roe is the DEC Accountability and Audit Manager.