Issue 77 - Article 12

The evolution of a monitoring framework for the Ebola outbreak response in Kivu and Ituri provinces, 2018–2019

March 25, 2020
Emanuele Bruni, Chiara Altare, Nabil Tabbal, Silimane Ngoma and Ibrahima Socé Fall
Photo: Two surveillance officers discussing Key Performance Indicators in the Emergency Operations Centre in Beni, North Kivu.

The Ebola outbreak in North Kivu and Ituri has been one of the most difficult experienced by the Democratic Republic of Congo (DRC). The second-largest Ebola outbreak ever recorded, it has affected remote areas and urban centres bordering neighbouring countries, and has been exacerbated by a volatile context of insecurity and lack of community acceptance. The DRC government, Ministry of Health and World Health Organization (WHO) led a coordinated response by national and international partners to limit the spread of the disease and treat existing cases. Based on the experience of the 2014 outbreak in West Africa and the 2018 outbreak in Équateur province in DRC, the response was organised and implemented through the Incident Management System (WHO, Emergency Response Framework, Geneva, 2017: https://apps.who.int/iris/bitstream/handle/10665/258604/9789241512299-eng.pdf?sequence=1) and under the umbrella of a joint Strategic Response Plan (SRP; see SRP1: www.who.int/emergencies/crises/cod/DRC-ebola-disease-outbreak-response-plan-15May2018-1025.pdf; SRP2: www.who.int/emergencies/crises/cod/drc-srp-revised-v22december2018-EN-vF.pdf?ua=1; SRP3: www.who.int/emergencies/crises/cod/drc-ebola-srp-v20190219-en.pdf; SRP4: www.who.int/docs/default-source/documents/drc-srp4-9august2019.pdf?sfvrsn=679e4d26_2) encompassing activities within and beyond public health. These activities have been grouped in sub-pillars: surveillance (including contact tracing); infection prevention and control (including safe burials); case management; vaccination; operational support and logistics; psycho-social support; social mobilisation, community engagement and risk communication (including anthropological studies); laboratory and diagnostics; other basic health services; and security.

The evolution of the epidemic has been closely monitored through the extensive collection and analysis of epidemiological data to track cases, follow contacts, understand epidemiological links, map the spread of the outbreak and identify risk factors. In parallel, a monitoring framework was developed to provide operational and strategic analysis and enable partners and donors to follow up on response outcomes. While some attempts were made to clarify the link between response activities and Ebola incidence during the West Africa outbreak, standardised operational data from outbreak responses has usually been lacking. The monitoring framework currently being used in North Kivu and Ituri therefore represents one of the first attempts to use a harmonised, multisectoral and real-time monitoring system that allows the linking of response activities to short- and medium-term impacts. This article describes the process behind the development of the monitoring framework and its key components.

The evolution of the monitoring framework for the DRC Ebola response in Kivu and Ituri

Monitoring frameworks usually comprise components that together look at inputs, outputs, outcomes and impacts. During the Ebola response, these components have been developed at different times to address operational and strategic needs. Below is a chronological narrative of this process.

Designing and implementing the monitoring framework (August–October 2018)

The first step was to define a set of key outcome/performance indicators (KPIs) to monitor how well the response was achieving its results. Three to four key indicators per sub-pillar were defined and measured on a weekly basis. These indicators were derived from the SRP and were chosen, through a consultative selection process, from the ones used in the recently closed response in Équateur.

The second step focused on tracking the level of operationalisation of sector-specific activities against partner presence (4W: who does what, where and when). Learning from the Équateur response, an activity monitoring system was put in place and adapted to improve the level of monitoring and promote accountability. Through a collaborative and consultative process, actors agreed on criteria describing the aspects necessary for the implementation of each activity in terms of human resources, assets and activity implementation. For example, to measure the functionality of contact tracing, the following essential criteria needed to be in place: a system for identifying and tracking contacts; active and functional teams; a functional database; and daily validation of the contact search through a spot check. Based on these criteria, an algorithm was developed that measured whether an activity was fully, partially or not operational. Results were translated into a colour-coded visualisation (operational = green; partially operational = yellow; non-operational = red) and shared weekly.
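
As a rough illustration of this approach, the sketch below (in Python) classifies one activity in one health zone from its essential criteria and maps the result to the colour code described above. The criteria names, the data structure and the all-met/some-met/none-met rule are assumptions built around the contact-tracing example in the text; they stand in for, and do not reproduce, the algorithm actually agreed by the partners.

```python
# Illustrative sketch only: criteria names and the classification rule are
# assumptions, not the algorithm agreed by the response partners.

# Essential criteria for contact tracing, as listed in the article.
CONTACT_TRACING_CRITERIA = [
    "identification_and_tracking_system",
    "active_and_functional_teams",
    "functional_database",
    "daily_spot_check_validation",
]

# Colour code used in the weekly visualisation.
COLOURS = {
    "operational": "green",
    "partially operational": "yellow",
    "non-operational": "red",
}


def classify_activity(criteria_met: dict) -> tuple:
    """Return (status, colour) for one activity in one health zone.

    Assumed rule: all essential criteria in place -> operational;
    at least one in place -> partially operational; none -> non-operational.
    """
    met = sum(bool(v) for v in criteria_met.values())
    if met == len(criteria_met):
        status = "operational"
    elif met > 0:
        status = "partially operational"
    else:
        status = "non-operational"
    return status, COLOURS[status]


# Hypothetical weekly assessment of contact tracing in one health zone.
example_assessment = {criterion: True for criterion in CONTACT_TRACING_CRITERIA}
example_assessment["daily_spot_check_validation"] = False  # spot check missing this week
print(classify_activity(example_assessment))  # ('partially operational', 'yellow')
```

In practice the agreed criteria differed by activity and sub-pillar; the point of the sketch is only that an explicit, shared rule turns heterogeneous field observations into a comparable weekly status.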

Initial products and Information Management Working Group (IMWG) (November–December 2018)

The need for more coordinated information-sharing encouraged the establishment of an inter-agency Information Management Working Group, comprising, among others, the MoH, WHO, UNICEF, IFRC, CDC, Oxfam and IOM, and facilitated by OCHA. The first task of the IMWG was the elaboration of an information management strategy, including the definition of the products to be published and approaches to the visualisation of KPIs and data analysis. The group facilitated interaction among agencies for a variety of activities, including refinement of the KPIs, the activity criteria and the algorithm, and led to the conception of new multisectoral tools such as the Infection Prevention and Control (IPC) Scorecard, which helps identify facilities in greater need of technical support to reduce infection risks. During this phase, the partners held a series of meetings to finalise the Information Management Strategy.
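
The article does not describe the Scorecard's methodology, so the sketch below is only a hypothetical illustration of the underlying idea: scoring each facility against a set of IPC items and flagging those below a threshold for technical support. The item names, the 0–2 scoring scale and the 50% threshold are all assumptions, not the actual Scorecard.

```python
# Hypothetical sketch of a facility-level IPC scorecard: item names, weights
# and the threshold are assumptions, not the methodology used in the response.

from statistics import mean

# Hypothetical assessments: each item scored 0 (absent), 1 (partial) or 2 (adequate).
facility_assessments = {
    "Health centre A": {"triage": 2, "hand_hygiene": 2, "waste_management": 1, "isolation_capacity": 2},
    "Health centre B": {"triage": 1, "hand_hygiene": 0, "waste_management": 0, "isolation_capacity": 1},
}

SUPPORT_THRESHOLD = 0.5  # assumed cut-off: below 50% of the maximum possible score


def ipc_score(items: dict, max_per_item: int = 2) -> float:
    """Overall IPC score as a share of the maximum possible score."""
    return mean(score / max_per_item for score in items.values())


def needs_support(items: dict) -> bool:
    """Flag a facility for prioritised technical support."""
    return ipc_score(items) < SUPPORT_THRESHOLD


for facility, items in facility_assessments.items():
    flag = "needs support" if needs_support(items) else "adequate"
    print(facility, round(ipc_score(items), 2), flag)
```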

Digitalisation and refinement (January–May 2019)

The third major step was linked to the development of electronic tools throughout the data cycle, from data collection to dissemination of results. This included:

  • Data collection: switching from paper forms to electronic data capture, using ODK (Open Data Kit) technology, for both activities and KPIs. This technical development had a major impact on the timeliness, completeness and quality of data, enabled better control of the data collection process and provided additional technical features such as geolocalisation (the process of determining the location of an object or place in terms of geographical coordinates).
  • Data analysis: data was analysed through written scripts using statistical software (Stata) and GIS software for geographical analysis. This allowed for reproducibility of the analysis and comparability of data over time (between epidemiological weeks) and space (across geographical zones); a minimal sketch of this kind of scripted analysis follows this list.
  • Data visualisation: results were visualised using Microsoft Power BI, which allowed for real-time availability. Data can be filtered for pillar, location and time, making it possible to see, for example, how KPIs evolved during a given month, or how the level of operationality of a given activity differed between health zones.
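
The response's actual pipeline used Stata scripts, GIS software and Power BI; the Python sketch below is an illustrative stand-in, not the response's code. The column names and the example KPI (the share of registered contacts seen within 24 hours, by health zone and epidemiological week) are assumptions chosen to show the principle: re-running the same script on each week's export gives results that are comparable over time and across zones.

```python
# Minimal, illustrative sketch of scripted, reproducible KPI analysis.
# Column names and the example KPI are assumptions, not the response's scripts.

import pandas as pd

# Record-level data as it might arrive from electronic data capture exports.
records = pd.DataFrame(
    {
        "epi_week": ["2019-W05", "2019-W05", "2019-W05", "2019-W06"],
        "health_zone": ["Beni", "Beni", "Katwa", "Beni"],
        "contacts_registered": [120, 80, 95, 110],
        "contacts_seen_24h": [110, 60, 90, 100],
    }
)


def contact_followup_kpi(df: pd.DataFrame) -> pd.DataFrame:
    """Share of registered contacts seen within 24 hours, by health zone and epi week."""
    grouped = df.groupby(["epi_week", "health_zone"], as_index=False)[
        ["contacts_registered", "contacts_seen_24h"]
    ].sum()
    grouped["kpi_followup_24h"] = grouped["contacts_seen_24h"] / grouped["contacts_registered"]
    return grouped


# Running the same script on every weekly export keeps results comparable;
# the output table can feed a dashboard for filtering by week and zone.
print(contact_followup_kpi(records))
```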

The launch of the UN scale-up strategy and the declaration of a Public Health Emergency of International Concern (May 2019–February 2020)

The spike in the number of cases and a series of security incidents led to a major shift in the governance of the response with the launch of the UN scale-up strategy in May 2019. This reflected the need for a system-wide response to contain and end the outbreak, going well beyond a public health approach. Implementation of the scale-up strategy was directed by the Ebola Emergency Response Team (EERT), chaired by the Emergency Ebola Response Coordinator (EERC) and the WHO Assistant Director General for Regional Emergencies.

The EERT coordinated the implementation of UN support to the DRC government across five pillars addressing public health priorities (pillar 1) and an enabling environment for a safe and effective response: strengthened political engagement, security and operations (pillar 2); strengthened support to communities affected by Ebola (pillar 3); strengthened financial planning, monitoring and reporting (pillar 4); and strengthened preparedness for surrounding countries (pillar 5). This governance shift meant that OCHA officially took over the overall coordination of information management, which had previously been shared with WHO. As a consequence, WHO's M&E activities could focus on pillars 1 and 5 and be extended in terms of frequency (many products started to be published on a daily instead of a weekly basis); new products were designed to respond to the specific needs of WHO M&E for pillar 1 (such as daily briefs, heatmaps and security incident monitoring); and new sectoral tools were implemented, such as the IPC Scorecard (a health facility-based evaluation assessing IPC). Following the classification of the outbreak as a Public Health Emergency of International Concern (PHEIC) in July 2019, a specific framework was created to monitor indicators at international level in line with pillar 5.

Utilisation of the data and IM products

A variety of products designed to respond to the specific information needs of different actors are published daily (daily brief, heatmap, scorecard, incidents); weekly (activity evaluations, key performance indicators); and monthly (input and output analysis) (see Table 1). These products provide operational information to decision-makers, as well as more comprehensive information to support strategic planning.

Challenges and successes

The development and implementation of the M&E framework for the Ebola response was fraught with difficulties, ranging from the lack, at the outset, of a standardised outbreak response framework that could be quickly deployed, to the complexity of integrating data from multiple actors and sectors, to the low appreciation of the role of M&E data in a health emergency. This led to delays in implementation and missed opportunities for more evidence-based decision-making throughout the emergency.

Table 1: Overview of the information management products used during the response to the Ebola outbreak in North Kivu and Ituri, 2018–2020

Even so, the progress that has been achieved is important. The establishment of the IMWG facilitated consultations among partners, and the participatory definition and revision of indicators, criteria and data collection tools increased the acceptability of the system. Efforts to streamline the data cycle through the development and implementation of electronic data capture tools, statistical scripts and visualisation dashboards allowed for real-time analysis and visualisation. The toolkit that has been designed and piloted represents an excellent starting point for future adaptations in other outbreaks.

Conclusion

In the context of outbreaks in disrupted health systems (WHO, Analysing Disrupted Health Sectors: A Modular Manual, 2009, www.who.int/hac/techguidance/tools/disrupted_sectors/en/), a package of health indicators monitoring the performance and status of implementation of multilevel interventions (Emanuele Bruni et al., 'A Package for Monitoring Operational Indicators of the Response to the Outbreak of Ebola Virus Disease in the Democratic Republic of the Congo', Weekly Epidemiological Record, 94(3), 2019, https://apps.who.int/iris/handle/10665/279759) can identify the strengths and weaknesses of the response, inform decision-making, refine improvement strategies, provide lessons learned and improve accountability to affected populations. The implementation of a multisectoral and digitalised monitoring framework has helped raise awareness among response stakeholders of the added value of monitoring inputs, outputs and the status of activities, as well as performance indicators, to complement epidemiological data. The monitoring and evaluation framework has increasingly been incorporated into decision-making at operational, strategic and planning levels, and has become an integral component of strategic response plans. Inter-agency efforts are now needed to ensure that response planning for future health emergencies builds on this experience, and that a performance-oriented and monitoring-driven approach is adopted from the outset of an emergency.

Emanuele Bruni is Technical Officer (Planning and M&E), Health Emergency Programme, World Health Organization. Chiara Altare is Assistant Scientist, Centre for Humanitarian Health, Johns Hopkins Bloomberg School of Public Health. Nabil Tabbal is Information Management Team Lead, World Health Organization. Silimane Ngoma is Monitoring and Evaluation Analyst, World Health Organization. Ibrahima Socé Fall is Assistant Director General, Health Emergency Programme, World Health Organization.
