Developing monitoring and evaluation tools for event-based surveillance: experience from Vietnam
Globalization and Health volume 16, Article number: 38 (2020)
In 2016–2017, Vietnam’s Ministry of Health (MoH) implemented an event-based surveillance (EBS) pilot project in six provinces as part of Global Health Security Agenda (GHSA) efforts. This manuscript describes the development and design of tools for monitoring and evaluation (M&E) of EBS in Vietnam.
A strategic EBS framework was developed based on the EBS implementation pilot project’s goals and objectives. The main process and outcome components were identified and included input, activity, output, and outcome indicators. M&E tools were developed to collect quantitative and qualitative data. The tools included a supervisory checklist, a desk review tool, a key informant interview guide, a focus group discussion guide, a timeliness form, and an online acceptability survey. An evaluation team conducted field visits to assess EBS 5–9 months after implementation.
The quantitative data collected provided evidence on the number and type of events that were being reported, the timeliness of the system, and the event-to-signal ratio. The qualitative and subjective data collected helped to increase understanding of the system’s field utility and acceptance by field staff, reasons for non-compliance with established guidelines, and other factors influencing implementation.
The use of M&E tools for the EBS pilot project in Vietnam provided data on signals and events reported, timeliness of reporting and response, perceptions and opinions of implementers, and fidelity of EBS implementation. These data were valuable for Vietnam’s MoH to understand the function of the EBS program, and the success and challenges of implementing this project in Vietnam.
All member states of the World Health Organization (WHO) are required by the International Health Regulations (IHR, 2005) to have an early warning and response (EWAR) system. In order to support implementation of the IHR, the Global Health Security Agenda (GHSA) was launched in February 2014. The GHSA strengthens both global and national capacity to prevent, detect, and respond to infectious threats [2, 3].
IHR (2005) emphasize development of capacities for collecting and analyzing information from sources outside the health system itself [1, 4], including detection of unusual events that may represent emerging threats. Event-based surveillance (EBS) is a component of EWAR that can significantly improve the sensitivity of a system to detect emerging outbreaks [5, 6]; effective monitoring and evaluation (M&E) of an EBS system can help ensure consistent system performance. However, a limited number of EBS M&E tools have previously been available in the published literature [7,8,9,10,11,12,13].
Monitoring in the context of surveillance refers to routine and ongoing tracking of (a) implementation of planned surveillance activities, (b) data gathered by the surveillance system, and (c) overall performance of the system. Evaluation of a surveillance system is periodic assessment of the relevance, effectiveness, and impact of activities of surveillance, and focuses on whether the system meets its objectives and makes effective use of resources [5, 6, 14, 15]. Routine monitoring of system performance and periodic wider system evaluations should be conducted as part of any surveillance system [5, 14]. When correctly implemented, M&E can promote the best use of data collection resources, ensure that the surveillance system meets its intended objectives, provide signs of potential system deviations, and identify opportunities for performance improvements [5, 6, 14, 15].
As part of GHSA program implementation, Vietnam’s MoH launched a two-phase project in September 2016 to implement EBS in six provinces in collaboration with the US Centers for Disease Control and Prevention (CDC) and the non-governmental organization PATH [16, 17]. A package of M&E tools was developed and deployed for this EBS project. Data on the process of implementation, the deployment of the M&E tools, and the results of implementation showed that EBS resulted in early detection and reporting of outbreaks, improved collaboration between healthcare facilities and the preventive sectors of the ministry, and increased community participation in surveillance and reporting. In addition, the pilot demonstrated the value of supportive supervision and evaluation [16, 17]. This manuscript describes the development and content of the M&E tools used in the EBS project, and how the M&E strategy was integrated within EBS implementation.
Implementation of the EBS pilot project
Vietnam has 4 administrative health regions (north, south, central coast, and central highlands); each has a regional public health institute (RI) that is responsible for the overall technical direction and supervision of surveillance and response to diseases and outbreaks in that region. Within each region, provincial preventive medicine centers (PPMCs) lead surveillance and response activities within their respective provinces. Within a province, the district health centers (DHCs) coordinate public health activities in each of the districts. Districts are divided into communes, and each commune has a commune health station (CHS). The CHS is the primary healthcare unit in Vietnam and is usually staffed by a physician, a nurse, and a midwife. Within each commune, village health workers (VHWs; rural areas) and health collaborators (HCs; urban areas) constitute community networks and support the CHSs in different health-promotion activities. VHWs and HCs were trained through the current project to function as key informants, collecting information from the community and reporting as needed to the CHS. For the pilot project, the CHSs sensitized VHWs and HCs to detect and report signals. Besides VHWs and HCs, additional community members such as community leaders, school teachers, pharmacy workers, and veterinarians were also invited to participate in the EBS pilot and received training to recognize and report signals.
The MoH launched EBS implementation in 2 phases. Phase 1 started in September 2016 in 4 provinces (Quang Ninh and Nam Dinh in the north region, and An Giang and Ba Ria Vung Tau in the south region). Phase 2 started in August 2017 with addition of two more provinces (Dak Nong in the central highlands region and Binh Thuan in the central coast region).
To support the implementation of the EBS pilot, the MoH’s General Department of Preventive Medicine (GDPM) formed an EBS Technical Working Group (TWG) with experts from the MoH (including the 4 RIs), CDC, PATH, WHO, and technical staff from the PPMCs of the participating provinces. The TWG drafted a list of signals to be used at community and health facility levels (Table 1). The list was based on prioritization criteria that included diseases with a great impact on public health and significant epidemic potential, emerging or reemerging diseases, and diseases scheduled for eradication or elimination. The list of signals provided guidance for signal detection at the community level and in health facilities. The TWG also drafted interim technical guidelines, standard operating procedures, and M&E tools that were used to launch the project.
Thirty-three master trainers from the national, regional, and provincial levels were trained on EBS. All public health system staff in the participating provinces were then trained through the cascade training method, including VHWs/HCs, CHS staff, DHC staff, and hospital healthcare workers. Implementation of the project began when the trainings were completed. EBS implementation in each province was the responsibility of the surveillance staff at each level, and an EBS focal point was designated at each level to coordinate project activities.
Signals detected or received by VHWs/HCs were reported immediately to the EBS focal point at CHSs, by phone calls or in person. Upon notification of signals to the CHS, the corresponding EBS focal point triaged the signals to screen out data/information that were not relevant for early detection purposes. Once a signal was triaged, it was reported to the DHC by phone/email and the district EBS focal point verified whether the signal represented a real public health threat. The purpose of signal verification was to determine whether the reported event had truly occurred and to collect information on the characteristics of the event. Once a signal was verified, it was referred to as an event. Signals detected by healthcare workers at health facilities were reported immediately to the corresponding PPMC or DHC’s EBS focal point, who conducted triage and verification. All events were reported to the PPMC, which assigned the level of risk and impact of each event through a risk assessment. The risk assessment determined the type of response to be initiated by the public health system [16, 17] (Fig. 1). Logbooks and monthly summary report forms were used to record signal and event data at all levels.
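The reporting chain above (detection, triage, verification, risk assessment, response) can be sketched as a simple pipeline. This is an illustrative sketch only; the predicate functions are hypothetical stand-ins for judgments made by EBS focal points in the field, not part of the pilot's actual tooling.

```python
def process_signal(signal, is_relevant, is_confirmed, assess_risk):
    """Walk one reported signal through the EBS steps.

    is_relevant / is_confirmed / assess_risk are caller-supplied
    stand-ins for the triage (CHS), verification (DHC), and
    risk-assessment (PPMC) decisions made by EBS focal points.
    """
    if not is_relevant(signal):       # triage at the commune health station
        return "discarded at triage"
    if not is_confirmed(signal):      # verification at the district health center
        return "not verified"
    risk = assess_risk(signal)        # risk assessment at the provincial level
    return f"event confirmed, risk={risk}, response initiated"

# Example: a signal that passes triage and verification becomes an event.
result = process_signal(
    {"type": "cluster of fever cases"},
    is_relevant=lambda s: True,
    is_confirmed=lambda s: True,
    assess_risk=lambda s: "moderate",
)
print(result)
```

The key property mirrored here is that only verified signals become events, and only events trigger a risk assessment and response.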
Design and development of M&E tools
The TWG developed a strategic EBS framework (logic model) for the country based on the EBS implementation pilot project’s goals and objectives (Fig. 2). This conceptual map identified main components of the EBS implementation pilot and demonstrated how they relate to one another. The EBS logic model included ongoing monitoring and a detailed assessment of the entire system 5–9 months after EBS implementation using tools developed for those purposes.
Input, output, and outcome indicators were developed to reflect the goals of the EBS framework (Table 2). Input indicators referred to the resources needed to implement EBS, while output indicators were measures of the immediate results of EBS-related activities. Outcome indicators measured the quality of the surveillance system and the extent to which surveillance objectives were achieved.
Using the monitoring tools
As a basic tool for data monitoring, each administrative unit within a province completed a monthly summary report of signals and events and sent the report via email to EBS focal points at the next level. Each of the regional-level EBS focal points compiled the data from the provinces and shared them with the national-level EBS focal points. Personnel at the regional level were responsible for reviewing these data and liaising with the provincial level EBS focal points, who in turn coordinated with the EBS focal points at the district and commune levels as needed. In addition, a supervisory checklist was developed for each administrative level (provincial, district, and commune) to be used during routine monthly or every-other-month supportive monitoring visits during the pilot implementation phase. EBS focal points involved in monitoring were trained before monitoring tool implementation.
Supportive supervisory visits were conducted by EBS focal points at each administrative level, with staff from higher levels visiting staff at lower levels. The visits were intended to identify and correct problems during implementation and for provision of technical assistance, mentoring, and hands-on refresher training as needed. The supervisory checklist was structured to collect data from direct observations, documents such as log books, and interviews, and allowed supervisors to evaluate implementation fidelity as well as understand personal perceptions and opinions of the implementers regarding the program.
The checklist covered five main areas of implementation: EBS staffing, training, availability of resources needed to implement EBS (equipment, guidelines, forms, etc.), whether or not monitoring visits were being made to lower administrative levels, and problems with filling in records and forms (Table 3).
Using the evaluation tools
An evaluation team consisting of stakeholders from GDPM, the RIs, CDC, and PATH was formed; the team designed five data collection tools for assessment of EBS 5–9 months after implementation to document the products of EBS activities, perceptions of implementers, and fidelity of implementation. The tools included a desk review tool, a key informant interview guide, a focus group discussion guide, a timeliness form, and an online acceptability survey (applied in the 4 Phase 1 provinces only). The main process and outcome components were identified and included inputs, activities, outputs, and short-term, intermediate-term, and long-term outcomes. Thirty-one indicators were developed for M&E purposes based on the EBS framework (Table 2). Field visits to all pilot provinces and selected districts/communes were scheduled (2 districts per province and 2 communes per selected district). In addition, target populations were identified for each evaluation tool at each administrative level (Table 3). Evaluation team members were trained in how to use the evaluation tools before their implementation.
The desk review tool collected data on the number of EBS trainings and trainees; the number and percentage of districts, communes, and hospitals implementing EBS; the number of communication and registration materials (posters, brochures/flyers, and notebooks) delivered to local levels; the number of signals and events reported from lower to upper levels each month; the completeness of logbooks; and the monthly allowance/incentives for implementers in each province (Table 4). The desk review tool was sent to participating provinces 2 weeks before the field evaluation activities, and EBS staff reported that this time was sufficient to review primary sources of information (such as the commune-level logbooks of EBS signals) and compile accurate data.
The key informant interview tool consisted of semi-structured in-depth interviews with EBS focal points to obtain qualitative information on fidelity of EBS implementation, timeliness of reporting and response to events, perceived value and acceptance of EBS, and lessons learned that were applicable for future roll-out. A shortened version of the tool for interviews with hospital EBS focal points was developed and focused on fidelity of EBS implementation, perceived value and acceptance of EBS, and lessons learned (Table 4).
The focus group discussion tool collected qualitative data about EBS implementation at the district and commune levels. Different guides were developed for three specific target groups: (a) EBS focal points at the district level (for both clinical and preventive medicine sectors), (b) VHWs and HCs, and (c) community members serving as key informants, such as schoolteachers and pharmacy workers or members of social unions/associations. The tool collected data on fidelity to EBS interim guidelines implementation, timeliness of reporting and response to events, time spent working in EBS, perceived value and acceptance of EBS, reporting of cases of infectious diseases to the electronic communicable disease surveillance system, and lessons for future roll-out. The questions in the focus group discussion also sought to increase understanding of how well community members understood the signals, how the community health workers detected signals in the community, which strategies engaged community members to participate in EBS, and the perceived value and acceptance of EBS (Table 4).
The timeliness form collected data including the type of event, date and time of signal onset (i.e., when the signal appeared), time of signal registration (i.e., when the signal was first registered in the corresponding logbooks), time of event confirmation (i.e., when the event was registered at the district level), time of response (i.e., when a response for the event was initiated), and the type of response implemented. The timeliness form was sent electronically to the EBS focal point at each DHC in the pilot provinces, who collected data and returned the completed form to the evaluation team.
The online acceptability survey included questions on demography; personal beliefs, values, and attitudes toward EBS; possible barriers to participation in EBS; identity and roles of active informants in the community; facilitating factors for EBS implementation; and specific forms of government support. Most survey questions were based on a Likert scale model. Additional open-ended questions asked respondents to provide recommendations to improve EBS (Table 4). From June to July 2017, the survey was open to all VHWs/HCs, CHSs, DHCs, and PPMCs of Phase 1 participating provinces. The survey was not open for the Phase 2 pilot provinces.
The evaluation tools were developed using Microsoft Word 10.0, and hard copies were printed for site visits. The acceptability survey was uploaded to the ONA platform (Ona Systems, Inc.) for online data entry. Quantitative and qualitative data were stored in Microsoft Excel 2010.
The checklist facilitated the identification of deviations from the intended implementation process and the prompt provision of practical recommendations to solve issues on the spot. Clara et al. showed how the event-to-signal ratio, calculated as events detected per month divided by signals detected per month from September 2016 to December 2017 in the 6 pilot provinces, increased at specific points that roughly corresponded to interactions that included supportive M&E visits. As Vietnam is in the process of scaling up EBS nationwide, and with EBS becoming an integral part of the routine surveillance system, EBS supervisory visits will be integrated into existing routine supervisory visits for other public health purposes (e.g., supervision of nutrition programs, family planning, or general surveillance programs).
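The event-to-signal ratio described above is straightforward to compute from monthly tallies. A minimal sketch, using hypothetical counts rather than the pilot's actual data:

```python
# Hypothetical monthly tallies of signals reported and events verified
# (illustrative numbers only, not the pilot's actual data).
monthly = {
    "2016-09": {"signals": 210, "events": 12},
    "2016-10": {"signals": 305, "events": 21},
    "2016-11": {"signals": 290, "events": 26},
}

# Event-to-signal ratio: verified events per month / signals reported per month.
ratios = {
    month: counts["events"] / counts["signals"]
    for month, counts in monthly.items()
}

for month, ratio in sorted(ratios.items()):
    print(f"{month}: event-to-signal ratio = {ratio:.2f}")
```

A rising ratio over time can indicate that informants are learning to report more specific, actionable signals, which is how it was interpreted in the pilot.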
Desk review showed that a total of 8661 VHWs/HCs, 1379 CHS staff, 185 DHC staff, and 75 hospital healthcare workers were trained on EBS. Pilot provinces reported 4854 signals and 370 (8%) events from September 2016 to December 2017 [16, 17]. Information on type of event was available for 253 events and included a variety of vaccine-preventable diseases (e.g., chickenpox and mumps), zoonoses such as avian influenza in poultry, vector-borne diseases (e.g., dengue and malaria), foodborne disease outbreaks, and other non-infectious conditions (e.g., toxic-related illness and complications after vaccination) (Fig. 3). A response was implemented for 355 events (96%).
In total, 51 key informant interviews and 46 focus group discussions were conducted during field visits. The most common challenges that interviewees mentioned regarding signal detection were: a) the need for refresher trainings for VHWs/HCs, b) VHWs’/HCs’ inability to cover large geographic areas, and c) limited implementation in urban areas. Suggestions included continuing supervision and refresher trainings to identify and resolve similar problems, expanding the network of information sources in the community to improve detection, and refining some signals in the community and hospitals to make them easier to understand. All interviewees stated that they did not encounter major challenges when reporting signals/events to upper levels.
Using both evaluation tools (focus group discussions and key informant interviews) with different target groups enabled the collection of qualitative data, including personal beliefs, values, and attitudes toward EBS. The tools were effective in documenting the perceived and observed accuracy with which the five EBS steps (detection, triage, verification, risk assessment, and response) were carried out. Additionally, the interviews provided an opportunity for the EBS focal points at the province and district levels to share their views on the successes and challenges of EBS. Interviewees presented many case studies that demonstrated the importance of EBS to their work on surveillance of and response to infectious diseases and outbreaks.
Information on timeliness of reporting and response was available for 210 (57%) of 370 events. Timeliness data showed that the median times from detection to notification and from detection to response were within 24 h and 48 h, respectively. Avian influenza in poultry events and foodborne disease outbreaks showed the shortest median times from detection to notification and response (1 h or less) [16, 17] (Fig. 3).
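The timeliness metrics above reduce to median intervals between the timestamps captured on the timeliness form. A minimal sketch with illustrative records (the field names and dates below are hypothetical, not the pilot's data):

```python
from datetime import datetime
from statistics import median

# Hypothetical timeliness records mirroring the form's timestamp fields
# (dates and times are illustrative only).
events = [
    {"detected": "2017-03-01 08:00", "notified": "2017-03-01 10:30",
     "responded": "2017-03-02 09:00"},
    {"detected": "2017-03-05 14:00", "notified": "2017-03-05 15:00",
     "responded": "2017-03-06 08:00"},
    {"detected": "2017-03-10 09:00", "notified": "2017-03-11 09:00",
     "responded": "2017-03-12 09:00"},
]

def hours_between(start, end):
    """Elapsed hours between two 'YYYY-MM-DD HH:MM' timestamps."""
    fmt = "%Y-%m-%d %H:%M"
    delta = datetime.strptime(end, fmt) - datetime.strptime(start, fmt)
    return delta.total_seconds() / 3600

notify_h = [hours_between(e["detected"], e["notified"]) for e in events]
respond_h = [hours_between(e["detected"], e["responded"]) for e in events]

print(f"median detection-to-notification: {median(notify_h):.1f} h")
print(f"median detection-to-response: {median(respond_h):.1f} h")
```

As the next paragraph notes, in practice the hard part was not the arithmetic but assembling accurate timestamps from paper logbooks held at different levels of the system.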
Although the timeliness form was simple, with only six requested variables to collect, its completion presented challenges for two main reasons: (1) it required compiling data that had to be acquired from logbooks in different physical locations and different levels of the health system and (2) some data elements required retrospective data collection, such as the date of signal occurrence, date of confirmation of events, etc. An electronic reporting system would have greatly facilitated collection of these data. At the time, the paper records lacked accurate date and time stamps.
The online acceptability survey was a cost-effective way to reach a large number of implementers at all levels in a relatively short period. A total of 1633/7167 VHWs (22.8%), 428/653 CHS EBS focal points (65.5%), 39/43 DHC EBS focal points (91%), and all PPMC EBS focal points from the 4 Phase 1 provinces completed the online acceptability survey. Percentages of respondents from all administrative levels who agreed/strongly agreed that EBS is very important in the detection of events, helps detect events earlier, and should be continued ranged from 80 to 88%. Based on survey results, 86% of VHWs/HCs and CHS EBS focal points, and 90% of DHC focal points, reported being willing to continue taking part in EBS.
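The response rates above follow from simple arithmetic on the counts reported:

```python
# Respondent counts and denominators as reported for the Phase 1
# acceptability survey (counts taken from the text above).
groups = {
    "VHWs/HCs": (1633, 7167),
    "CHS EBS focal points": (428, 653),
    "DHC EBS focal points": (39, 43),   # 90.7%, reported as 91% in the text
}

rates = {name: 100 * responded / total
         for name, (responded, total) in groups.items()}

for name, rate in rates.items():
    responded, total = groups[name]
    print(f"{name}: {responded}/{total} = {rate:.1f}%")
```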
The use of the Likert scale simplified completion of the survey: answers were easily quantifiable and allowed participants to express degrees of agreement. However, the Likert format may have resulted in responses being influenced by previous questions or biased toward a neutral response, avoiding the extremes [19, 20]. In addition, because it was a self-administered survey, participants may have interpreted the questions differently or had difficulty understanding how to respond (e.g., how to rank potential barriers to EBS implementation). Finally, VHWs/HCs who wanted to participate in the survey had to go to CHSs with an internet connection to access it online. This may have been a barrier for those interested in participating, and may therefore have contributed to lower response rates among VHWs/HCs compared with the consistently higher response rates at higher levels of the health system (i.e., the provincial, district, and CHS levels).
The lack of reference models for quantitative and qualitative measurement of EBS prompted the development of new tools for the pilot project (Additional file 1). Using a variety of tools capable of extracting and triangulating both quantitative and qualitative data improved understanding of the nuanced characteristics of the system. The quantitative data provided evidence on the number and type of events being reported, the timeliness of the system, and the event-to-signal ratio. However, it was not possible to estimate the impact of EBS because of the lack of baseline data from before EBS implementation. The qualitative and subjective data helped to increase understanding of the system’s field utility and acceptance by field staff, associated human resource costs, reasons for non-compliance with established guidelines, and other factors influencing implementation that are important but often difficult to quantify.
Although short-term, intermediate-term, and long-term outcomes (Table 2) were defined early and provided a framework for implementation, some outcomes were difficult to measure. For example, the outcome “increased trust among the community” was challenging to assess, although the focus group discussions with VHWs and community members provided some understanding of their willingness to participate (or continue participating) in EBS. Data to estimate the long-term outcome “reduction of mortality associated with infectious diseases” were not available during the evaluation. It may be important to revise the outcome indicators and consider developing more feasible impact indicators for future studies.
Given the large number and diversity of actors and administrative levels involved in the evaluation, data collection with these tools required a complex, intensive schedule of activities that consumed considerable time and resources. The collation and subsequent analysis of data from the focus group discussions and key informant interviews were especially challenging because of time-consuming activities such as recording, transcription, and translation of the interviews, and the proper interpretation of the findings.
Based on the information collected through the M&E process, the evaluation team recommended that Vietnam’s MoH improve EBS data quality by: a) encouraging regular use of the verification form for all events; b) simplifying the monthly summary report form; c) ensuring each district records all verified events, including basic information such as type of event and date/time of onset, detection, notification, and response; d) developing an electronic data management system for EBS reporting; and e) conducting refresher trainings on how to register and document signals and events properly. In addition, the TWG revised the EBS implementation guidelines and training materials before nationwide scale-up. The results of the evaluation and lessons learned were shared with MoH decision makers, and in March 2018, the MoH issued a mandate to incorporate the EBS program into Vietnam’s national surveillance platform.
The tools developed for the project have been customized and deployed in other countries, including India, Cameroon, Ghana, and Kenya. All the tools are available in the additional files, and countries implementing EBS are being engaged to review them. Experience gained from implementing this project guided the drafting of the Africa CDC framework for EBS. It is hoped that these resources, if used appropriately, will help accelerate the effectiveness of EBS programs globally.
The use of the M&E tools designed and developed for the EBS pilot project in Vietnam provided data on signals and events reported, timeliness of reporting and response, perceptions and opinions of implementers, and fidelity of EBS implementation. These data were valuable for Vietnam’s MoH to understand the functioning of the EBS program and the successes and challenges of implementing this project in Vietnam. The EBS framework, indicators, and tools developed for Vietnam can easily be customized for use in any other country implementing EBS.
Availability of data and materials
The authors confirm that all relevant data are included in the article and/or its supplementary information file.
Abbreviations
CDC: US Centers for Disease Control and Prevention
DHC: District health center
CHS: Commune health station
EWAR: Early warning and response
GDPM: General Department of Preventive Medicine
GHSA: Global Health Security Agenda
WHO: World Health Organization
M&E: Monitoring and evaluation
MoH: Ministry of Health
PPMC: Provincial preventive medicine center
TWG: EBS Technical Working Group
VHW: Village health worker
World Health Organization. International health regulations 2005. 3rd ed. Geneva: World Health Organization; 2016.
Global Health Security Agenda. Global Health Security Agenda (GHSA) 2024 framework. 2018. Retrieved from https://ghsa2024.files.wordpress.com/2019/11/ghsa-2024-framework.pdf.
Balajee SA, Ray A, Mounts AW. Global health security: capacities for early event detection, epidemiologic workforce, and laboratory response. Health Secur. 2016;14(6):424–32.
World Health Organization. Technical consultation on event-based surveillance: meeting report. Lyon 19–21 March, 2013. Geneva: World Health Organization; 2013. [cited 2018 Nov 21]. Available from: http://www.episouthnetwork.org/sites/default/files/meeting_report_ebs_march_2013_final.pdf.
World Health Organization. Early detection, assessment and response to acute public health events: implementation of early warning and response with a focus on event-based surveillance. Interim version. Geneva: World Health Organization; 2014.
World Health Organization Regional Office for the Western Pacific. A guide to establishing event-based surveillance. Geneva: World Health Organization; 2008.
Dagina R, Murhekar M, Rosewell A, Pavlin BI. Event-based surveillance in Papua New Guinea: strengthening an international health regulations (2005) core capacity. Western Pac Surveill Response J. 2013;4(3):19–25.
Stone E, Miller L, Jasperse J, Privette G, Diez Beltran JC, et al. Community event-based surveillance for ebola virus disease in Sierra Leone: implementation of a national-level system during a crisis. PLoS Curr. 2016;8. https://doi.org/10.1371/currents.outbreaks.d119c71125b5cce312b9700d744c56d8.
Ratnayake R, Crowe SJ, Jasperse J, Privette G, Stone E, Miller L. Assessment of community event–based surveillance for Ebola virus disease, Sierra Leone, 2015. Emerg Infect Dis. 2016;22(8):1431–7.
Toyama Y, Ota M, Beyene BB. Event-based surveillance in North-Western Ethiopia: experience and lessons learnt in the field. Western Pac Surveill Response J. 2015;6(3):22–7.
Severi E, Kitching A, Crook PD. Evaluation of the health protection event-based surveillance for the London 2012 Olympic and Paralympic Games. Euro Surveill. 2014;19(24):20832.
Ebola Response Consortium; UKaid. Evaluation of the functionality and effectiveness of community-based event surveillance (CEBS) in Sierra Leone. London: United Kingdom Department for International Development; 2015. [cited 2018 Nov 21]. Available from https://reliefweb.int/sites/reliefweb.int/files/resources/ERC%20CEBS%20Evaluation%20Report.pdf.
Crowe S, Hertz D, Maenner M, Ratnayake R, Baker P, Lash RR, et al. A plan for community event-based surveillance to reduce Ebola transmission—Sierra Leone, 2014–2015. MMWR Morb Mortal Wkly Rep. 2015;64(3):70–3.
Groseclose S, Buckeridge D. Public health surveillance systems: recent advances in their use and evaluation. Annu Rev Public Health. 2017;38(1):57–79.
World Health Organization. Communicable disease surveillance and response systems. Guide to monitoring and evaluating. Geneva: World Health Organization; 2006.
Clara A, Do TT, Dao ATP, Tran PD, Dang TQ, Tran QD, Ngu ND, Ngo TH, Phan HC, Nguyen TTP, Lai AT, Nguyen DT, Nguyen MK, Nguyen HTM, Becknell S, Bernadotte C, Nguyen HT, Nguyen QC, Mounts AW, Balajee SA. Event-based surveillance at community and healthcare facilities, Vietnam, 2016–2017. Emerg Infect Dis. 2018;24(9):1649–58.
Clara A, Dao ATP, Do TT, Tran PD, Tran QD, Ngu ND, Ngo TH, Phan HC, Nguyen TTP, Bernadotte-Schmidt C, Nguyen HT, Alroy KA, Balajee SA, Mounts AW. Factors influencing community event-based surveillance: lessons learned from pilot implementation in Vietnam. Health Secur. 2018;16(Supp. 1):S66–75.
Hoa NT, Tam NM, Derese A, et al. Patient experiences of primary care quality amongst different types of health care facilities in Central Vietnam. BMC Health Serv Res. 2019;19:275. https://doi.org/10.1186/s12913-019-4089-y.
Huang HY. Mixture random-effect IRT models for controlling extreme response style on rating scales. Front Psychol. 2016;7:1706. https://doi.org/10.3389/fpsyg.2016.01706.
Liu M, Harbaugh AG, Harring JR, Hancock GR. The effect of extreme response and non-extreme response styles on testing measurement invariance. Front Psychol. 2017;8:726. https://doi.org/10.3389/fpsyg.2017.00726.
The Africa Centres for Disease Control and Prevention (Africa CDC). Africa CDC EBS framework. 2020. Retrieved from Africa CDC website: http://www.africacdc.org/resources/strategic-framework/Strategic%20Framework/Africa%20CDC%20EBS%20Framework%20-%20EN.pdf/detail.
We thank the surveillance staff from GDPM, the National Institute of Hygiene and Epidemiology, and the Pasteur Institute in Ho Chi Minh City, as well as local agencies and organizations of the 6 pilot provinces, for their collaboration and leadership.
The findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention.
Funding for this EBS project was provided by the Global Health Security Agenda through CDC cooperative agreements with GDPM (GH001249), National Institute of Hygiene and Epidemiology (GH001989 and GH000116), Pasteur Institute in Ho Chi Minh City (GH001992 and GH001628), and PATH (GH001812).
Ethics approval and consent to participate
Consent for publication
The authors declare that they have no competing interests.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Clara, A., Dao, A.T.P., Mounts, A.W. et al. Developing monitoring and evaluation tools for event-based surveillance: experience from Vietnam. Global Health 16, 38 (2020). https://doi.org/10.1186/s12992-020-00567-2
- Event-based surveillance
- Monitoring and evaluation tools