Closing date:

ToR of the final evaluation of the DRR project

City/location:
Yangon

This job posting has closed

The Myanmar Red Cross Society (MRCS) Disaster Risk Reduction Capacity Building & Community-Based Disaster Risk Reduction Coastal Area Program (DRR Program) has been implemented since 2009 in 141 communities and 215 schools located in 9 townships spread across 4 different coastal regions/states. The year 2013 is the final year of implementation, dedicated to the completion of field work while capitalizing on and reviewing the tools and methodologies developed throughout the project timeframe.

Indeed, beyond community and school engagement, the program has contributed significantly to MRCS capacity-building in the field of DRR, at both national and township levels. As a consequence, in collaboration with the International Federation of Red Cross and Red Crescent Societies (IFRC) and the American Red Cross (AmCross), an MRCS CBDRR Framework is currently under development, to which the project is contributing financially and technically.

In this context of lesson learning and standardization, the project partners (including MRCS, the French Red Cross and the Canadian Red Cross) intend to carry out an external final evaluation of the project in order to review past interventions and contribute to the improvement of future DRR programming in the country.

Purposes

The overall objective of the final evaluation is to review progress towards the project’s objectives and results, assess the efficiency and cost-effectiveness of implementation, identify strengths and weaknesses in project design and implementation, and provide recommendations on design modifications and specific actions that would increase the effectiveness and impact of future similar initiatives.

It is expected that this evaluation will be a learning process for the MRCS as the leading implementing actor of the project, and for FRC and CRC as the financial and technical support providers, identifying key operational and strategic recommendations for the improvement of DRR policies and procedures. A participatory process will therefore be required to ensure that all project components benefit from the evaluation activities.

Users and Intended Use:
- MRCS Governance: To benefit from strategic recommendations on DRR and branch development that will inform future DRR/OD policies of the Society; the MRCS strategic plan will also benefit.

- HOD, Deputy HODs and Project Coordinators of the DM Division: To identify the remaining priority unmet needs of MRCS and of the communities covered by this program, which will guide the design of upcoming DRR programs funded by other donors; and to identify key good practices and lessons learnt from the past 5 years of engagement with branches, communities and schools.

- FRC/CRC (FRC Country Delegation): To understand the strengths and weaknesses of the existing beneficiary engagement and community mobilization strategies, based on the perspectives of different stakeholders.

- FRC/CRC (Geographic Desks, DRR Advisors, PMER Advisors, Movement Relations Bureau): To have evidence to improve the existing beneficiary engagement and community mobilization strategies and to develop new strategies for other community-based programs; and to provide inputs into potential future DRR programming, both in Myanmar and in the region.

- Other (IFRC Country Office and other interested Movement Partners): To learn about the DRR project, its strengths and weaknesses, and benefit from the experience in developing new projects.

Background of the program/project

The Myanmar Red Cross Society Disaster Risk Reduction Capacity Building & Community-Based Disaster Risk Reduction Coastal Area Program (DRR Program) commenced in January 2009 with the overall objective of contributing to increasing the resilience of at-risk populations facing natural disasters in the coastal areas of Myanmar. The specific objective of the program was to increase the capacity of the Myanmar Red Cross Society (MRCS), communities and schools in order to improve community-based disaster risk reduction activities in Townships located in high-risk areas of the coastal regions (Ayeyarwaddy, Yangon, Bago and Rakhine). MRCS is responsible for implementing the program, while the French Red Cross (FRC) and the Canadian Red Cross Society (CRC) provide financial and technical support for the implementation.

MRCS capacity building, community resilience and school disaster risk reduction are the three key components of this program. The fundamental strategy of this program is to build the DRR capacities of MRCS Headquarters and Township Red Cross (TRC) branches and to mobilize them to enhance community resilience and school disaster risk reduction. In line with this, the program has set four expected results for this intervention:

  1. Capacities and know-how of MRCS DRR staff and volunteers (Headquarters, Region and Township branches) are improved in order to strengthen assistance to vulnerable communities.
  2. The capacity (knowledge, awareness and organization) of targeted at-risk communities to better cope with natural hazards is improved.
  3. Preparedness of targeted schools in at-risk areas is strengthened through planning, knowledge and awareness mechanisms.
  4. Information and communication related to DRR activities are widely disseminated at the national level to strengthen advocacy for DRR mechanisms among Myanmar stakeholders.


The program has been implemented in 9 Townships, 141 villages and 215 schools across 4 coastal regions/states (formerly called divisions) of Myanmar. After almost 3 years of project implementation, the implementing partners commissioned a joint team to undertake a mid-term review (MTR) in 2011. The MTR report encompassed the data and information collected during the review and analyzed all internal and external factors in order to guide the next steps for the program.

One of the strong concerns raised by this study was the sustainability of the DRR unit and of the program activities at all levels. In order to sustain the existing structures, activities and approaches, the study suggested that it was necessary to further build the capacity of MRCS HQ and Townships in terms of human, organizational and financial resources and technical backstopping, rather than expanding the program areas. As a result, the target regions/states were reduced from 6 to 4 and new Township Reinforcement Activities (TRA) were implemented in previous townships. At the national level, the process towards a CBDRR Common Framework was initiated in March 2013 and should be completed by the end of the year.

Evaluation scope

General scope of this evaluation (programmatic and geographic):

The evaluation will focus on the first three expected results of the program.

It is planned that the evaluation team will visit three regions: Ayeyarwaddy, Bago and Yangon. In each region, one township will be chosen, and communities and schools will be visited together with the township branch representatives. Primary data will be collected from at least 17 villages and 24 schools in the 3 selected Townships.

Scope of evaluation criteria:

The following criteria will be reviewed as a priority:

Relevance and appropriateness - The extent to which the objectives of an operation are consistent with beneficiaries’ needs, country needs, organisational priorities, and partners’ and donors’ policies.
Effectiveness – The extent to which the program/project’s outcomes (immediate and intermediate outcomes) were achieved, or are expected to be achieved, taking into account their relative importance.
Sustainability – The continuation of benefits from communities, schools and townships after the programme has been completed. Sustainability is concerned with measuring whether an activity or an impact is likely to continue after donor funding has been withdrawn.
Impact – Positive and negative, intended or unintended long-term results produced by the project. Although it is too early to assess impact, the evaluation will propose preliminary conclusions on this criterion.

Cross-Cutting Themes to be considered in the Evaluation:

Gender Equity: The extent to which gender equity issues are integrated throughout the project cycle management in line with the gender policy or strategy of MRCS and IFRC.
Advocacy: The extent to which the good practices of the project are used by MRCS to influence change at local/national levels, especially with regard to DRR and the involvement of government counterparts.
Capacity Building: The extent to which capacity building of the MRCS and targeted communities has been carried out, and its positive and negative effects.

Evaluation criteria and key evaluation questions

Relevance and appropriateness:

  • What are priority unmet operational and organizational needs of MRCS that remain relevant for the second phase of the DRR program?
  • What are priority unmet needs of the targeted communities and the schools that remain relevant for the second phase of the DRR program?
  • What are the remaining gaps in integrating issues of power relations (including gender-based) and socio-cultural practices into the community mobilization process?
  • To what extent have promising practices and deficiencies of the project contributed to the MRCS CBDRR framework?

Effectiveness:

  • To what extent were the three considered project expected results (skills, knowledge and behaviors/practices, systems) achieved?
  • For the major outcomes (e.g., significant changes in practices by MRCS, communities or schools), which key stakeholders (e.g., MRCS, the communities themselves, others) contributed to these changes, and how (i.e., which stakeholder characteristics, activities, processes, products or services influenced the significant changes)? Would the contributing factors to these changes be replicable in other contexts?
  • Were there any foreseen or unforeseen negative or positive side-effects?
  • What are persistent challenges in the project that could impact the second phase of the program?

Sustainability:

For MRCS:

  • Which activities that require financial resources to maintain outcomes (e.g., skills, awareness and practices), and their costs, have been covered by the MRCS (township level only?) at the time of the evaluation?
  • What is the likelihood that the positive changes achieved (e.g., skills, awareness and practices) at the MRCS (township only?) will be maintained?
  • To what extent has the match between the engagement of the MRCS and its characteristics (e.g., interest, skills, leadership, etc.) affected the continuation of project activities and/or the maintenance of results achieved by the project?
  • What additional interventions will be required to facilitate stronger sustainability at the MRCS level?

For communities and schools:

  • Which activities that require financial resources to maintain outcomes (e.g., skills, awareness and practices), and their costs, have been covered by the communities or the schools at the time of the evaluation?
  • What is the likelihood that the positive changes achieved (e.g., skills, awareness and practices) at the community or school level will be maintained?
  • To what extent has the match between the engagement of the communities or the schools and their characteristics (e.g., interest, collectivism versus individualism, skills, etc.) affected the maintenance of results achieved by the project?
  • What additional interventions will be required to facilitate stronger sustainability at the community level (communities and schools)?

Impact:

  • Can the target populations (communities and schools) be considered better prepared and less at-risk than before the project started?
  • Has the disaster preparedness level of the Township Red Cross and villages improved?

Evaluation methodology and process

The evaluation management team will comprise the Secretary General of the MRCS, the Desk Officer of FRC (based in Paris, France) and the Program Manager of the CRC (based in Ottawa, Canada). The management team will oversee the evaluation and will ensure that it upholds the quality expected in this ToR. It will decide on the final ToR, the selection of the evaluator(s), the evaluation matrix, tools and methodologies, and the final report.

The evaluation team will be composed of at least (1) an international external evaluator and (2) a national external evaluator. The international evaluator will be the team leader and as such will be responsible for the overall process: design, preparation of tools, data collection, coordination with relevant parties, and finalisation of the report (to be revised accordingly). In addition, as required by the team, MRCS will provide volunteers to assist in the collection and entry of data.

The evaluation team will work in collaboration with the following actors:

  • The MRCS DM Division, which will facilitate linkages with the program team and the different divisions of MRCS.
  • The FRC Program Coordinator who will provide all needed documents, data and information linked to the program.
  • The FRC Regional DRR Delegate (based in Bangkok), who will provide highlights on the technical aspects of the DRR Program in line with South East Asia context.
  • The IFRC DM Delegate, who will highlight the DM and DRR part of the MRCS 2011-2015 strategy and ongoing mechanisms.
  • The Regional Red Cross Executive Committee (RCEC) of the Myanmar Red Cross Society, particularly G1 and G2.
  • The Townships’ RCECs of the Myanmar Red Cross Society and Red Cross Volunteers (RCVs).


The lead evaluator will present the findings in-country to key decision-makers and relevant staff of MRCS, FRC, IFRC and AmCross. A workshop (half a day or one day) will be held with all key national stakeholders to present and discuss the preliminary findings of the evaluation. It will also be a good opportunity to collectively formulate recommendations for future improvement of the approach.

MRCS/FRC in country will provide the necessary logistical support for the field visits and the organization of the national workshop. Hence, costs related to these should not be included in the consultant’s budget.

Key milestones of the evaluation:

Phase 1: Desk Review. Review of all program documents (proposal, reports, tools, guidelines, IEC materials, methodologies and orientations), background and contextual components. 2 working days.
Phase 2: Preparation. Preparation of field trips according to the work plan; information gathering at the MRCS/FRC office and interviews with key stakeholders in Yangon. 5 working days.
Phase 3: Inception report. Based on the desk review and interviews, develop the design, tools, evaluation matrix and work plan of the evaluation. 1 working day.
Phase 4: Field work (3 Township branches, 17 communities and 24 schools), in order to assess the contribution made by the program (‘project’ and ‘non-project’ comparisons could also be made). The evaluation team will be accompanied by MRCS representatives throughout the field visits. 20 working days.
Phase 5: Technical review. Review of initial data and findings with key MRCS, CRC, FRC and IFRC personnel. 4 working days.
Phase 6: Draft preparation and presentation. First draft to be presented during a national workshop as well as technical sessions with the key stakeholders. 5 working days.
Phase 7: Final report. Incorporating feedback from the management team and final editing. 3 working days.
Total: 40 working days

This planning is indicative and may be reviewed and changed by the evaluation team in accordance with circumstances, upon approval by all parties.

The final evaluator or the evaluation team is free to propose an alternative methodology at the proposal stage, as long as the expected objectives are covered in a pertinent and effective manner.

Review of the final report

The review of the final report will follow the process below to ensure stakeholder input while maintaining the integrity and independence of the evaluation report:

  • Inaccuracies. Claims of inaccuracy must be factual and supported with indisputable evidence, and should be corrected in the evaluation report itself.
  • Clarifications. A clarification is additional, explanatory information to what the evaluators provided in the report. It is the evaluators’ decision whether to revise their report according to a clarification; if not, the evaluation management response team can decide whether to include the clarification in their management response.
  • Difference of opinion. A difference of opinion does not pertain to the findings (which are factual), but to the conclusions and/or recommendations. These may be expressed to the evaluators during the evaluation process. It is the evaluators’ decision whether to revise their report according to a difference of opinion; if not, the evaluation management response team can decide whether to include the difference of opinion in their management response.

Evaluation deliverables

Inception Report – The inception report will include the proposed methodologies, data collection and reporting plans with draft data collection tools such as interview guides, a timeframe with firm dates for deliverables, and the travel and logistical arrangements for the team. To be submitted at least one day before starting the field work.

Initial Findings, Recommendations and Follow-up Actions – To be submitted at least two working days before the national workshop.

First draft report – A draft report consolidating findings from the evaluation and identifying key findings, conclusions, recommendations and lessons for the current program and future similar programs will be submitted to the Evaluation Management Team within 2 weeks after the national workshop.

Final report – The final report will contain a short executive summary (no more than 5 pages) and a main body (no more than 60 pages excluding annexes) covering the background of the intervention evaluated, a description of the evaluation methods and limitations, findings (presented by evaluation criteria), conclusions, lessons learned and clear recommendations. The recommendations section should indicate which recommendations the program staff and the reviewer(s) hold in common and where their views differ, based on the workshop held to discuss the findings. The report should also contain appropriate appendices, including a copy of the ToR, cited resources or bibliography, a list of those interviewed and any other relevant materials (e.g., tools). The final report will be submitted 2 weeks after receipt of the consolidated feedback from the Evaluation Management Team.

Evaluation quality and ethical standards

The evaluators should take all reasonable steps to ensure that the evaluation is designed and conducted to respect and protect the rights and welfare of the people and communities involved and to ensure that the evaluation is technically accurate and reliable, is conducted in a transparent and impartial manner, and contributes to organizational learning and accountability.

The evaluation standards are:

  1. Utility: Evaluations must be useful and used.
  2. Feasibility: Evaluations must be realistic, diplomatic, and managed in a sensible, cost-effective manner.
  3. Ethics & Legality: Evaluations must be conducted in an ethical and legal manner, with particular regard for the welfare of those involved in and affected by the evaluation.
  4. Impartiality & Independence: Evaluations should be impartial, providing a comprehensive and unbiased assessment that takes into account the views of all stakeholders.
  5. Transparency: Evaluation activities should reflect an attitude of openness and transparency.
  6. Accuracy: Evaluations should be technically accurate, providing sufficient information about the data collection, analysis, and interpretation methods so that their worth or merit can be determined.
  7. Participation: Stakeholders should be consulted and meaningfully involved in the evaluation process when feasible and appropriate.
  8. Collaboration: Collaboration between key operating partners in the evaluation process improves the legitimacy and utility of the evaluation.


It is also expected that the evaluation will respect the seven Fundamental Principles of the Red Cross and Red Crescent: 1) humanity, 2) impartiality, 3) neutrality, 4) independence, 5) voluntary service, 6) unity, and 7) universality.

MRCS/FRC/CRC are seeking a team of professionals (at least 1 national and 1 international consultant) to undertake this evaluation. It is anticipated that the following skills and experience will be represented in the team:

Requirements for Team Leader (international consultant):

  1. Demonstrated experience in leading at least 3 evaluations of similar scale of DRR related humanitarian projects/programs
  2. Good knowledge of the Red Cross/Red Crescent Movement
  3. Demonstrated experience in designing and implementing DRR projects
  4. Demonstrated experience in institutional capacity-building / organizational development
  5. Demonstrated experience in quantitative and qualitative data collection and analysis
  6. Fluency in English is required; a local translator shall be provided if needed.


Requirements for National Consultant

The key responsibility of the national consultant is to assist the team leader in data collection and analysis. It is anticipated that the following requirements will be met:

  1. Demonstrated experience in conducting mid-term or final evaluations of DRR projects/programs.
  2. Demonstrated experience in designing and implementing DRR and/or capacity-building projects
  3. Demonstrated experience in institutional capacity-building / organizational development
  4. Demonstrated experience in quantitative and qualitative data collection and analysis
  5. Fluency in English and Myanmar languages is required