[JOB OPPORTUNITY] Environmental Pollution Programme Evaluator | Full-time | Africa
Organisation: Global Alliance on Health and Pollution (GAHP)
Location: Uganda
Duration: November 2025 – March 2026
About GAHP:
GAHP is a collaborative body comprising more than 60 members and dozens of observers that advocates for resources and solutions to pollution problems. GAHP was formed because international and national-level actors and agencies recognise that a collaborative, multi-stakeholder, multi-sectoral approach is critical to addressing the global pollution crisis and its resulting health and economic impacts.
GAHP’s overall goal is to reduce deaths and illnesses caused by all forms of toxic pollution, including air, water, soil, and chemical waste, especially in low- and middle-income countries. These efforts will contribute to achieving the Sustainable Development Goals related to pollution, particularly in the health sector (Targets 3.4 and 3.9).
Position overview
GAHP is seeking a Programme Evaluator to assess individual project performance (in terms of relevance, effectiveness and efficiency) and determine outcomes and impacts (actual and potential) stemming from the projects, including their sustainability. The evaluation has two primary purposes: (i) to provide evidence of results to meet accountability requirements, and (ii) to promote operational improvement, learning and knowledge sharing through results and lessons learned among GAHP, DEFRA, NEMA, MWE, MoH, MAAIF, KCCA and other partners. Therefore, the evaluation will identify lessons of operational relevance for follow-up projects or other similar initiatives.
Key Responsibilities and Terms of Reference
Section 1. PROJECT BACKGROUND AND OVERVIEW
Project Description
Pollution is a growing health and environmental challenge in Uganda and remains a critical issue for the government. Uganda’s Third National Development Plan (NDP III, 2020/21–2024/25) underscores the growing threat of pollution to the country’s population and natural resources, and emphasises the need for the design of sustainable interventions tailored to reduce the impacts of pollution.
Currently, pollution poses significant health risks to the citizens of Uganda in both urban and rural areas. Some of the identified pressing pollution challenges that require urgent attention in Uganda include indoor air pollution, traffic-related air pollution, agrochemical use and misuse, improper solid waste management, heavy metal pollution, and contaminated sites.
The Environmental Pollution Programme (EPP) was established to address priority pollution issues in sub-Saharan Africa and Southeast Asia. It strives to deliver efficient and sustainable interventions that improve health outcomes and reduce the environmental impacts of various forms of pollution. The EPP is funded by UK International Development from the UK government through the Department for Environment, Food and Rural Affairs (DEFRA) and is being implemented in Uganda by the Global Alliance on Health and Pollution (GAHP).
The EPP programme in Uganda (EPP-UG) has six components: (i) development of a Health and Pollution Action Plan (HPAP) for Uganda; (ii) review of Uganda’s waste management and agriculture policies, with emphasis on pollution, Gender Equality, Disability, and Social Inclusion (GEDSI), poverty and health; (iii) a civil society-led awareness campaign on pollution and its health impacts; (iv) research on traffic-related air pollution (TRAP) in Kampala; (v) a study on the economic impacts of air and water pollution and the cost of inaction; and (vi) an EPP-UG conference on pressing pollution issues in Uganda and their impacts on health.
Executing Arrangements
The EPP-UG programme is implemented in Uganda by GAHP in collaboration with the National Environment Management Authority (NEMA), the Ministry of Water and Environment (MWE), the Ministry of Health (MoH), the Ministry of Agriculture, Animal Industry, and Fisheries (MAAIF), the Kampala Capital City Authority (KCCA), and other partners. The EPP-UG programme is supervised by the Department for Environment, Food & Rural Affairs of the United Kingdom Government (DEFRA).
Section 2. OBJECTIVE AND SCOPE OF THE EVALUATION
Key Evaluation Principles
Evaluation findings and judgements should be based on sound evidence and analysis, clearly documented in the evaluation report. Information will be triangulated (i.e. verified from different sources) as far as possible, and when verification is not possible, the single source will be mentioned (whilst anonymity is still protected). Analysis leading to evaluative judgements should always be clearly spelt out.
The “Why?” Question. As this is a terminal evaluation, particular attention should be given to learning from the experience. Therefore, the “Why?” question should be at the forefront of the consultants’ minds throughout the evaluation exercise and is supported by the use of a theory of change approach. This means that the consultants need to go beyond assessing “what” the programme performance was and make a serious effort to provide a deeper understanding of “why” the performance was as it was. This should provide the basis for the lessons that can be drawn from the programme.
Baselines and counterfactuals. When attempting to attribute outcomes and impacts to the individual project intervention, evaluators should consider the difference between what has happened with, and what would have happened without, the project. This implies that consideration should be given to the baseline conditions, trends, and counterfactuals in relation to the intended project outcomes and impacts. It also means that there should be plausible evidence to attribute such outcomes and impacts to the project's actions. Sometimes, adequate information on baseline conditions, trends or counterfactuals is lacking. In such cases, this should be clearly highlighted by the evaluators, along with any simplifying assumptions made to enable them to make informed judgements about individual project performance.
Communicating evaluation results. A key aim of the evaluation is to encourage reflection and learning among DEFRA and GAHP staff, as well as key programme partners and stakeholders. The consultant should consider how reflection and learning can be promoted, both through the evaluation process and in the communication of evaluation findings and key lessons. Clear and concise writing is required on all evaluation deliverables. Draft and final versions of the main evaluation report will be shared with key stakeholders.
Objective of the Evaluation
The Terminal Evaluation (TE) is undertaken after the EPP-UG programme to assess individual project performance (in terms of relevance, effectiveness, and efficiency), and to determine outcomes and impacts (actual and potential) stemming from the project, including their sustainability. The evaluation has two primary purposes: (i) to provide evidence of results to meet accountability requirements, and (ii) to promote operational improvement, learning and knowledge sharing through results and lessons learned among GAHP, DEFRA, NEMA, MWE, MoH, MAAIF, KCCA and other partners. Therefore, the evaluation will identify lessons of operational relevance for follow-up projects or other similar initiatives.
Evaluation Criteria
All evaluation criteria will be rated on a six-point scale. Sections A through I below outline the scope of the requirements. A link to a table for recording the ratings is provided in Annexe 1. A weighting table will be supplied in Excel format (link provided in Annexe 1) to support the determination of an overall project rating. The evaluation criteria are grouped into nine categories: (A) Strategic Relevance; (B) Quality of Project Design; (C) Nature of External Context; (D) Effectiveness, which comprises assessments of the achievement of outputs, achievement of outcomes and likelihood of impact; (E) Financial Management; (F) Efficiency; (G) Monitoring and Reporting; (H) Sustainability; and (I) Factors Affecting Project Performance. The evaluation consultants can, with justification, propose other evaluation criteria as deemed appropriate.
A. Strategic Relevance
The evaluation will assess ‘the extent to which the activity is suited to the priorities and policies of the target group, recipient and donor’. Factors affecting this criterion may include stakeholders’ participation and cooperation, responsiveness to human rights and gender equity, and country ownership and drive.
B. Quality of Project Design
The quality of programme design is assessed using an agreed template during the evaluation inception phase: ratings are attributed to the identified criteria, and an overall Programme Design Quality rating is established. This overall Programme Design Quality rating is entered in the final evaluation ratings table as item B. The Main Evaluation Report includes a summary of the project’s strengths and weaknesses at the design stage.
Factors affecting this criterion may include (at the design stage): stakeholders’ participation and cooperation, and responsiveness to human rights and gender equity, including the extent to which relevant actions are adequately budgeted for.
C. Nature of External Context
At the evaluation inception stage, a rating is established for the project’s external operating context (considering the prevalence of conflict, natural disasters and political upheaval). This rating is entered in the final evaluation ratings table as item C. Where a project has been rated as facing either an Unfavourable or Highly Unfavourable and unexpected external operating context, the overall rating for Effectiveness may be increased at the discretion of the Evaluation Consultant and Evaluation Manager together. A justification for such an increase must be given.
D. Effectiveness
The evaluation will assess effectiveness across three dimensions: achievement of outputs, achievement of direct outcomes and likelihood of impact.
Achievement of Outputs
The evaluation will assess the programme’s success in producing the programmed outputs (products and services delivered by the project itself) and achieving milestones as per the project workplan. Any formal modifications/revisions made during project implementation will be considered part of the project design. Where project outputs are inappropriately or inaccurately stated in the work plan, a table should be provided showing the original formulation and the amended version for transparency. The achievement of outputs will be assessed in terms of both quantity and quality, considering their usefulness and the timeliness of their delivery. The evaluation will provide a brief explanation of the reasons behind the project's success or shortcomings in delivering its programmed outputs and meeting expected quality standards.
Factors affecting this criterion may include preparation, readiness, and the quality of project management and supervision.
Achievement of Direct Outcomes
The achievement of direct outcomes is assessed as performance against the direct outcomes set out in the project design, including any formally approved revisions.
Factors affecting this criterion may include: the quality of project management and supervision; stakeholders’ participation and cooperation; responsiveness to human rights, gender equity, and communication; and public awareness.
Likelihood of Impact
The evaluation will assess the likelihood of the intended, positive impacts becoming a reality. The review will also consider the possibility that the intervention may lead, or contribute to, unintended adverse effects. Some of these potential negative effects may have been identified in the project design as risks.
The likelihood of impact could be evaluated through a theory of change reconstructed for evaluation purposes.
Factors affecting this criterion may include: the quality of project management and supervision, including adaptive project management; stakeholder participation and cooperation; responsiveness to human rights and gender equity; country ownership and drive; and communication and public awareness.
E. Financial Management
Financial management will be assessed under three broad themes: completeness of financial information, communication between financial and project management staff, and compliance with relevant UN financial management standards and procedures. The evaluation will establish the actual spend across the life of the project of funds secured from all donors. This expenditure will be reported, where possible, at the output level and will be compared with the approved budget. The evaluation will also assess to what extent any project delays could have been avoided through stronger project management and identify any negative impacts caused by project delays. The review will verify the application of proper financial management standards and adherence to GAHP financial management policies. Any financial management issues that have impacted the timely delivery of the project or its quality of performance will be highlighted.
Factors affecting this criterion may include preparation, readiness, and the quality of project management and supervision.
F. Efficiency
The evaluation will assess the cost-effectiveness and timeliness of project execution. Focusing on the translation of inputs into outputs, cost-effectiveness refers to the extent to which an intervention achieves, or is expected to achieve, its results at the lowest possible cost. Timeliness refers to whether planned activities were delivered within expected timeframes, as well as whether events were sequenced efficiently. The evaluation will also assess to what extent any project extension could have been avoided through stronger project management and identify any negative impacts caused by project delays or extensions. The evaluation will describe any cost- or time-saving measures implemented to maximise results within the secured budget and agreed-upon project timeframe, and consider whether the project was implemented most efficiently compared to alternative interventions or approaches.
The evaluation will give special attention to efforts by the project teams to utilise and build upon pre-existing institutions, agreements, partnerships, data sources, synergies, and complementarities with other initiatives, programs, and projects to increase project efficiency. The evaluation will also consider the extent to which the project management minimised the GAHP environmental footprint.
Factors affecting this criterion may include preparation and readiness (e.g., timeliness); the quality of project management and supervision; and stakeholders’ participation and cooperation.
G. Monitoring and Reporting
The evaluation will assess monitoring and reporting across three sub-categories: monitoring design and budgeting, monitoring implementation and project reporting.
a) Monitoring Design and Budgeting
Each project should be supported by a sound monitoring plan designed to track progress against SMART[1] indicators toward achieving the project’s outputs and direct outcomes, including those disaggregated by gender or groups with low representation. The evaluation will assess the quality of the design of the monitoring plan as well as the funds allocated for its implementation. The adequacy of resources for mid-term and terminal evaluations/reviews should be discussed, if applicable.
b) Monitoring Implementation
The evaluation will assess whether the monitoring system was operational and facilitated the timely tracking of results and progress towards project objectives throughout the project implementation period. It will also consider how information generated by the monitoring system during project implementation was used to adapt and improve project execution, achieve outcomes, and ensure sustainability. The evaluation should confirm that funds allocated for monitoring were used to support this activity.
c) Project Reporting
GAHP prepared monthly updates for DEFRA. This information will be provided to the Evaluation Consultant(s) by the Evaluation Manager.
Factors affecting this criterion may include the quality of project management and supervision, as well as responsiveness to human rights and gender equity (e.g., disaggregated indicators and data).
H. Sustainability
Sustainability is understood as the probability of direct outcomes being maintained and developed after the close of the intervention. The evaluation will identify and assess the key conditions or factors that are likely to undermine or contribute to the persistence of achieved direct outcomes. Some aspects of sustainability may be embedded in the project’s design and implementation approaches, while others may be influenced by contextual circumstances or conditions that evolve over the life of the intervention. Where applicable, an assessment of bio-physical factors that may affect the sustainability of direct outcomes may also be included.
a) Socio-political Sustainability
The evaluation will assess the extent to which social or political factors support the continuation and further development of project direct outcomes. It will consider the level of ownership, interest, and commitment among government and other stakeholders to take the project’s achievements forward. In particular, the evaluation will assess whether individual capacity development efforts are likely to be sustained.
b) Financial Sustainability
Some direct outcomes, once achieved, do not require further financial inputs, e.g. the adoption of a revised policy. However, to derive a benefit from this outcome, further management action may still be needed, e.g. to undertake actions to enforce the policy. Other direct outcomes may depend on a continuous flow of action that needs to be resourced to be maintained, e.g., the continuation of a new resource management approach. The evaluation will assess the extent to which project outcomes are dependent on future funding for the benefits they bring to be sustained. Secured future funding is only relevant to financial sustainability when the direct outcomes of a project are extended into a future project phase. The question remains as to whether the future project outcomes will be financially sustainable.
c) Institutional Sustainability
The evaluation will assess the extent to which the sustainability of project outcomes depends on issues related to institutional frameworks and governance. It will consider whether institutional achievements, such as governance structures and processes, policies, sub-regional agreements, and legal and accountability frameworks, are robust enough to continue delivering the benefits associated with the project outcomes after project closure.
Factors affecting this criterion may include stakeholders’ participation and cooperation, as well as responsiveness to human rights and gender equity (e.g., where interventions are not inclusive, their sustainability may be undermined). Communication, public awareness, and country ownership and driven-ness are also essential considerations.
I. Factors and Processes Affecting Project Performance
(These factors are rated in the ratings table but are discussed as cross-cutting themes as appropriate under the other evaluation criteria, above.)
1. Preparation and Readiness
This criterion focuses on the inception or mobilisation stage of the Programme. The evaluation will assess whether appropriate measures were taken to either address weaknesses in the project design or respond to changes that occurred between project approval, the securing of funds, and project mobilisation. The evaluation will consider the nature and quality of engagement with stakeholder groups by the project team, confirmation of partner capacity, development of partnership agreements, as well as initial staffing and financing arrangements. (Project preparation is covered in the template for the assessment of Project Design Quality).
2. Quality of Project Management and Supervision
In some cases, ‘project management and supervision’ will refer to the supervision and guidance provided by GAHP to implementing partners.
The evaluation will assess the effectiveness of project management in providing leadership towards achieving the planned outcomes, managing team structures, maintaining productive partner relationships (including HPAP Technical Working Groups), communication and collaboration with GAHP colleagues, risk management, problem-solving, project adaptation, and overall project execution. Evidence of adaptive project management should be highlighted.
3. Stakeholder Participation and Cooperation
Here, the term ‘stakeholder’ should be considered in a broad sense, encompassing all project partners, duty bearers with a role in delivering project outputs, target users of project outputs, and any other collaborating agents external to GAHP. The assessment will consider the quality and effectiveness of all forms of communication and consultation with stakeholders throughout the project’s life, as well as the support provided to maximise collaboration and coherence between various stakeholders. This includes sharing plans, pooling resources, and exchanging learning and expertise. The inclusion and participation of all diverse groups, including those based on gender, should be considered.
4. Responsiveness to Human Rights and Gender Equity
The evaluation will ascertain to what extent the project has applied the UN Common Understanding on the human rights-based approach (HRBA) and the UN Declaration on the Rights of Indigenous Peoples. Within this human rights context, the evaluation will assess to what extent the intervention adheres to the GAHP Policy and Strategy for Gender Equality and the Environment.
The report should present the extent to which the intervention, following an adequate gender analysis at the design stage, has implemented the identified actions and/or applied adaptive management to ensure that Gender Equity and Human Rights are adequately taken into account. In particular, the evaluation will consider to what extent project design (section B), the implementation that underpins effectiveness (section D), and monitoring (section G) have taken into consideration: (i) possible gender inequalities in access to and the control over natural resources; (ii) specific vulnerabilities of women and children to environmental degradation or disasters; (iii) the role of women in mitigating or adapting to environmental changes and engaging in environmental protection and rehabilitation.
5. Country Ownership and Driven-ness
The evaluation will assess the quality and degree of engagement of government/public sector agencies in the project. The review will consider the involvement not only of those directly involved in project execution and those participating in technical or leadership groups, but also those official representatives whose cooperation is needed for change to be embedded in their respective institutions and offices. This factor is concerned with the level of ownership generated by the project over outputs and outcomes, which is necessary for long-term impact to be realised. This ownership should adequately represent the needs and interests of all genders and marginalised groups.
6. Communication and Public Awareness
The evaluation will assess the effectiveness of: a) communication of learning and experience sharing between project partners and interested groups arising from the project during its life, and b) public awareness activities that were undertaken during the implementation of the project to influence attitudes or shape behaviour among wider communities and civil society at large. The evaluation should consider whether existing communication channels and networks were used effectively, including meeting the differentiated needs of gender and marginalised groups, and whether any feedback channels were established. Where knowledge-sharing platforms have been established under a project, the evaluation will comment on the sustainability of the communication channel in terms of socio-political, institutional, or financial sustainability, as appropriate.
Section 3. EVALUATION APPROACH, METHODS AND DELIVERABLES
The Terminal Evaluation will be a desk review supplemented by interviews with members of the project team, representatives of selected key stakeholders, and other methods chosen by the evaluation team. Both quantitative and qualitative evaluation methods will be used as appropriate to determine project achievements against the expected outputs, outcomes and impacts.
Evaluation Deliverables and Review Procedures
The evaluation team will prepare[2]:
Draft and Final Evaluation Report: containing an executive summary that can act as a stand-alone document; detailed analysis of the evaluation findings organised by evaluation criteria and supported with evidence; lessons learned and recommendations, and an annotated ratings table.
Review of the draft evaluation report. The evaluation team will submit a draft report to the Evaluation Manager and revise the draft in response to their comments and suggestions. Once a draft of adequate quality has been peer-reviewed and accepted, the Evaluation Manager will share the cleared draft report with the National Coordinator, who will notify the Programme Manager if the report contains any blatant factual errors. The Evaluation Manager will then forward the revised draft report (corrected by the evaluation team where necessary) to other project stakeholders for their review and comments. Stakeholders may provide feedback on any factual errors and highlight the significance of such errors in any conclusions. They may also offer feedback on the proposed recommendations and lessons. Any comments or responses to draft reports will be sent to the Evaluation Manager for consolidation. The Evaluation Manager will provide all comments to the evaluation team for consideration in preparing the final report, along with guidance on areas of contradiction or issues requiring an institutional response.
Based on a careful review of the evidence collated by the evaluation consultants and the internal consistency of the report, the Evaluation Manager will provide an assessment of the ratings in the final evaluation report. Where there are differences of opinion between the evaluator and the Evaluation Manager on project ratings, both viewpoints will be clearly presented in the final report.
The Consultants’ Team
For this evaluation, the evaluation team will consist of a Team Leader and one Supporting Consultant who will work with the Evaluation Manager. The consultants will liaise with the Evaluation Manager on any procedural and methodological matters related to the evaluation. The project team will, where possible, provide support (such as introductions and meetings) to allow the consultants to conduct the review as efficiently and independently as possible.
The Lead Consultant will be hired over the period 1st of November 2025 to 20th of March 2026 and should have: an advanced university degree in environmental sciences, international development or another relevant political or social sciences area; a minimum of 10 years of technical/evaluation experience, including evaluating national/regional programmes; a broad understanding of Persistent Organic Pollutants (POPs), heavy metals, air pollution, and waste management; excellent writing skills in English; and team leadership experience.
The Supporting Technical Consultant will be hired over the period 1st of November 2025 to 20th of March 2026 and should have: a university degree in environmental sciences, international development or another relevant political or social sciences area; a minimum of 10 years of technical/research experience; a good understanding of Persistent Organic Pollutants (POPs), pesticides management, air pollution, waste management, and CSO operations and capacity building; knowledge of the cultural context of Uganda; and sufficient writing skills in English. The Supporting Consultant will provide expert support to the Lead Consultant.
The Team Leader will be responsible for overall management of the evaluation and timely delivery of its outputs. The Supporting Consultant will make substantive and high-quality contributions to the evaluation process and outputs. Both consultants will collaborate to ensure that all evaluation criteria and questions are thoroughly addressed.
Schedule of the evaluation
Contracting procedures: November 1, 2025
Working Group Meetings: November 10 – December 2, 2025
Draft report to National Coordinator and Programme Manager: January 30, 2026
Draft Report shared with the donor: February 5, 2026
Second draft report: February 28, 2026
Draft Report shared with wider group of stakeholders: March 3, 2026
Final Report: March 20, 2026
Contractual Arrangements
Evaluation Consultants will be selected and recruited by GAHP under an individual contract. By signing the service contract with GAHP, the consultant(s) certify that they have not been associated with the design and implementation of the project in any way that may jeopardise their independence and impartiality towards project achievements and the performance of project partners. Additionally, they will not have any future interests (within six months after the contract's completion) with the project’s executing or implementing units. All consultants are required to sign the Code of Conduct Agreement Form.
Fees will be paid in instalments, upon acceptance by the Programme Manager of the expected key deliverables. The schedule of payment is as follows:
Schedule of Payment for the Lead Consultant:
Percentage Payment: Approved Draft Main Evaluation Report: 60%
Approved Final Main Evaluation Report: 40%
The payment schedule for the Supporting Consultant will be specified in the contract.
If the consultants are unable to provide the deliverables in accordance with these guidelines and in line with the expected quality standards, payment may be withheld at the discretion of the GAHP Grants Manager until the consultants have improved the deliverables to meet the established quality standards.
If the consultant(s) fail to submit a satisfactory final product on time, i.e. before the end date of their contract, the Programme Manager reserves the right to employ additional human resources to finalise the report and to reduce the consultants’ fees by an amount equal to the extra costs incurred to bring the report up to standard.
Qualifications:
An advanced university degree in environmental sciences, international development or another relevant political or social sciences area;
At least 10 years of technical/evaluation experience, including evaluating national/regional programmes;
A broad understanding of Persistent Organic Pollutants (POPs), heavy metals, air pollution, and waste management;
Excellent writing skills in English and team leadership experience;
Strong communication, networking, and interpersonal skills;
Ability to work independently.
Application:
Interested candidates should submit a CV (no more than two pages) and a cover letter highlighting relevant experience, along with three work references, one of which must be from a government or UN organisation, to hassanatou@gahp.org and copy secretariat@gahp.org. The application deadline is November 17th.
In the cover letter, summarise in bullet points your applicable skills, experience and track record.
The subject line of your email should read: Programme Evaluator Position Application.
Only candidates selected for interviews will be contacted. No phone calls will be accepted.
EQUAL OPPORTUNITY EMPLOYMENT
GAHP is committed to providing equal opportunity employment for all qualified persons and to not discriminating against any employee or applicant for employment because of race, colour, religion, sex, gender, age, national origin, veteran status, disability, sexual orientation, or any other protected status.
OFFICE CONDITION
This is a full-time position that requires a minimum of forty hours per week. GAHP currently has an office in Geneva, Switzerland. GAHP does not currently have an office in Uganda, and applicants will be expected to work from home full-time.
Preference will be given to those who meet all the requirements and can start immediately.
[1] SMART refers to indicators that are specific, measurable, assignable, realistic and time-specific.
[2] The Evaluation Manager can provide samples as required.