This evaluation was conducted as part of the Treasury Board Secretariat’s requirement to evaluate funding for the program components of the CAA International Actions Theme. The objectives of this evaluation were to:
This evaluation examined all planned activities for the International Actions Theme of the CAA, covering the four-year time frame from 2007–08 through 2010–11. The evaluation examined evidence provided on the program components from the start of the initiative in 2007–08 up to December 31, 2009. Given the timing of this evaluation (it was conducted at this point to contribute to the overall evaluation of the entire CAA, expected in October 2010), performance data for intermediate, final and even some immediate outcomes were not available. The focus of this evaluation, therefore, was on outputs and movement toward achieving early outcomes. Where processes are ongoing, the evaluation examined the extent to which the outcomes have been achieved and/or the likelihood that planned activities will achieve expected results. In addition, the evaluation examined how external factors have affected, or are likely to affect, the Theme’s ability to achieve its expected outcomes.
Of note, the International Actions Theme is linked to the UNFCCC and Kyoto Protocol through the International Financial and Other Obligations and the International Partnerships and Negotiations program components (refer to section 2.1 for the program profile), which are both within the scope of this evaluation. Activities directly related to Canada meeting Kyoto targets for emissions are not part of the International Actions Theme and are outside the scope of this evaluation.
Other initiatives that were considered in scoping this evaluation included the following:
Thirteen evaluation questions, grouped according to issue, were addressed in this evaluation to assess relevance and performance. (The evaluation framework is provided in Annex 1.)
Issue 1: Continued Need for the Program
Issue 2: Alignment with Government Priorities
Issue 3: Alignment with Federal Roles and Responsibilities
Issue 4: Achievement of Expected Outcomes
Issue 5: Demonstrated Efficiency and Economy
The three methods used to collect data for this evaluation are described below.15
1. Document Review:
An in-depth review of all relevant documents provided by the departments engaged in the International Actions Theme was conducted, and further documents were requested where gaps were identified or where additional information was needed to supplement the evidence base. The project team reviewed each document to understand the mandated requirements, goals and objectives, governance structure, authorities, activities, outputs and outcomes for each component. The overall adequacy of the document evidence base (e.g., availability of performance information) was also assessed. All evidence from the documentation was recorded against the defined evaluation issues. This data collection method addressed evaluation questions #1–13.
2. Key‑Informant Interviews:
Interviews were conducted with key informants identified by the departments involved in the CAA International Actions Theme, with suggested additions by the project team (e.g., external contacts, international experts). Interviews were either conducted in person (e.g., senior management, executives in the private sector) or by phone. Interviews provided information on the program components’ relevance and performance. Additional interviews were conducted as the evaluation progressed based on feedback from interviewees and information learned through the process. A total of 43 interviews were conducted. This data collection method addressed evaluation questions #1–13.
3. Focus Group of Experts:
A two-hour focus group session was held with five Canadian experts on climate change and air quality issues.16 The purpose of the focus group was to:
- share and test preliminary findings and key messages emerging from the evaluation; and
- collect additional information and expert advice on Canada’s international performance related primarily to climate change, and also to air quality.
This data collection method addressed evaluation questions #5–8.
Preliminary findings were validated by the appropriate program officers from each of the four components of the International Actions Theme to correct omissions and errors before completion of the evaluation report.
The evaluation focused on the completion of program activities using allocated resources from the fiscal years 2007–08 to 2009–10. As such, the evaluation focused on the achievement of outputs and intended immediate outcomes only (as presented in the program Logic Model in Figure 1), since data collection took place approximately one year before the end of the funding period. The achievement of higher-level (intermediate and final) outcomes may be assessed in future evaluations, depending on future funding for activities under this theme.
The confidentiality of the UNFCCC negotiations, and the direction provided by Cabinet to the Canadian delegation, constrained the evaluators’ ability to investigate the extent to which success in securing Canada’s stated interests in the Convention can be attributed directly or indirectly to efforts by the Government. In particular, interviews with other international negotiators were limited, as were interviews with representatives from industry17 and the non-governmental sector. These interviews would have provided an external perspective on the Theme’s performance and relative impact in advancing Canada’s positions in the UNFCCC, as well as in other multilateral negotiation processes in which Canada seeks to advance its positions. As a result, the evaluation was limited in its ability to provide an independent perspective on the Theme’s performance, particularly on how its negotiation activities influence the international negotiation process within and outside of the UNFCCC.
Ongoing performance data were not available for all program components, particularly financial information, which limited the demonstration of achievement of expected outcomes.
15 The key methodological instruments used for data collection are presented under separate cover in a technical appendix.
16 External experts did not fulfill an advisory or oversight function to the evaluation.
17 Industry representatives beyond those related to the APP.