
Evaluation of the Security and Prosperity Partnership of North America (SPP) Pilot Project on Reducing Emissions from Vehicles and Engines (PPRE)

3.0 Evaluation Design

3.1 Purpose and scope

A formal, independent evaluation is required to assess the relevance and performance [3] of the SPP PPRE. The scope of the evaluation covers the three years of the program from 2008 to 2011 and all related project activities. This requirement fulfills the commitment made in the program design for this pilot, which was submitted to and approved by Treasury Board officials in the summer of 2009.

As per the requirements presented in the Directive on the Evaluation Function, this evaluation addresses value for money by including conclusions about the relevance and performance of the pilot program. As the funding is not ongoing, and there is no decision required on program renewal, recommendations are not part of this evaluation. Rather, the focus is on lessons learned that may inform related future work and regulatory development.

3.2 Evaluation Issues and Questions

The evaluation provides conclusions on the following broad evaluation questions:

  • Relevance: Did the pilot project remain consistent with, and contribute to, federal government priorities and responsibilities and address actual needs?
  • Performance – Project Design and Delivery: Was the PPRE project designed and delivered appropriately?
  • Performance – Effectiveness: Has the project achieved its intended outcomes?
  • Performance – Efficiency and Economy: Were the most appropriate, economical and efficient mechanisms used to achieve outcomes?

As this pilot project concluded at the end of March 2011, not all results are yet available; the evaluation report therefore focuses on the outputs and early outcomes achieved, as well as lessons learned.

Table 5 presents the evaluation framework, which outlines the evaluation questions and indicators that were explored as part of this evaluation, as well as the methods utilized to collect evidence.

Table 5: Evaluation Framework

Relevance

1. Was there a need for the project? Is there a need for the project to continue?

Indicators:
  • Demonstration of environmental need
  • Demonstration that the pilot complements, and does not duplicate, other similar programs (CARA, ecoFREIGHT, etc.)
  • Demonstration that the pilot addresses gaps in environmental need
  • Project activities and reach are connected to environmental need
  • Views of stakeholders and project managers on the connection of project objectives with environmental need

Methods:
  • Document review
  • Literature review
  • Interviews

2. Is the project aligned with federal government priorities?

Indicators:
  • Project’s objectives correspond to recent/current federal government priorities
  • Project’s objectives are aligned with current departmental strategic outcomes
  • Evidence of alignment with SPP, CARA, ecoFREIGHT and FleetSmart

Methods:
  • Document review
  • Interviews

3. Is the project consistent with federal roles and responsibilities?

Indicators:
  • Project mandate aligned with federal government jurisdiction
  • Views on the appropriateness of federal involvement

Methods:
  • Document review
  • Interviews
  • Literature review

Performance – Project Design and Delivery

4. Are appropriate performance data being collected, captured and safeguarded? If so, is this information being used to inform senior management and decision makers?

Indicators:
  • Existence of an effective performance measurement plan
  • Extent to which performance data are reliable, timely and relevant
  • Extent to which performance data inform/support EC’s decision-making processes

Methods:
  • Document review
  • Key informant interviews

5. Is the program design appropriate for achieving expected program results?

Indicators:
  • Plausible link between program activities, outputs, and intended outcomes
  • Clearly defined and understood governance structure, including program processes, roles, responsibilities and accountabilities
  • Program resources/capacity commensurate with expected program results
  • Views on the appropriateness of program activities, processes and governance structures

Methods:
  • Document review
  • Key informant interviews

6. Are there any barriers or challenges that could affect the success of this project if it were implemented nationally or integrated into an existing national project?

Indicators:
  • Views of stakeholders and project managers
  • Evidence of documented barriers or challenges
  • Evidence of solutions or approaches to addressing barriers or challenges

Methods:
  • Interviews
  • Document review

7. Have the results of the pilot project been communicated effectively? How could communications be improved?

Indicators:
  • Views of project managers, partners and recipients
  • Extent to which planned communications activities have been implemented as intended
  • Evidence of information sharing, distribution of project publications and reports, etc., to public fleet operations and other fleets

Methods:
  • Interviews
  • Document review

8. What are the best practices and lessons learned as a result of the project?

Indicators:
  • Identified lessons learned and shared best practices among partners
  • Identified strengths and weaknesses
  • Views on strengths/best practices, weaknesses and lessons learned

Methods:
  • Interviews
  • Document review

Performance – Effectiveness

9. To what extent have the intended outputs and outcomes been achieved as a result of the project?

Indicators:
  • Views of project managers, partners and recipients on achievement of outputs and outcomes
  • Analysis of available performance data on the project’s expected results and performance indicators
  • Population-weighted count of municipalities, through FCM, that report implementing green fleet initiatives

Methods:
  • Document review
  • Interviews

10. Have there been any unintended outcomes, either positive or negative?

Indicators:
  • Presence/absence of unintended outcomes
  • Views on whether unintended outcomes occurred

Methods:
  • Document review
  • Interviews

Performance – Efficiency and Economy

11. Has the project undertaken its activities and delivered products (e.g., technical studies) in the most efficient manner?

Indicators:
  • Analysis of project operational costs in relation to the production of outputs
  • Views on whether the cost of producing project outputs is as low as possible
  • Evidence of / views on whether there are alternative, more efficient ways of delivering project activities and outputs

Methods:
  • Document review
  • Interviews
  • Literature review
  • Financial analysis

12. Has the project achieved, or is it on track to achieve, its intended outcomes in the most economical manner?

Indicators:
  • Views on whether good value is being obtained with respect to the use of public funds
  • Evidence of / views on whether there are alternative program models that would achieve the same expected outcomes at a lower cost

Methods:
  • Document review
  • Interviews
  • Literature review

3.3 Evaluation Approach and Methodology

Three lines of inquiry were completed to collect evidence addressing the evaluation questions and to develop the evaluation findings:

  1. An in-depth review of all relevant documents provided by the organizations engaged in the PPRE was completed, and additional documents were requested where gaps were identified or where additional information was needed to supplement the existing evidence base. The project team reviewed each document to understand the mandated requirements, goals and objectives, governance structure, authorities, activities, outputs and outcomes for the pilot project and its components. The overall adequacy of the document evidence base (e.g., availability of performance information) was also assessed. All evidence from the documentation was examined against the defined evaluation issues. A bibliography of the documents reviewed is included in Annex 1.

    This data collection method addressed evaluation questions 1-12.

  2. A literature review was conducted to: (1) better understand the federal government’s approach to addressing HDD vehicle emissions compared to the federal approach in the United States; and (2) compare the pilot to other programs in Canada that aim to reduce HDD vehicle emissions through capacity building. Details on the literature reviewed are included in Annex 2.

    This data collection method addressed evaluation questions 1, 2, 3, 11 and 12.

  3. Key informant interviews with representatives from Environment Canada, as well as key partners and stakeholders, were conducted either in person (e.g., with Environment Canada staff) or by phone. Interviews provided information on the relevance and performance of the program components. A total of eight interviews with 11 individuals were conducted. A list of the categories and the number of informants is included in Annex 3, as are the interview guides.

    This data collection method addressed evaluation questions 1-12.

Draft findings from the above lines of inquiry were integrated and presented to the managers overseeing the evaluation in Environment Canada, in order to validate key facts and address omissions and errors before completion of the final evaluation report.

3.4 Limitations

The PPRE was not complete at the time of the evaluation (the evaluation findings were finalized in February 2011 and the pilot project ended in March 2011). As such, the evaluators have commented only on the pilot’s delivery of its intended outputs and its progress toward its intended outcomes. In addition, the PPRE will not receive any funding beyond this fiscal year, so the evaluation does not include recommendations to improve the delivery and effectiveness of the PPRE. Instead, a summary of lessons learned has been included to inform the design and delivery of future pilot projects that address vehicle emissions.

The evaluation did not address the second initiative (i.e., Advancing Regulatory Action to Reduce Vehicle and Engine Emissions), as it was not implemented and no G&C funds were spent on it. [4] As a result, the evaluation scope is narrower than the original program design.

[3] As per the 2009 Treasury Board (TB) Policy on Evaluation, the evaluation should assess: Relevance: the extent to which the program addresses a continued need, is aligned with government priorities, and is aligned with federal roles and responsibilities; and Performance: the extent to which the program has achieved its expected outcomes and demonstrates efficiency and economy.

[4] The Evaluation Team was informed that the second initiative was never launched formally and was not announced by the Minister of the Environment. Program staff indicated that G&C funding allocated to this initiative was not utilized for this project, although some of the funding was reallocated to the pilot project and the remainder lapsed.
