Evaluation of the Great Lakes Action Plan IV

Final Report: June 10, 2010


2.0 Evaluation Design

2.1 Objectives and Scope

The evaluation covers GLAP IV program activities from the 2005‑2006 to 2009‑2010 fiscal years. The evaluation findings will be used to assist the Great Lakes AOC program and other departmental stakeholders in the ongoing management of GLAP IV, and will contribute to efforts (beginning in February 2010) related to the renewal of the program for another five-year period. The evaluation issues include:

  1. Is there a continued need for a GLAP?
  2. Is the GLAP IV aligned with federal government priorities?
  3. Is the GLAP IV consistent with federal roles and responsibilities?
  4. To what extent have intended outcomes been achieved as a result of GLAP IV?
  5. Is the GLAP IV design appropriate for achieving expected program results?
  6. Have there been any unintended (positive or negative) outcomes?
  7. Is GLAP IV undertaking activities and delivering products in the most efficient manner?
  8. Is GLAP IV achieving its intended outcomes in the most economical manner?

The complete matrix of evaluation questions, indicators and data sources is presented in Annex B (under separate cover).

2.2 Approach and Methodology

Multiple lines of evidence were used to increase the reliability and robustness of the analysis. The evaluation included a review of documentation and literature, a review of GLSF and federal GLAP IV project files, and key informant interviews. Data collection occurred between February 3 and March 5, 2010.

2.2.1 Document and Literature Review

Secondary documentary sources were reviewed to develop a detailed program profile and contribute to addressing several of the evaluation questions, including relevance and program performance (achievement of program outcomes and the cost-efficiency analysis in particular). Environment Canada provided documents such as: background/planning and scoping materials; performance documents (e.g., Evaluation Plan for the Great Lakes Action Plan, Mid-term Review of GLAP IV); corporate/policy documents; reports on the Great Lakes (e.g., IJC reports); program work plans / progress reports; and other material. A document review template was developed to summarize findings in the documents pertaining to the evaluation questions. A bibliography of key documents is included in Annex C (under separate cover).


2.2.2 File Review

A review was conducted of a sample comprising 39 of a total of 234 files for contribution projects funded between 2005‑06 and 2008‑09 by the GLSF program component.[16] The sampling strategy was devised by Environment Canada, taking into consideration fiscal year, AOC priority group, and project value strata[17] so that the sample reflected the population. Annex D (under separate cover) provides a profile of the sample and population of GLSF files.
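The stratified approach described above can be illustrated with a short sketch. The field names and the use of proportional allocation are assumptions for illustration only; the report does not specify the schema of the file records or the exact allocation rule used by Environment Canada.

```python
import random
from collections import defaultdict

def stratified_sample(files, sample_size, seed=42):
    """Draw a proportional stratified sample of project files.

    Each file is a dict; the stratum key combines fiscal year,
    AOC priority group and project-value stratum (hypothetical
    field names -- the report does not give a schema).
    """
    random.seed(seed)

    # Group the population into strata by the three characteristics.
    strata = defaultdict(list)
    for f in files:
        key = (f["fiscal_year"], f["aoc_priority"], f["value_stratum"])
        strata[key].append(f)

    sample = []
    for key, members in strata.items():
        # Proportional allocation (an assumption): each stratum
        # contributes in proportion to its share of the population,
        # with at least one file per stratum.
        n = max(1, round(sample_size * len(members) / len(files)))
        sample.extend(random.sample(members, min(n, len(members))))
    return sample
```

Because each stratum is sampled in proportion to its size, the resulting sample mirrors the population's distribution across fiscal years, AOC priority groups and project-value strata, which is the property the report's sampling strategy was designed to achieve.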

A data collection template was used to capture file information in a consistent manner, and to ensure that the content of the files was well-documented and linked to the specific evaluation questions and indicators (see Annex E under separate cover). The file review addressed issues related to the consistency of contributions to departmental objectives, the achievement of outputs and outcomes, unintended impacts, and project costs and leveraging. A profile of the characteristics of the files that were reviewed is included in Annex E (under separate cover).

The GLSF files were generally well documented, though there were some gaps. Contents typically included the project proposal (92 per cent of files), memorandum of understanding (MOU) or contribution agreement (90 per cent of files), financial information or records (e.g., invoices; 77 per cent of files) and proposal review / technical review information (69 per cent of files). Almost 80 per cent of files had some output or outcome reporting (i.e., an annual report, progress report or final report). It is not clear why some files do not have output or outcome reporting. Table 2.1 indicates the extent to which key documents were included in the files that were reviewed.

Table 2.1: Distribution of Documents in GLSF Files Reviewed
Type of document | Per cent of files
Proposal | 92%
MOU / Contribution Agreement | 90%
Financial files/records | 77%
Proposal review/approvals | 69%
Final report | 62%
Correspondence | 56%
Technical screening | 31%
Progress reporting / activity reporting | 28%
Annual report | 18%
Meeting/committee agendas/minutes | 8%
Audit/evaluation reporting | 5%
Other products | 26%
Other (e.g., brochures, media releases, special technical reviews, photographs, CDs) | 49%

Files for 21 projects sampled from those conducted by federal partners were also requested. All funded partners, including DFO, the former ECB, NWRI and PWGSC, provided project-related documentation. However, the documentation was in various formats (e.g., conference presentations, reports, scientific publications, invoices/contribution agreements), and reporting was not always linked to a single project; it was therefore not amenable to capture using a standardized file review methodology. As a result, descriptive analyses were undertaken to summarize the contents of the federal files (see Annex F under separate cover). To better understand the reporting requirements and project-level documentation for federal projects, the key informant interview guides for federal partners and federal project proponents were expanded to address this issue.


2.2.3 Key Informant Interviews

In total, 46 key informant interviews were completed with Environment Canada personnel, committee members, other government representatives, project proponents (funded and unfunded) and experts. The targeted distribution of interviews was met for all respondent groups.

Interviews were conducted by phone, and their duration varied by respondent group, ranging from 30 to 90 minutes. Key informants were asked their views on a number of issues, including the continued need for and relevance of the program, its design and delivery, the extent to which the program is meeting its immediate, intermediate and longer-term outcomes, and the efficiency and economy of the program in achieving outputs and results. A master key informant interview guide is included in Annex G (under separate cover); individual interview guides were tailored from it to suit the various respondent groups.

To ensure a common understanding of the terms used in the analysis and reporting of interview results, the following guidelines have been used:


2.3 Challenges and Limitations

A number of methodological caveats should be noted for this evaluation. First, much of the information gathered for this review is drawn from internal sources: program documentation, as well as interviews with program managers, federal partners, committee members and project proponents. The main disadvantage of the program documentation is that these secondary sources often reflect a program's stated intentions rather than its implementation, and, because they were not produced for the explicit purposes of the evaluation, they frequently do not address, or are not organized to present, the issues an evaluation investigates. With respect to the interviews, respondents frequently had some prior connection to the program (e.g., unfunded applicants who were funded for another project, or experts who had worked at Environment Canada in the past). Although a small number of interviews were conducted with unfunded GLSF applicants and external experts, and priority was given to interviewing key informants who had not been awarded GLSF funding, the interview feedback, while benefiting from respondents' knowledge of the program, cannot be considered fully objective.

Second, the availability of program activity information and financial data with respect to the GLAP IV is limited. Specifically:

As a result, it was not possible to provide a complete picture of program activities and outputs. As well, the cost-efficiency analysis for the program was limited by the lack of clear financial information, though this analysis was supplemented by qualitative assessments of cost-efficiency obtained in the key informant interviews.

Finally, it is important to note that the determination of progress in AOCs and the achievement of program outcomes, particularly related to restoration and delisting, are only realizable in the longer term. The five-year term of GLAP IV and, indeed, the 20 years of GLAP funding are widely perceived to be insufficient to address the complex ecological issues in the AOCs. In addition, the achievement of longer-term GLAP IV program outcomes (e.g., delisting) requires the significant contribution of other partners and stakeholders. External factors, such as the availability of funding for infrastructure projects, are critical to the achievement of program outcomes. Thus, evaluation findings related to longer‑term outcomes of GLAP IV must be considered in this context.


2.4 Reporting Note

In the next chapter, findings are presented for the five evaluation issues examined: relevance, design and delivery, achievement of program outcomes, efficiency, and economy.[19] A rating is also provided for each evaluation question. The ratings are based on a judgement of whether the findings indicate that:

In addition, a tilde symbol (~) is used to denote instances where outcome achievement ratings are based solely on subjective evidence. A summary of ratings for the evaluation issues and questions is presented in Annex H (under separate cover).


[16] GLSF files from 2009‑2010 were not sampled due to the recency of their approval and the limited reporting available for these projects during the evaluation data collection period.

[17] A homogeneous subgroup of members of the population.

[18] The process of transformation was designed to enable the Department to plan, manage and report by results. Transformation involved re-defining the results structure (Program Activity Architecture) and new management structures and processes to promote integrated management and decision making in the context of a clearer view of departmental results and strategic direction (Environment Canada, Report on Plans and Priorities, 2006‑2007).

[19] Treasury Board policy requires that all evaluations of federal programs and initiatives address the core issues of relevance and performance. Performance is a blended construct that includes a program's effectiveness (achievement of outcomes), efficiency (achievement of outputs/activities at the lowest cost) and economy (achievement of outcomes at the lowest cost).
