
Evaluation of the Great Lakes Action Plan IV

2.0 Evaluation Design

2.1 Objectives and Scope

The evaluation covers GLAP IV program activities from the 2005‑2006 to 2009‑2010 fiscal years. The evaluation findings will be used to assist the Great Lakes AOC program and other departmental stakeholders in the ongoing management of GLAP IV, and will contribute to efforts (beginning in February 2010) related to the renewal of the program for another five-year period. The evaluation issues include:

  1. Is there a continued need for a GLAP?
  2. Is the GLAP IV aligned with federal government priorities?
  3. Is the GLAP IV consistent with federal roles and responsibilities?
  4. To what extent have intended outcomes been achieved as a result of GLAP IV?
  5. Is the GLAP IV design appropriate for achieving expected program results?
  6. Have there been any unintended (positive or negative) outcomes?
  7. Is GLAP IV undertaking activities and delivering products in the most efficient manner?
    • How could the efficiency of the program's activities be improved?
    • Are there alternative, more efficient ways of achieving the program's objectives?
  8. Is GLAP IV achieving its intended outcomes in the most economical manner?

The complete matrix of evaluation questions, indicators and data sources is presented in Annex B (under separate cover).

2.2 Approach and Methodology

Multiple lines of evidence were used to increase the reliability and robustness of the analysis. The evaluation included a review of documentation and literature, a review of GLSF and federal GLAP IV project files, and key informant interviews. Data collection occurred between February 3 and March 5, 2010.

2.2.1 Document and Literature Review

Secondary documentary sources were reviewed to develop a detailed program profile and contribute to addressing several of the evaluation questions, including relevance and program performance (achievement of program outcomes and the cost-efficiency analysis in particular). Environment Canada provided documents such as: background/planning and scoping materials; performance documents (e.g., Evaluation Plan for the Great Lakes Action Plan, Mid-term Review of GLAP IV); corporate/policy documents; reports on the Great Lakes (e.g., IJC reports); program work plans / progress reports; and other material. A document review template was developed to summarize findings in the documents pertaining to the evaluation questions. A bibliography of key documents is included in Annex C (under separate cover).


2.2.2 File Review

A review was conducted of a sample comprising 39 of a total of 234 files for contribution projects funded between 2005‑06 and 2008‑09 by the GLSF program component.[16] The sampling strategy was devised by Environment Canada, taking into consideration fiscal year, AOC priority group, and project value strata[17] so that the sample reflected the population. Annex D (under separate cover) provides a profile of the sample and population of GLSF files.
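The proportional stratified sampling approach described above can be sketched in general terms as follows. This is a minimal illustration only: the population counts, the single "fiscal year" stratum key and the function names are hypothetical stand-ins, not drawn from the evaluation's actual sampling frame, which also stratified by AOC priority group and project value.

```python
import random
from collections import defaultdict

def stratified_sample(files, sample_size, key, seed=0):
    """Draw a sample whose strata proportions mirror the population's.

    files: list of dicts describing project files
    key: function mapping a file to its stratum (e.g., fiscal year)
    """
    rng = random.Random(seed)
    strata = defaultdict(list)
    for f in files:
        strata[key(f)].append(f)
    total = len(files)
    sample = []
    for members in strata.values():
        # Allocate sample slots proportionally to stratum size (rounded),
        # so the sample reflects the population's composition.
        n = round(sample_size * len(members) / total)
        sample.extend(rng.sample(members, min(n, len(members))))
    return sample

# Hypothetical population: 234 files spread over four fiscal years.
population = [{"id": i, "year": 2005 + i % 4} for i in range(234)]
sample = stratified_sample(population, 39, key=lambda f: f["year"])
```

Because per-stratum allocations are rounded independently, the realized sample size can differ slightly from the target; a real sampling plan would reconcile the rounding.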

A data collection template was used to capture file information in a consistent manner, and to ensure that the content of the files was well-documented and linked to the specific evaluation questions and indicators (see Annex E under separate cover). The file review addressed issues related to the consistency of contributions to departmental objectives, the achievement of outputs and outcomes, unintended impacts, and project costs and leveraging. A profile of the characteristics of the files that were reviewed is included in Annex E (under separate cover).

The GLSF files were generally well documented, though there were some gaps. Contents typically included the project proposal (92 per cent of files), memorandum of understanding (MOU) or contribution agreement (90 per cent of files), financial information or records (e.g., invoices; 77 per cent of files) and proposal review / technical review information (69 per cent of files). Almost 80 per cent of files had some output or outcome reporting (i.e., an annual report, progress report or final report). It is not clear why some files do not have output or outcome reporting. Table 2.1 indicates the extent to which key documents were included in the files that were reviewed.

Table 2.1: Distribution of Documents in GLSF Files Reviewed

  Type of document                                              Per cent of files
  Proposal                                                             92%
  MOU / Contribution Agreement                                         90%
  Financial files/records                                              77%
  Proposal review/approvals                                            69%
  Final report                                                         62%
  Correspondence                                                       56%
  Technical screening                                                  31%
  Progress reporting / activity reporting                              28%
  Annual report                                                        18%
  Meeting/committee agendas/minutes                                     8%
  Audit/evaluation reporting                                            5%
  Other products                                                       26%
  Other (e.g., brochures, media releases, special
    technical reviews, photographs, CDs)                               49%

Files for 21 projects sampled from those conducted by federal partners were also requested. All funded partners, including DFO, the former ECB, NWRI and PWGSC, provided project-related documentation. However, the documentation was in various formats (e.g., conference presentations, reports, scientific publications, invoices/contribution agreements) and reporting was not always linked to a single project, and thus was not amenable to capture using a standardized file review methodology. As a result, descriptive analyses were undertaken to summarize the contents of the federal files (see Annex F under separate cover). To better understand the reporting requirements and project-level documentation for federal projects, the key informant interview guides for federal partners and federal project proponents were expanded to address this issue.


2.2.3 Key Informant Interviews

In total, 46 key informant interviews were completed with Environment Canada personnel, committee members, other government representatives, project proponents (funded and unfunded) and experts. The targeted distribution of interviews was met in all cases and is as follows:

  • Departmental program managers and federal partners involved in the management and delivery of GLAP IV (n=11);
  • Representatives from federal OGDs and Ontario who are members of committees (GLEC, GLAP Workplan Review Team, COA Management Committee, and COA Annex Implementation Committee) (n=10);
  • Federal project proponents (n=4);
  • GLSF project proponents (n=10);
  • Unsuccessful applicants (n=4);
  • RAP committee members (n=4); and
  • International or academic experts on aquatic ecosystem remediation (n=3).

The interviews were conducted by phone and the duration of the interviews varied by respondent group, ranging from 30 to 90 minutes. Key informants were asked their views and opinions on a number of issues, including the continued need and relevance of the program, its design and delivery, the extent to which the program is meeting immediate, intermediate and longer-term outcomes, and the efficiency and economy of the program in achieving outputs and results. A master key informant interview guide is included in Annex G (under separate cover), from which individual interview guides were further tailored to be suitable for the various respondent groups.

To ensure a common understanding of the terms used in the analysis and reporting of interview results, the following guidelines have been used:

  • "A few/a small number of interviewees" = less than 25 per cent
  • "Some/a minority of interviewees" = 25‑49 per cent
  • "A majority of interviewees" = 50‑75 per cent
  • "Most interviewees" = over 75 per cent
  • "Almost all interviewees" = 95 per cent or more
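These reporting conventions amount to a simple threshold lookup, which can be expressed as follows. This is an illustrative sketch only: the function name is hypothetical, and the handling of boundary values (e.g., exactly 75 per cent) is an assumption, since the report does not specify how boundaries are classified.

```python
def quantifier(pct):
    """Map a percentage of interviewees to the report's descriptive term.

    Thresholds follow the guidelines above; note that "Almost all"
    (95 per cent or more) is a refinement within "Most" (over 75 per cent).
    """
    if pct >= 95:
        return "Almost all interviewees"
    if pct > 75:
        return "Most interviewees"
    if pct >= 50:
        return "A majority of interviewees"
    if pct >= 25:
        return "Some/a minority of interviewees"
    return "A few/a small number of interviewees"
```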


2.3 Challenges and Limitations

A number of methodological caveats should be noted for this evaluation. First, much of the information gathered for this review is drawn from internal sources: program documentation as well as interviews with program managers or federal partners, committee members and project proponents. With respect to program documentation, the main disadvantage is that these secondary sources often reflect a program's stated intentions rather than its implementation, and are not produced for the explicit purposes of the evaluation; as a result, they are often not organized to present, or do not touch upon, the issues investigated through an evaluation. With respect to the interviews, respondents frequently had some prior connection to the program (e.g., unfunded applicants who were funded for another project, or experts who had worked at Environment Canada in the past). Although a small number of interviews were conducted with unfunded GLSF applicants and experts, and priority was given to interviewing key informants who had not been awarded GLSF funding, the interview feedback benefits from respondents' knowledge of the program but lacks a high level of objectivity.

Second, the availability of program activity information and financial data with respect to the GLAP IV is limited. Specifically:

  • Due to the departmental transformation in 2005‑2006 and unbundling of resources and activities of the GLBEI,[18] it is not possible to track Environment Canada departmental GLAP IV expenditures. Expenditures were being tracked at the ecosystem initiative level and not coded at the program level. This gap limits the cost-efficiency analysis conducted for the evaluation.
  • The GLSF program database is out of date with respect to entries after 2006‑2007, and key measures such as financial expenditures and leveraged funding from partners and program activities are either not captured or not captured systematically from year to year. This was due to a shift from the use of MOUs to a grants and contributions (G&C) funding mechanism (with different reporting templates and a centralized information management system), as well as the internal re-organization of the unit.
  • There is no centralized database for performance measurement purposes for the program.

As a result, it was not possible to provide a complete picture of program activities and outputs. As well, the cost-efficiency analysis for the program was limited by the lack of clear financial information, though this analysis was supplemented by qualitative assessments of cost-efficiency obtained in the key informant interviews.

Finally, it is important to note that the determination of progress in AOCs and the achievement of program outcomes, particularly related to restoration and delisting, are only realizable in the longer term. The five-year term of GLAP IV and, indeed, the 20 years of GLAP funding are widely perceived to be insufficient to address the complex ecological issues in the AOCs. In addition, the achievement of longer-term GLAP IV program outcomes (e.g., delisting) requires the significant contribution of other partners and stakeholders. External factors, such as the availability of funding for infrastructure projects, are critical to the achievement of program outcomes. Thus, evaluation findings related to longer‑term outcomes of GLAP IV must be considered in this context.


2.4 Reporting Note

In the next chapter, findings are presented for the five evaluation issues explored through the evaluation: relevance, design and delivery, achievement of program outcomes, efficiency, and economy.[19] A rating is also provided for each evaluation question. The ratings are based on a judgement of whether the findings indicate that:

  • the intended outcomes or goals have been achieved or met (labelled as Achieved);
  • considerable progress has been made to meet the intended outcomes or goals, but attention is still needed (labelled as Progress Made, Attention Needed);
  • little progress has been made to meet the intended outcome and attention is needed on a priority basis (labelled as Little Progress, Priority for Attention); and
  • a rating is not applicable (identified by the N/A symbol).

In addition, a tilde symbol (~) is used to denote instances where outcome achievement ratings are based solely on subjective evidence. A summary of ratings for the evaluation issues and questions is presented in Annex H (under separate cover).


[16] GLSF files from 2009‑2010 were not sampled due to the recency of their approval and the limited reporting available for these projects during the evaluation data collection period.

[17] A homogeneous subgroup of members of the population.

[18] The process of transformation was designed to enable the Department to plan, manage and report by results. Transformation involved re-defining the results structure (Program Activity Architecture) and new management structures and processes to promote integrated management and decision making in the context of a clearer view of departmental results and strategic direction (Environment Canada, Report on Plans and Priorities, 2006‑2007).

[19] Treasury Board policy requires that all evaluations of federal programs and initiatives address the core issues of relevance and performance. Performance is a blended construct that includes a program's effectiveness (achievement of outcomes), efficiency (achievement of outputs/activities at the lowest cost) and economy (achievement of outcomes at the lowest cost).
