Evaluation of the EcoAction Community Funding Program

May 2009



3.0 EVALUATION DESIGN

The evaluation of the EcoAction Community Funding Program was undertaken using multiple research methods. These included a document/file review, secondary data analysis, key informant interviews, and a survey of non-funded applicants. The purpose and scope of the evaluation and each of these activities are described in this section.

3.1 Purpose and Scope

This evaluation of the EcoAction Community Funding Program focused on the fiscal years from 2004–05 to 2008–09 and the four themes supported by the program: Climate Change, Clean Air, Clean Water and Nature.

The evaluation was based on preliminary work undertaken by Environment Canada, such as the existing logic model and proposed evaluation issues, questions, indicators and data sources as listed in the Terms of Reference of the evaluation. Annex 1 of this report includes the evaluation matrix used to guide the evaluation.

The evaluation assessed the relevance, success, cost effectiveness, and design and delivery of the EcoAction program. The evaluation was designed to determine whether the program:

3.2 Evaluation Approach and Methodology

3.2.1 Review of Program Documentation

The evaluation team completed a comprehensive review of program documentation and other relevant materials (see list in Annex 2). Specifically, the following types of documents were provided for this purpose:

In addition to reviewing program documentation, twelve projects were selected, at random but with attention to representativeness, for a more in-depth review of their results. The final reports for these projects were reviewed for this purpose but provided little added value to the evaluation.

3.2.2 Review of Management Information System (MIS) Data

In consultation with program staff, relevant MIS data fields were selected and data on applications, partnerships and project outcomes (i.e., project indicators) were provided for review.

Application data included information on the project and organization that applied for funding, the EcoAction contribution and project value, and the funding decision and status of the project. In total, data were provided for 1878 applications, of which 750 had been approved and 486 completed.

Partnership data covered the organizations that provided the leveraged funds from other sources required to obtain EcoAction funding. These data included the partner’s name and type of organization, the type of contribution (cash or in-kind) and whether the contribution had been confirmed. For the 486 completed projects, a total of 3443 partners were listed, which represented about 60% of the pledged contributions marked as confirmed in the MIS.4

Indicator data included targets (i.e., intended results or goals) and actuals (i.e., achieved results) for each project. When submitting an application, applicants select the relevant indicators from a list of 56 and set targets for them. These indicators include, for example, the number of jobs created, the number of partnerships developed, the amount of GHG emissions reduced, the kilometres of stream or lake bank protected or stabilized, and the number of native plants, trees and shrubs planted. Upon completion of the project, applicants report results for the indicators they selected at the outset, indicating whether or not the project achieved its objectives. Because not every indicator is relevant to every project, the data for some results are based on only a handful of projects, while the data for others are based on a larger number of projects.5

3.2.3 Interviews with Key Informants

To gather in-depth information on many of the central evaluation questions and to supplement information collected through other lines of evidence, the evaluation team completed 18 key informant interviews with funded applicants and representatives associated with the EcoAction program (Annex 4). The interviews were distributed as follows:

The interviews were undertaken in person or by telephone, depending on the location and preference of the respondent. Interviews were conducted using a semi-structured interview guide, customized to the type of interviewee (i.e., senior management, regional and national program management/staff, and funded applicants), and in the official language of the respondent’s choice.

3.2.4 Analysis of Client Survey Data

In addition to the document and file review detailed above, the evaluation team conducted analyses of data from client surveys administered in 2002 and 2008.

The 2002 client survey was a telephone survey administered by the Corporate Research Group. The survey sample included 252 EcoAction clients whose projects began and ended between 1998 and 2002. In total, 196 clients completed the survey, for a response rate of 78%. The overall survey results are considered accurate within ±3.3%, 19 times out of 20. Response bias in the distribution of survey respondents could not be examined, as insufficient information is available.

The 2008 client survey was an online survey administered by Phase 5. The survey sample included 307 EcoAction clients whose projects began and ended between 2004 and 2008. In total, 126 clients completed the survey, for a response rate of 41%. The overall survey results are considered accurate within ±6.7%, 19 times out of 20. There was little to no response bias in terms of the regional distribution of survey respondents compared to the population.

Data from both client surveys were analyzed to obtain information regarding program relevance, success and cost effectiveness. Analysis was limited, however, by the lack of access to raw data from the 2002 survey and by the limited comparability of questions between the two surveys.

3.2.5 Survey of Non-funded Applicants

As a complement to the client surveys, an online survey was undertaken with representatives from projects that were eligible but did not receive EcoAction funding. This survey sought to provide insight into the incremental impact of the EcoAction program by providing a comparison to funded projects.

Environment Canada prepared a sample file of 737 non-funded applications. This list was refined to exclude applications without an email address, as well as duplicate applications from the same individual; where an applicant appeared more than once, only the most recent application was kept, as it would have the most up-to-date contact information. The final sample consisted of 511 applications, each with a unique contact name, email address and telephone number.
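The refinement described above is essentially a deduplication pass over the sample file. The sketch below illustrates that logic with hypothetical field names (`applicant`, `email`, `date`); the actual structure of the Environment Canada sample file is not documented here.

```python
def refine_sample(applications):
    """Exclude applications without an email address, then keep only
    the most recent application per applicant (hypothetical field
    names; the real sample file layout is not documented here)."""
    with_email = [a for a in applications if a.get("email")]
    latest = {}
    for app in with_email:
        key = app["applicant"]
        # ISO dates compare correctly as strings
        if key not in latest or app["date"] > latest[key]["date"]:
            latest[key] = app
    return list(latest.values())

# Illustrative records only:
apps = [
    {"applicant": "Group A", "email": "a@example.org", "date": "2007-03-01"},
    {"applicant": "Group A", "email": "a@example.org", "date": "2008-06-15"},
    {"applicant": "Group B", "email": "", "date": "2008-01-10"},
]
sample = refine_sample(apps)
print(len(sample))  # 1: Group B has no email; Group A keeps its 2008 application
```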

An advance notification email was sent by Environment Canada to those included in the sample file. This was followed by a survey invitation sent by the evaluation team a few days later, with a link to the online survey. During the survey period, reminder emails to encourage participation were sent to those who had not yet completed the survey.

Data collection took place from December 12 to 29, 2008, and was later extended to January 16, 2009. In total, 154 non-funded applicants completed the survey, for a response rate of 30% (40% if undelivered emails are excluded from the count). The overall survey results are considered accurate within ±7.0%, 19 times out of 20, based on the sampled population (n = 737).
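The confidence intervals reported for the three surveys are consistent with a standard margin-of-error calculation for a proportion, with a finite population correction applied because each survey reached a sizeable share of its population. A minimal sketch follows; the formula and constants are standard survey-sampling conventions, not taken from the evaluation's own documentation.

```python
import math

def margin_of_error(n_completed, population, p=0.5, z=1.96):
    """95% margin of error for an estimated proportion, with the
    finite population correction (FPC) applied because each survey
    sampled a large share of its population."""
    base = z * math.sqrt(p * (1 - p) / n_completed)
    fpc = math.sqrt((population - n_completed) / (population - 1))
    return base * fpc

# The three surveys described in this section (completions, population):
surveys = {
    "2002 client survey": (196, 252),
    "2008 client survey": (126, 307),
    "Non-funded applicant survey": (154, 737),
}
for name, (n, pop) in surveys.items():
    print(f"{name}: ±{margin_of_error(n, pop) * 100:.1f}%")
```

Running this reproduces the reported figures of ±3.3%, ±6.7% and ±7.0%.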

As shown in Table 3-1, there was little to no response bias, meaning that those who completed the survey are a fair representation of those who were eligible for EcoAction funding but did not receive it.

Table 3-1: Distribution of Survey Completions by Region and Category

                          Completions    Sample       Population
                          (n = 154)      (n = 511)    (n = 737)
Region
  Pacific and Yukon            14%           14%          12%
  Prairie and Northern         15%           12%          11%
  Ontario                      18%           19%          19%
  Quebec                       27%           31%          35%
  Atlantic                     27%           24%          24%
Category
  Clean Air                     1%            2%           3%
  Clean Water                  21%           26%          24%
  Climate Change               44%           37%          40%
  Nature                       33%           33%          32%
  Not Identified                1%            2%           2%

Source: Survey of Non-Funded Applicants

3.3 Research Challenges and Limitations

Notwithstanding the strengths of the evaluation (e.g., use of multiple lines of evidence, survey of non-funded applicants to determine incremental impact, etc.), some challenges and limitations were encountered.

It is difficult to measure the impact of the program.

Ideally, the impact of the EcoAction program would be measured by selecting applicants with identical characteristics, or at least applicants that are similar on key characteristics, and comparing project outcomes for those who received funding and those who did not. Differences in outcomes could then be attributed to the presence or absence of funding. This approach, however, requires a quasi-experimental design with advance planning and monitoring that was not within the scope of this evaluation.

A simpler but less effective method of obtaining insight into the impact of a program consists of examining a sample of non-funded applicants and comparing them to funded applicants, without consideration of differences between funded and non-funded projects. Since the program has existing results from client surveys (in 2002 and 2008) that collected data on funded applicants, a survey of non-funded applicants was created to provide a comparison. However, because the survey method is limited in the kind of information that can be collected, these two groups cannot be compared in detail.  A comparison of funded and non-funded applicants, however, does provide some insight into the impact of not receiving funding, which in turn can speak to the impact of the program.

Non-sampling errors, such as level of comprehension of survey questions and errors in self-reporting, can occur.

While surveys are designed to be representative of the population being studied, survey findings are subject to non-sampling errors. Non-sampling errors, such as respondents’ misunderstanding questions, inaccurately recalling their experience, or introducing a bias in their responses, are an intrinsic part of survey results.  Because of such errors, using administrative data can provide more reliable findings. 

For this evaluation, however, no administrative information was available on participants in the survey of funded applicants. For example, the analysis had to rely on respondents' self-reported values for their EcoAction contribution, even though about 20% of survey responses were missing or had to be excluded from the analyses due to inconsistencies in the data. Survey respondents were asked the value of their most recently completed project and the proportion of that value contributed by EcoAction. Fourteen respondents did not answer one or both of these questions. An additional 15 respondents were identified as outliers because they reported a project value of less than $1,000 (values ranged from $39 to $268) or a proportion of project value from EcoAction funding greater than 50% (percentages ranged from 55% to 100%).6 These outliers were excluded from the analyses. In this case, administrative data could have provided more reliable information.
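The exclusion rules described above amount to a simple screen on the self-reported values. A sketch of that screen, using hypothetical field names rather than the evaluation's actual analysis code:

```python
def screen_responses(responses):
    """Split survey responses into usable, missing and outlier groups,
    following the rules described above: missing answers; a project
    value under $1,000; or an EcoAction share above 50%, the program's
    eligibility ceiling (hypothetical field names)."""
    usable, missing, outliers = [], [], []
    for r in responses:
        value, share = r.get("project_value"), r.get("ecoaction_share")
        if value is None or share is None:
            missing.append(r)
        elif value < 1000 or share > 0.50:
            outliers.append(r)
        else:
            usable.append(r)
    return usable, missing, outliers

# Illustrative responses only:
responses = [
    {"project_value": 25000, "ecoaction_share": 0.40},  # usable
    {"project_value": 268, "ecoaction_share": 0.30},    # outlier: value < $1,000
    {"project_value": 25000, "ecoaction_share": 0.75},  # outlier: share > 50%
    {"project_value": None, "ecoaction_share": 0.40},   # missing answer
]
usable, missing, outliers = screen_responses(responses)
print(len(usable), len(missing), len(outliers))  # 1 1 2
```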

Raw data from the 2002 survey were not available for analysis.

The EcoAction program collected data on funded applicants in 2002 and again in 2008 through a client survey.  Although a report on quantitative findings and some tables were available, the raw data were not available from the 2002 survey of funded applicants.  As well, the survey instrument used in 2002 differed from the one used in 2008. As a result, comparisons between the 2002 and 2008 surveys were limited.  

There are inconsistencies in the MIS data.

It appears that program data are not systematically updated in the MIS. For example, data on project status were inconsistent, so the number of approved and completed projects varied across sources. Similarly, pledged contributions that were later confirmed appear not to have been updated in the database. The 56 indicators recorded in the MIS also posed problems: 218 completed projects had no associated indicator data, even though some had general comments on file regarding project results. These issues indicate that MIS data are not a reliable source of information for the program.

Cost effectiveness analysis is limited.

Program expenditures and human resources utilization are not recorded specifically for the EcoAction program but are combined with those of related programs. Although program staff were able to provide figures specific to the EcoAction program, these are estimates.

Some findings are based on anecdotal evidence only.

Although care was taken to corroborate findings across the methods used in this evaluation, some evaluation questions are addressed through only a few lines of evidence, some of them largely anecdotal. Findings related to program governance and accountability are, for the most part, based on key informant interviews; the extent to which these findings can be validated is therefore limited within the scope of this evaluation. Similarly, findings related to program impacts are based on self-reported data that have not been validated by the program and could not be validated within the scope of the evaluation. Although interviews with those external to the program would have offered a more objective perspective on potential impacts and relevance, the program could not identify anyone outside EcoAction with sufficient knowledge to provide meaningful comment.

The report identifies which line(s) of evidence informed the findings presented in each section, allowing the reader to recall the appropriate limitations described above. 

3.4 Notes on Reporting

The following should be considered when reviewing this report:

 


4 The proportion of contributions that were pledged but not confirmed may suggest that the MIS is not up-to-date. This issue is discussed in Section 4.2.2 b) Leveraging of Support for Environmental Activities as well as Annex 2 Assessing Leveraged Contributions.

5 Only those indicators with data on at least 5% of all projects are reported in the findings section.  Findings for all indicators are included in Annex 3.

6 Note that the value of the EcoAction portion of funding must be less than or equal to 50% of the total project value, given that one of the eligibility requirements is that the applicant leverage at least half of the project value from other sources.
