Chief Audit Executive / Director General, Audit and Evaluation Branch – Annual Report – 2009-2010

June 2010


4 Performance Summary: Evaluation

4.1 Progress Against Plan

Progress Against Plan: Evaluation

Evaluation Projects Planned for this Period

34

  • 24 evaluations
  • 7 evaluation plans
  • 3 other projects

The plan consisted of 33 evaluation projects, of which 15 were carried over from the previous year. One unplanned interdepartmental evaluation was introduced in the fourth quarter of 2009-10:

  • Genomics Research and Design Initiatives

Three projects do not lead to a final product and, as such, are not listed in the tables below: CAA Evaluation Oversight is an ongoing interdepartmental role; recommendations follow-up is an ongoing function; and one project consisted of the time needed to complete the posting of an already-approved evaluation report.

Evaluation Projects Completed during this Period

17

  • 12 evaluations
  • 5 evaluation plans

Of the 21 evaluation projects planned for completion in 2009-10, 17 were completed; 14 of these had been carried over from the previous fiscal year.

The following were reviewed and approved by DEC:

Carried-over from 2008-09

  • EcoAction Funding (May 2009)
  • Environmental Damages Fund (July 2009)
  • Grants to support Environmental Research and Development (July 2009)
  • Contributions to support Environmental Research and Development (July 2009)
  • Contributions to support Environmental and Sustainable Development Initiatives (July 2009)
  • Contributions to support Canada's International Commitments (July 2009)
  • National Agri-environmental Standards Initiatives (July 2009)
  • Clean Air Regulatory Agenda Evaluation Plan (October 2009)
  • Weather Predictions Evaluation Plan (October 2009)
  • Habitat Stewardship Contribution Program (October 2009)
  • Invasive Alien Species Strategy (October 2009)
  • Enforcement Program (October 2009)
  • Chemicals Management Plan – Performance Measurement and Evaluation Plan (October 2009)
  • Strategic Environmental Assessment (October 2009)

Planned for 2009-10

  • Evaluation of the National Air Quality Health Index (February 2010)
  • Federal Species at Risk Evaluation Plan (February 2010)
  • Renewable Fuels Evaluation Plan (April 2010)

Evaluation Projects on Hold

1

  • 1 evaluation
  • Clean Air Agenda Community Partnerships Theme – Program not yet implemented.

Planned Carried-Over Evaluation Projects

10

  • 9 evaluations
  • 1 evaluation plan
  • Building Public Confidence in Pesticide Regulation and Improving Access to Pest Management Products (tabling Q1 – 2010-11)
  • Wildlife Habitat Canada Foundation (tabling Q2 – 2010-11)
  • Freshwater Initiatives Evaluation Plan (tabling Q1 – 2010-11)
  • Clean Air Regulatory Agenda Theme (tabling Q1 – 2010-11)
  • Clean Air Agenda International Actions Theme (tabling Q1 – 2010-11)
  • Clean Air Agenda Management and Accountability Theme (tabling Q1 – 2010-11)
  • Research and Development, Production and Monitoring in Support of Weather Predictions (Delayed from Q1 to Q2 2010-11)
  • Genomics Research and Design Initiatives (Interdepartmental led by Natural Resources Canada) (tabling Q4 – 2010-11)
  • Meteorological Program for the 2010 Winter Games (interdepartmental led by Canadian Heritage) (tabling Q4 – 2010-11)
  • Chemicals Management Plan Evaluation (Interdepartmental with Health Canada) (tabling Q4 – 2010-11)

Unplanned Carried-Over Evaluation Projects

3

  • 2 evaluations
  • 1 evaluation plan
  • Water Management Evaluation Plan (tabling Q1 – 2010-11)
  • Great Lakes Action Plan (tabling Q1 – 2010-11)
  • Improved Climate Change Scenarios (tabling Q1 – 2010-11)

Carry-overs were due to a variety of factors, such as complexity (whether anticipated in evaluating interdepartmental initiatives or uncovered during the design phase), external events, and delays in the approval process.

Percentage of Plan completed

82%

 

Evaluation recommendations followed up

41

Follow-up was conducted on 41 evaluation recommendations: of these, six had no documentation, one demonstrated some progress, 10 demonstrated moderate progress, 12 demonstrated significant progress, and 12 were implemented.

29 percent of the 41 followed-up recommendations were fully implemented.

A detailed report on the status of management action plans for evaluation is provided in Appendix D.


4.2 Professional and Practices Development

In April 2009, the TB Policy on Evaluation came into effect and has been a major driver of the activities of the Evaluation Division. While many aspects of the new Policy were already in place at EC, the new Policy also brought with it a number of new requirements for the Evaluation Division. To address these requirements, an implementation plan was created to establish timelines and milestones for the activities needed to ensure compliance with the new Policy.

As well, to enable the implementation of the Policy on Evaluation, the Department has developed and implemented a strategy whereby departmental managers are informed of the Policy's requirements pertaining to performance management. The Department continues to monitor management's responses to previous evaluations, and the Department's Evaluation Plan has been expanded to include a monitoring component related to performance measurement to support future evaluations.

Evaluation leadership has been demonstrated through participation on the TBS working group on competencies; hosting the second annual Environmental Evaluators Network (Canada) Forum; presentations at the Canadian Evaluation Society Annual National Conference, and to the Heads of Evaluation on best practices highlighted in the OAG Audit of Evaluation; and engagement in the development of guidelines and a handbook for Performance Measurement and Evaluation Plans for regulatory initiatives.


4.3 Follow-up on Recommendations and Management Action Plans

As noted above under 3.5, similar procedures have been used to manage and follow up on evaluation recommendations and to report regularly to the DEC on the status of the management actions taken in response to them. The Evaluation Division is currently transferring its evaluation follow-up recommendations to TeamMate. Follow-up on recommendations is conducted on an ongoing basis to ascertain the degree to which the action plans responding to recommendations made in previous evaluations have been implemented, and to determine outstanding gaps and risks. Progress reports on follow-ups are provided on a regular basis to the DEC and the Deputy Minister.


4.4 Other Accomplishments

In 2009-10, a progress reporting tool was developed and implemented to track the progress of evaluation projects against the risk-based evaluation plan, including timelines for completion and a rationale for deviations from the plan. The progress report is presented at every DEC meeting.


4.5 Client Surveys

After each evaluation, the Division sends surveys to clients in order to assess evaluation products and services. Evaluation Committee members are asked to provide feedback on the quality of evaluation products and services and on the value of the evaluations themselves, while ADMs with functional responsibility for the program that was evaluated are asked about the value of the evaluation. During this year, a question was added to both surveys asking whether "The evaluation contributed to information required in the context of program renewal." Surveys were sent out for eight evaluations; the Evaluation Division received responses from Evaluation Committee members for 100 percent of the evaluations, and from ADMs for 50 percent of the evaluations.

The Division received recurrent comments praising the professionalism and skill of the evaluators, the quality of evaluation reports, and the applicability and transferability of evaluation report findings, conclusions, and recommendations.

The Division also received repeated comments indicating problems in the working relationship with consultants, the quality of first drafts produced by consultants, and the length of the online publishing timeline. These statements reflect anomalous experiences this year with our consultants; the Division is aware of these risks and is managing them. In terms of the survey answers (which ranged from Strongly Disagree to Strongly Agree, plus "No Answer / Don't Know"), most were positive. For the questions on the quality of evaluation services and products (asked of Evaluation Committee members), the proportion of positive responses (i.e., those responding Agree or Strongly Agree) ranged from 79 to 100 percent. For the questions related to the value of the evaluation, the proportion of positive responses ranged from 50 to 93 percent for Evaluation Committee members and from 60 to 100 percent for ADM respondents.


4.6 Quality Assurance

The Evaluation Division undertakes an annual IQA process to ensure that the Division is meeting accepted professional standards. The IQA standards and criteria for assessment have traditionally been based on the IIA Standards in order to establish a relatively similar self-assessment process for both the audit and evaluation functions. Since the last IQA, however, both the Evaluation Policy and the Evaluation Standards have been updated, and an Environment Canada-specific Evaluation Policy has been developed. While the current IQA continues to reflect IIA standards where possible, it also integrates these new and updated evaluation policies and professional standards. The IQA is conducted by an Evaluation Manager within the Evaluation Division and validated by the Division Director. The IQA results for 2009-10 indicate that the Evaluation Division is meeting all identified standards.

The OAG's Fall 2009 Report examined the effectiveness of evaluation functions within a number of government departments, including EC. The Department was cited for good practices, including having formal processes to identify opportunities for improvement and a formal time reporting system. The report did, however, make two recommendations, both related to the need to develop and implement action plans to ensure that ongoing program performance information is collected to support effective evaluation.


4.7 Lessons Learned

Lessons Learned by Evaluation relevant to the overall Department:

Lessons Learned by Evaluation relevant to the AEB:


[3] The revised departmental Evaluation Policy was approved by the DEC in October 2009 and tabled for the Executive Management Committee (EMC) in January 2010.
