Evaluation of the Services to Marine Transportation Sub-activity




3 Evaluation Design

3.1 Purpose and Scope

The purpose of this evaluation is to examine the relevance and performance of the SMT SA.17 This included investigating and reporting on the evaluation issues and questions set out in the evaluation matrix in Annex A.

The evaluation data collection phase began in November 2010 and continued until April 2011. During this period, periodic updates were held with representatives of the Audit and Evaluation Branch to share documents and preliminary analyses and, where possible, to conduct joint interviews.

In terms of scope, the SMT SA relies on a broad base of weather prediction services provided by other groups within the MSC and the S&T Branch, supported by CIOB, as well as on collaboration with other organizations. Although these activities provide important context for understanding SMT operations, they are not themselves included within the scope of the evaluation.

Also excluded from the scope of the evaluation are the services provided by the CIS relating to pollution monitoring, prevention and enforcement. These activities fall under PAA element 3.1.2.2 – Marine Pollution.

Additionally, the new METAREAs (Meteorological Areas) Initiative in the Arctic, under which EC is expanding its domestic marine and ice services to provide a full suite of meteorological information for two of the five new Arctic METAREAs established by the IMO, was still in early development immediately prior to the evaluation data collection. It was therefore not sufficiently advanced, in terms of outcomes, to be included within the scope of this evaluation. Interviewees nonetheless commented frequently on northern issues, and these are discussed in this evaluation as appropriate.


3.2 Evaluation Approach and Methodology

Four key considerations, outlined below, shaped the design of the evaluation approach and methodology.

First, as noted previously, this evaluation was initiated as a joint review and evaluation. This involved a joint planning phase with audit, in which the initial risk assessment/scoping interviews were conducted jointly by the audit and evaluation project leads and a joint review and evaluation plan was presented to the responsible ADM for sign-off. Additionally, throughout the project, data collection was streamlined where feasible to reduce the burden on participating stakeholders.

Second, this evaluation was a pilot for conducting evaluations at the PAA level. As such, methods were informed by the fact that this evaluation cut across two components within the PAA, with different stakeholders, program delivery methods and end-users.

Third, other recent evaluations offered insight into the SMT SA.18,19 This evaluation builds upon these previous evaluations, relying on recent evidence presented in their reports to the degree that this was appropriate and feasible.

Finally, the evaluation made additional efforts to understand the perspective of end-users of EC’s services to marine transportation, as this is necessary to adequately assess the degree to which the SMT SA is achieving its outcome of ensuring that marine communities have the weather, wave and ice information they need.

With these considerations in mind, three methods (a document review, key informant interviews and an end-user questionnaire) were used in the conduct of this evaluation, ensuring that each evaluation question was addressed with multiple lines of evidence.


3.3 Limitations of the Evaluation

The SMT program is a new entity in the 2010-11 PAA, in which all service activities targeting marine transportation have been grouped under a common results structure. As such, this program entity does not yet benefit from a fully articulated program logic and clear governance structure.20 To address this, the evaluation team developed a logic model with activities, outputs and outcomes for the purpose of the evaluation; this model may not correspond to the final logic model and indicators ultimately selected to measure performance for the SA. The evaluation team made efforts, however, to ensure that the logic model was an effective tool for assessing the performance of the SA by basing it on existing performance documents and validating it with the evaluation steering committee and senior management. Furthermore, where available and relevant, performance data beyond measures for the identified outputs and outcomes were also assessed.

Although the number of questionnaire respondents exceeded the evaluation team’s original goal of 30, the results of the end-user questionnaire cannot be viewed as a statistically valid representation of SMT end-users. The sample was too small to support statistical inference, and it was skewed toward the views of the east coast fishing industry, with almost two-thirds of respondents coming from this group. This over-representation appears to reflect the fact that the east coast fishing organizations contacted for the evaluation were particularly effective in distributing the questionnaire to their membership. The findings from this line of evidence should therefore be viewed as illustrative and treated with caution. To compensate, the evaluation also relied on other sources of evidence, including the results of a previously conducted survey of end-users.



17. An evaluation matrix, including evaluation questions, indicators and methods/sources, is included in Annex A.

18. EC, Audit and Evaluation Branch. January 2011. Evaluation of Research and Development, Production and Monitoring in Support of Weather Prediction.

19. DFO. April 2011. Evaluation of the Icebreaking Program.

20. Responsible managers on the Marine and Ice Program Management Board are, as of the publication of this report, undertaking the development of a program logic model, performance measures and clear governance structure specific to the SMT SA.

