
Evaluation of the Services to Marine Transportation Sub-activity

Evaluation Design

3.1 Purpose and Scope

The purpose of this evaluation was to examine the relevance and performance of the SMT SA.17 This included investigating and reporting on:

  • The continued need for these services, and their alignment with government priorities (i.e., relevance);
  • The extent to which performance data are being collected and how well the SA is delivering on its outcomes, including the degree to which users' needs have been met (i.e., performance – achievement of outcomes);
  • The degree to which the SA is delivering its services in an economical and efficient manner (i.e., performance – demonstration of economy and efficiency).

The evaluation data collection phase began in November 2010 and continued until April 2011. During this period, there were periodic updates with representatives in the Audit and Evaluation Branch to share documents and preliminary analyses, and, where possible, conduct joint interviews.

In terms of scope, the SMT SA relies on a broad base of weather prediction services provided by other groups within the MSC and S&T Branch and supported by CIOB, as well as collaboration with other organizations. Although these activities provide important contextual information to better understand the SMT operations, they are not included within the scope of the evaluation per se.

Excluded from the scope of the evaluation are the services provided by the CIS relating to pollution monitoring, prevention and enforcement. These activities relate to the PAA element 3.1.2.2 – Marine Pollution.

Additionally, the new METAREAs (Meteorological Areas) Initiative in the Arctic, which involves the expansion of EC's domestic marine and ice services to provide a full suite of meteorological information for two of five new Arctic METAREAs established by the IMO, was being developed immediately prior to the evaluation data collection, and was therefore not sufficiently advanced in terms of outcomes to be included within the scope of this evaluation. Individuals did comment frequently on northern issues, however, and these are discussed in this evaluation as appropriate.

3.2 Evaluation Approach and Methodology

Four key considerations, outlined below, played a role in the design of the evaluation approach and methodology used for this evaluation.

First, as noted previously, this evaluation was initiated as a joint review and evaluation. This involved a joint planning phase with audit, including the initial risk assessment/scoping interviews being conducted by the audit and evaluation project leads, and the presentation of a joint review and evaluation plan to the responsible ADM for sign-off. Additionally, throughout the project, data collection was streamlined where feasible to reduce the burden on participating stakeholders.

Second, this evaluation was a pilot for conducting evaluations at the PAA level. As such, methods were informed by the fact that this evaluation cut across two components within the PAA, with different stakeholders, program delivery methods and end-users.

Third, other recent evaluations have been conducted that offered insight into the SMT SA. This evaluation builds upon these previous evaluations, including relying on recent evidence as presented in these reports, to the degree that this was appropriate and feasible.

Finally, the evaluation made extra efforts to understand the perspective of end-users of EC's services to marine transportation, because this understanding was required to adequately assess the degree to which the SMT SA is achieving its outcome of ensuring that marine communities have the weather, wave and ice information that they need.

With these considerations in mind, the following three methods were used in the conduct of this evaluation, ensuring that each evaluation question was addressed with multiple lines of evidence:

  • Document review – The evaluation team requested all pertinent program documents, such as program planning and legislative documents, Acts of Parliament, minutes from EC Board meetings and performance measurement data collected by the program. To facilitate document review, members of the evaluation team were granted access to the SMT shared drive. Financial data were also obtained from EC's Financial Management Branch. EC's Evaluation of Research and Development, Production and Monitoring in Support of Weather Prediction,18 and DFO's evaluation of its Icebreaking Program19 were also reviewed. Documents received by either the evaluation or the audit team were shared with the other.
     
  • Open-ended interviews – A long-form interview guide (see Annex B) was administered to 29 individuals (detailed below), including those involved in the delivery of SMT as well as end-users of its services. Interviewees who participated in both the review and evaluation components were given the choice of one joint interview or two separate interviews. The purpose of the open-ended interviews was to engage individuals who were knowledgeable about the SMT SA's activities and impacts through their professional association with EC's SMT SA services. Key informants representing the marine weather information and ice information components of the SA were interviewed. In the majority of cases, the key informant was knowledgeable about one of these two components, and few were able to speak to the SA as a whole. Table 4 provides a detailed breakdown of key informant interviewees.
    Table 4: Key Informant Interviewees: Long-form Interview Guide (text description)

    Key Informant Type                       | Number Interviewed
    MSC Policy and Planning                  | 3
    Marine Program Management                | 7
    Ice Program Management                   | 1
    Marine Services                          | 4
    Ice Services                             | 3
    Regional Operations (Marine and Ice)     | 4
    Clients (Other Government Departments)   | 3
    End-Users (Industry and Shipping)        | 4
    Total                                    | 29
  • Questionnaire for End-Users – A questionnaire was administered to end-users of SMT SA services. The purpose of the questionnaire was to obtain information on the use of and satisfaction with SMT SA services among mariners who were the target clients of these services. The evaluation team was provided with a list of key industry and non-governmental organization contacts. These individuals were contacted by telephone initially and, if willing to participate, were sent an email providing greater detail on the purpose of the evaluation, along with a copy of the guide in both official languages. Participants had the option of either filling out responses directly on the paper copy and faxing it back to EC Audit and Evaluation Branch officials, or arranging for an interview with a branch official. The vast majority (27 out of 33) chose to fill out the guide and fax or mail their responses back to EC. Table 5 indicates the distribution of respondents to the end-user questionnaire.
     
    Table 5: End-user Type: Questionnaire (text description)

    End-user Type                       | Number of Respondents
    Recreational mariner / sport fisher | 7
    Professional mariner                | 2
    Professional fish harvester         | 18
    Ship manager or operator            | 3
    Other                               | 3
    Total                               | 33

3.3 Limitations of the Evaluation

The SMT program is a new entity in the 2010-11 PAA, in which all service activities targeting marine transportation have been grouped under a common results structure. As such, this program entity does not yet benefit from a fully articulated program logic and clear governance structure.20 In order to address this, the evaluation team developed a logic model with activities, outputs and outcomes for the purpose of the evaluation, which may not represent the final indicators selected to measure performance for the SA. The evaluation team made efforts, however, to ensure that the logic model was an effective tool for assessing performance for the SA, by basing it on existing performance documents and validating it with the evaluation steering committee and senior management. Furthermore, where available and relevant, additional performance data beyond measures for the identified outputs and outcomes were also assessed.

Although the number of respondents to the questionnaire exceeded the evaluation team's original goal of 30, the results of the end-user questionnaire cannot be viewed as a statistically valid representation of SMT end-users. The sample size was too small to provide statistical validity, and it was skewed toward the views of the east coast fishing industry, with almost two-thirds of respondents coming from this group. This over-representation appears to be because the east coast fishing organizations contacted for the evaluation were extremely effective in distributing the questionnaire to their membership. The findings from this line of evidence, therefore, should be viewed as illustrative and treated with caution. To compensate, the evaluation also relied on other sources of evidence, including the results of a previously conducted survey of end-users.


17. An evaluation matrix, including evaluation questions, indicators and methods/sources, is included in Annex A.

18. EC, Audit and Evaluation Branch. January 2011. Evaluation of Research and Development, Production and Monitoring in Support of Weather Prediction.

19. DFO. April 2011. Evaluation of the Icebreaking Program.

20. Responsible managers on the Marine and Ice Program Management Board are, as of the publication of this report, undertaking the development of a program logic model, performance measures and clear governance structure specific to the SMT SA.
