2.0 Background


This section provides an overview of the program, a profile of resource allocation and a summary of results from previous related assessments.

2.1 Program Profile

2.1.1 Organizational Structure

The Meteorological Service of Canada (MSC) is the organizational branch within EC with responsibility for producing and disseminating weather and environmental services, including those related to weather prediction. The MSC also relies on the Atmospheric Science and Technology Directorate (ASTD) of the Science and Technology Branch (S&TB) as a key contributor to meteorological research and development (R&D), and the Major Projects and Supercomputing Directorate (MPSD) of the Chief Information Officer Branch (CIOB) for key activities related to information management and information technology (IM/IT).4

The MSC is organized into four main directorates.

Other services to support the Department’s meteorological operations are provided by:

The Weather and Environmental Services (WES) QMS provides the framework for these six directorates to work together in an integrated manner. A QMS is a set of policies, processes and procedures required for planning and execution (production / development / service) in the core business areas of an organization.

The WES QMS identifies the accountability of each Director General (DG) with respect to their functional role. This functional accountability is centred on one of six overarching business processes, which at times cross organizational boundaries: strategic support to senior management; monitoring; production (also referred to internally as "prediction"); services; R&D; and information management and information technology (IM/IT) service delivery. The QMS accountability also recognizes process-to-process interaction, e.g., where the R&D process interacts with the production process, and was designed to promote efficient and effective seamless management of WES programs in an integrated, end-to-end fashion.

Management oversight of these activities and functions is provided by a QMS Steering Committee, a senior executive committee chaired by the Assistant Deputy Minister (ADM)-MSC and composed of the responsible DGs, and a DG Leads Committee that meets biweekly throughout the year. Ultimate accountability rests with the WES Board,6 a senior departmental committee composed of ADMs and Regional DGs.

2.1.2 Clients of Environment Canada’s Weather Prediction Services

In addition to providing support for policy development within the Department, EC’s weather prediction services serve a broad and diverse group of external clients. As outlined in the 2010-11 Program Activity Architecture (PAA) structure, the MSC’s operations are broadly divided among: i) services to address the public good; and ii) services for targeted users (including services to aviation, marine transportation, and economic sectors, and the Department of National Defence).  

More specifically, the MSC’s clients include:

The MSC has a small base of cost‑recovery clients; key among these are NAV CANADA, the Department of National Defence, and Fisheries and Oceans Canada.

2.1.3 Context

In March 2003, the MSC began a five-year transition initiative aimed at revitalizing and transforming its operations to ensure their sustainability. It received new funding of approximately $74 million over five years to undertake this process, as well as $5 million per year thereafter. The objectives of this project, known as the Transition Project, were to help the MSC become a more sustainable organization, to advance science- and weather-related services, and to help Canadians adapt in ways that safeguard security, health and safety, economic prosperity and environmental quality.7 This included the establishment of national research laboratories collocated with each of the five consolidated Storm Prediction Centres, intended to enhance the transfer of scientific knowledge to forecast production operations and thus strengthen linkages between science, forecast production and service to the public. The scientific research of the laboratories was intended to benefit not only forecasters across the MSC, but also research organizations within and outside of Canada. In 2008, EC conducted an evaluation of this initiative, which is further discussed in section 2.3.

In 2004-05, EC underwent an internal reorganization. Previously, the MSC managed its own IT and core research services. This reorganization, or transformation, moved key functions in the production and delivery of weather prediction into separate branches, such as IT into CIOB8 and R&D into S&TB. Consequently, planning and priority setting are now done through a results-based management structure, overseen by the WES Board, which reports to the Department's Executive Management Committee.

In 2008, EC obtained certification for its weather and environmental services function under ISO (International Organization for Standardization) 9001 following the implementation of its QMS. A QMS serves to integrate all of an organization's internal processes into a single system (process) approach for product/service delivery, thereby allowing the organization to identify, measure, control and improve core processes with the goal of improving performance.


2.2 Resource Allocation

Weather Prediction

Approximately $205 million was spent on weather prediction activities in 2009-10. These expenditures were spread across the three branches responsible for weather prediction (MSC, S&TB and CIOB), with most (86%) being spent by the MSC.  

The MSC’s work related to climate change was excluded from the scope of the evaluation. Although related to weather, climate change involves work on much longer time scales. On the other hand, other areas such as air quality, seasonal prediction and ice monitoring were included in the analyses conducted for this evaluation, as they are integral to and very closely interrelated with weather prediction, and it is not possible to easily extract them from weather prediction activities.

"Science" in Support of Weather Prediction9

The intended focus of this evaluation is on "science" activities in support of weather prediction. As such, this evaluation does not cover a specific program or group of programs. Rather, it targets a cross-cutting function (science) that involves several organizational units within the Department in support of one category of services, weather prediction, delivered by the MSC.   

Following consultation with departmental managers, it was acknowledged that "science" comprises several different activities, including research, development, production and monitoring (hence the title for this evaluation). For the purpose of this evaluation, "science" was classified in two main categories:

  1. R&D, which in the context of weather prediction science refers to fundamental research, including modelling, developing predictions and specialized R&D--roles that are primarily conducted in S&TB but also to some extent in the MSC division responsible for prediction development.

  2. Related science activities, which in the context of weather prediction science refer to validation, improvements and optimization of models and tools, operationalization of core science, and monitoring--roles that are conducted in the prediction development, production and monitoring divisions of the MSC and, to a lesser extent, by CIOB’s support function.

As shown in Figure 1, approximately 9% of total weather prediction expenditures in 2009-10 was spent on R&D, 50% on related science activities, and 41% on non-science activities (e.g., enabling functions).

Figure 1: Weather Prediction Expenditures (2009-10) Total: $205.7M


Figure 2 shows the distribution of expenditures on R&D, related science activities, and non-science in support of weather prediction, by branch.10 The chart depicts the majority of weather prediction expenditures as occurring within the MSC. As expected, most of the core scientific activity takes place in S&TB, as its main mandate is the production of new knowledge; on the other hand, as a departmental enabler, CIOB engages only in related scientific and non-scientific activity.

Figure 2: Distribution of WP Expenditures by Branch (2009-2010)



2.3 Previous Assessments

This evaluation took place following two other assessments of MSC activities, namely: i) an EC evaluation of the MSC Transition Project and ii) an audit by the Office of the Auditor General (OAG) included in the December 2008 Report of the Commissioner of the Environment and Sustainable Development, entitled Managing Severe Weather Warnings. Although these studies differed in scope and objectives from this evaluation, both touched upon the issue of the adequacy and sufficiency of the science and technology underlying weather predictions, and both assessments also examined the success with which weather information reaches Canadians. Additionally, an EC Audit of Governance of Specialized IT Resources conducted in 2009 explored issues related to the delivery of highly specialized IM/IT services within the Department. In exploring these same issues, the current evaluation draws on and expands upon the evidence and conclusions from these previous assessments.

Evaluation of the MSC Transition Project

An evaluation of the MSC Transition Project, conducted by EC, was completed in 2008.11 The evaluation found that the Project addressed needs to modernize EC’s weather services, and successfully resulted in the consolidation of weather offices, with a focus on high‑impact weather, increased training and recruitment of staff, the introduction of new products and services, and the life‑cycle management and modernization of some equipment. Activities requiring further attention included the decommissioning and cleanup of some climate and hydrometric sites, the development and implementation of quality assurance processes and data management frameworks, modernization of the monitoring networks, and succession planning. 

The evaluation recommended that the Department articulate and communicate its strategic direction and action plan for ensuring the continued sustainability of the MSC and that the measurement, monitoring, tracking and reporting of performance and financial information be strengthened.12 A management action plan was developed in response to the evaluation recommendations and is being implemented by the Department.

External Audit of Severe Weather Warnings

An audit of severe weather warnings was conducted by the Commissioner of the Environment and Sustainable Development (CESD), and published in 2008. This audit examined the management of the weather observation network, forecaster technologies, and "the delivery and verification of severe weather warnings."13 

The audit found several deficiencies in the life‑cycle management of the MSC's monitoring network assets, and also noted an inconsistent and incomplete approach to verifying the quality of severe weather warnings. As a result, the audit argued that the MSC lacked sufficient information on the performance of its assets to make investment decisions, and was unable to verify the timeliness, accuracy or effectiveness of its severe weather warnings. The management response to both of these items stated that the implementation of ISO certification for its systems and procedures (already under way while the audit was being conducted) would result in formalized approaches to life-cycle management of assets and performance monitoring and measurement. The audit further critiqued the degree to which the MSC provided opportunities for, and incorporated, user feedback. It also indicated that the MSC could increase the number of channels through which warnings were distributed.14

A management action plan was developed in response to the audit recommendations and is being implemented by the Department.

Other Related Assessment

An audit of governance of specialized IT resources was also conducted in 2009 by EC’s Audit and Evaluation Branch. Where applicable, findings from this audit were taken into account in this evaluation.


4 MSC, Annual Report 2006.

5 EC, S&TB, April 23, 2010: http://www.ec.gc.ca/scitech/default.asp?lang=en&n=58F9D2F5-1#astd.

6 The design and delivery of specific elements of EC’s outcomes and priorities are the responsibility of three Priority Management Boards: Ecosystem Sustainability, Environmental Protection, and Weather and Environmental Services.

7 EC, Evaluation of the Meteorological Service of Canada (MSC) Transition Project, 2008, p.1.

8 Certain highly specialized skill sets were not migrated to CIOB but instead continued to be carried out within program areas, by staff known as "embedded IT staff" or by "embedded computer systems."

9 The figures in this section are based on 2009-10 departmental expenditure data for WES Board outcome projects under the 2009-10 PAA. The figures for R&D and "related science activities" in support of weather prediction were calculated using percentage estimates of the level of R&D, related science activities and non-scientific activity occurring within each outcome project. All estimates were informed by scoping interviews and have been validated by program-level individuals. 

10 Figure 2 excludes some resources (representing less than 1%), which are allocated to other branches beyond CIOB, S&TB and the MSC. This results in a small discrepancy between the totals in Figure 1 and Figure 2. 

11 EC, Evaluation of the MSC Transition Project, 2008, p. i.

12 Ibid.

13 Office of the Auditor General, 2008 December Report of the Commissioner of the Environment and Sustainable Development: Chapter 2-Managing Severe Weather Warnings-Environment Canada, 2008, p. 1.

14 Ibid.
