4.0   Findings


The findings of this evaluation are presented below, by evaluation issue (relevance and performance) and the related evaluation questions. The findings at the overall‑issue level are presented first, followed by the findings for each evaluation question.

A rating is also provided for each evaluation question based on a judgement of the evaluation findings. The rating symbols and their significance are outlined below in Table 3. A summary of ratings for the evaluation issues and questions is presented in Annex 3.

Table 3: Rating Symbols and Significance

Symbol | Significance
Achieved | The intended outcomes or goals have been achieved or met.
Progress Made; Attention Needed | Considerable progress has been made to meet the intended outcomes or goals, but attention is still needed.
Little Progress; Priority for Attention | Little progress has been made to meet the intended outcomes or goals, and attention is needed on a priority basis.
N/A | A rating is not applicable.
~ | Outcomes achievement ratings are based solely on subjective evidence.


4.1 Relevance

Evaluation Issue: Relevance

Overall Findings: The provision of meteorological services is legislated within the Department of the Environment Act. R&D, production, monitoring and service delivery activities in support of weather prediction are aligned with federal government priorities. Furthermore, by supporting the provision of quality meteorological services and information, these activities also support the departmental mandate of EC and several federal stakeholders, and address the need to inform Canadians about their changing environment to help ensure their safety and welfare.

4.1.1   Continued Need for Program


Evaluation Issue: Relevance: Q1. Continued need for activities

Indicator(s):

Methods:

Rating: Achieved


Multiple sources indicate that there is a continued need for R&D, production, monitoring and service delivery in support of weather prediction, because these activities allow for the provision of accurate weather prediction information and services, helping Canadians reduce risks posed by changing weather conditions. Moreover, there is a large and growing demand by Canadians for weather data.

4.1.2   Alignment with Federal Government Priorities and Departmental Strategic Outcomes


Evaluation Issue: Relevance: Q2. Alignment of activities with federal government priorities and departmental strategic outcomes

Indicator(s):

Methods:

Rating: Achieved


Evidence shows the importance of weather prediction and its supporting science activities to federal government priorities and departmental strategic outcomes.

4.1.3   Alignment with Federal Roles and Responsibilities


Evaluation Issue: Relevance: Q3. Legitimate and necessary role for government

Indicator(s):

Methods:

Rating: Achieved


Documents underline the legitimacy and necessity of the government’s role in undertaking scientific activity in support of weather prediction.  


4.2 Performance - Achievement of Expected Outcomes

Evaluation Issue: Performance - Achievement of Expected Outcomes

Overall Findings: The evaluation found evidence that the needs of end-clients of weather prediction products and services are being adequately addressed by weather prediction science decisions, as shown by positive results of the most recent public opinion survey, the satisfaction expressed by interviewed clients, and the high regard in which Canadian weather prediction science is held by external collaborators. Recent changes made to governance mechanisms are expected to address some of the challenges encountered with integration of S&TB and CIOB in weather prediction science priority setting and decision making. Similarly, although still in early stages of implementation, the QMS is already benefitting weather prediction science decision making by requiring a more systematic documentation of processes and mapping of internal/client relationships, thereby facilitating science knowledge transfer across the innovation chain. The main gaps and limitations found pertain to the transfer of science knowledge within EC, particularly in terms of communication between the service and other functions along the innovation chain, and the transfer of science from the MSC’s national laboratories into operations. Additionally, more could be done in terms of enhancing the weather prediction products delivered and the delivery mechanisms to increase access to products for external clients.

4.2.1 Meeting Internal and External Client Needs


Evaluation Issue: Achievement of Expected Outcomes: Q4. Internal and external client needs being met

Indicator(s):

Methods:

Rating: Progress Made; Attention Needed


In order to deliver improved services to external clients, those involved in the delivery of weather prediction need to be part of an integrated innovation chain, supported by appropriate infrastructure and processes and by collaboration with the global meteorological community. The needs of external clients of weather prediction are diverse, expanding and becoming increasingly sophisticated as evolving science and technology open up new possibilities. Evidence shows that EC is well regarded in the international meteorological community and is recognized as a valued contributor to scientific collaboration and a world leader in a number of key areas. Evaluation findings show that although traditional weather services are working well and continue to improve, more could be done to ensure that enhanced capabilities and data are shared with external clients in a manner that better supports decision making.

Expectations and Needs of Weather Prediction Clients

External Clients

Circumstances in which Weather Prediction Services are Meeting Client Needs

Circumstances in which Weather Prediction Services are not Meeting Client Needs

4.2.2 Effectiveness of Current Science Knowledge Transfer Practices


Evaluation Issue: Achievement of Expected Outcomes: Q5. Science knowledge being transferred

Indicator(s):

Methods:

Rating: Progress Made; Attention Needed


Departmental stakeholders involved in the delivery of weather predictions clearly identified with being part of an innovation chain, and recognized the benefits of collaboration both with internal colleagues and with the broader science community, domestically and internationally. Internally, the interaction between the research, development, production and monitoring functions appears strong. However, more could be done to strengthen communications with the service function, both to ensure that the external client perspective is adequately considered in science decisions and to give the service function a better understanding of emerging science, helping it anticipate what may be possible in the future. A gap was identified in the area of transferring science developed in the MSC’s national laboratories into operations.

Table 7: Knowledge Transfer Success Indicators and Ratings

(Rating scale: Achieved / Progress Made; Attention Needed / Little Progress; Priority for Attention)

Interactions between Stakeholders, including Engagement of the Target Audience
- Communication channels, processes and context between knowledge translation actors
- Working relationships among stakeholders
- An ongoing forum for sharing among stakeholders
- Opportunities for collaboration
- Shared vocabulary among stakeholders
- Knowledge being relevant to and understood by the target audience

Knowledge Use or Application
- Knowledge being used to inform decision making
Internal Knowledge Transfer

External Knowledge Transfer

4.2.3 Effectiveness of Priority Setting and Decision Making


Evaluation Issue: Achievement of Expected Outcomes: Q6. Effectiveness of priority setting and decision making

Indicator(s):

Methods:

Rating: Progress Made; Attention Needed


Several mechanisms are in place for weather prediction priority setting and decision making. These mechanisms appear to generally function well, but there is a perception, particularly among working-level managers and staff, of a lack of long-term coordinated vision and priority setting for activities involving weather prediction science research and IM/IT support functions. As is often the case for cross-cutting functions, engagement of S&TB and CIOB in decision making and priority setting has been challenging given their place in the organization (part of the WES Board, but not the MSC). The creation of MEPIC is seen by several stakeholders as a welcome solution to the absence of a hierarchical decision-making structure for science activities delivered across both branches and, combined with the ADMMSC’s expected launch of new signature projects, is expected to address the need for long-term coordinated vision and priority setting.

Existing Priority‑Setting and Decision‑Making Mechanisms [60]

Information Sources and Criteria used for Priority Setting and Decision Making

Adequacy and Effectiveness of Existing Mechanisms

When asked about their opportunities, as users of weather prediction products, services and information, to provide feedback and suggestions to decision makers regarding their needs and priorities, a majority of interview respondents commented positively on such opportunities.

4.2.4 Quality Management System


Evaluation Issue: Achievement of Expected Outcomes: Q7. Contribution of QMS to effective science knowledge transfer

Indicator(s):

Methods:

Rating: Progress Made; Attention Needed


Evidence indicates that the QMS has begun contributing to effective science knowledge transfer in support of weather prediction. Although implementation challenges have been encountered, some benefits are emerging, such as more systematic documentation of processes and mapping of internal/client relationships, thereby facilitating knowledge transfer across the innovation chain. There was an identifiable use of common language among interviewees involved in the delivery of weather prediction and, although it is not possible to say to what degree this is attributable to the QMS, there was a clear recognition by these interviewees that they are part of an innovation chain aimed at improving service to end users. In general, interviewees perceived that the QMS was most useful for areas of work with clear, measurable and targetable objectives (e.g., operations), while implementation of the QMS was perceived to be more challenging in areas dealing with ideas or concepts that are difficult to quantify or measure (e.g., policy, high‑level strategic planning).

Quality Management System Implementation [71]

Circumstances in Which QMS Works

Circumstances in Which QMS Does Not Work Well

Link to Knowledge Transfer and the Innovation Chain

4.2.5 External Factors (to Weather Prediction Innovation Chain)


Evaluation Issue: Achievement of Expected Outcomes: Q8. External factors

Indicator(s):

Methods:

Rating: N/A


The stability and overall level of government funding is an important factor, external to the weather prediction innovation chain, affecting decision making and science activities in support of weather prediction. Availability of financial resources also plays a role in other external factors affecting weather prediction, including challenges in obtaining IT resources and skilled human resources generally, as well as the need for more robust supercomputing and monitoring infrastructure.


4.3 Performance - Demonstration of Efficiency and Economy

Evaluation Issue: Performance - Demonstration of Efficiency and Economy

Overall Findings: Evidence suggests that the level of effort expended for weather prediction activities yields commensurate or better value for given resources. However, concerns regarding resources and sustainability were noted. Weather prediction science-related activities were generally perceived by a number of interviewees as being efficient; however, further collaboration with others (e.g., other organizations and/or other countries) and sharing of infrastructure have been highlighted as potential means of pursuing further efficiencies.

4.3.1 Efficiency and Economy


Evaluation Issue: Efficiency and Economy: Q9. Value for money and efficiency

Indicator(s):

Methods:

Rating:


Figure 4: Relative Spending on Meteorological Services [88]


21 EKOS Research Associates Inc., National WES Products and Services Survey 2007, Final Report, April 2007, p. 4.

22 Convention of the World Meteorological Organization, 2007 (1947), p. 9.

23 OAG, 2008 December Report: Chapter 2--Managing Severe Weather Warnings, p. 3.

24 Finance Canada, Budget 2010: Leading the Way on Jobs and Growth, March 4, 2010, p. 106.

25 EC, Environment Canada’s Proposed 2010-11 Program Activity Architecture.

26 EC, Environment Canada’s Sustainable Development Strategy 2007-2009, 2006, p. 4.

27 MSC, "Forecasts for Canadians, Foresight for Canada - Articulating a Vision and Strategic Direction for Environment Canada’s Weather and Environmental Services," 2008, p. 7.

28 Ibid., p. 7.

29 Ibid, p. 20.

30 Ibid, p. 8-10.

31 "Stormy weather ahead: getting the word out," Regina Leader-Post, July 14, 2010, p. B9.

32 EC Weather and Environmental Services Board, 2010-11 Departmental Priorities, April 2010, (Presentation deck), p. 6.

33 National Academy of Sciences, "Fair Weather: Effective Partnerships in Weather and Climate Services," 2003, p. 25.

34 MSC, "Forecasts for Canadians, Foresight for Canada," 2008, p. 10.

35 EC, Nowcasting R&D Strategy, 2010-2015, undated, p. 13.

36 Allison Kerry et al., Benchmarking Study on Business Models for the MSC-Final Report, June 4, 2009, p. 5, 20.

37 EC, 2007-2008 Departmental Performance Report, 2008, p. 33.

38 EC, "Canadian Meteorological Private Sector," April 19, 2010. http://www.ec.gc.ca/meteo-weather/default.asp?lang=En&n=8DB56DCD-1

39 Independent Review of the Research and Development Program of the MSC, p. 3.

40 Environment Canada’s Science Plan, vol. 2: Supporting Documents, 2007, p. 27.

41 EC, S&TB, Measuring Environment Canada’s Research and Development Performance, 2009, p. 17. http://www.ec.gc.ca/doc/scitech/mecrdp_e.html

42 EKOS Research Associates Inc., National WES Products and Services Survey 2007, Final Report, April 2007, p. 83.

43 OAG, 2008 December Report: Chapter 2--Managing Severe Weather Warnings, p. 6 (2.13).

44 MSC, "Forecasts for Canadians, Foresight for Canada," 2008, p. 6.

45 EC, 2006-2007 Departmental Performance Report, 2007, p. 44.

46 EC Weather and Environmental Services Board, Weather and Environmental Services, 2010-11 Departmental Priorities, April 2010, (Presentation deck), p. 11.

47 WES Board, "Delivery of Weather and Environmental Services (OPG 2B2), Priorities and Challenges for 2009-10," Presented by Diane Campbell, March 31, 2009, (presentation deck), p. 2, 6.

48 OAG, 2008 December Report: Chapter 2--Managing Severe Weather Warnings, p. 20.

49 EC, "Canadian Meteorological Private Sector," April 19, 2010. http://www.ec.gc.ca/meteo-weather/default.asp?lang=En&n=8DB56DCD-1

50 Kerry et al., Benchmarking Study, p. 22.

51 Graham et al., "Lost in Knowledge Translation."

52 Indicators were derived from: Colleen M. Davison, "Knowledge Translation: Implications for Evaluation," in Judith M. Ottoson and Penelope Hawe, editors, Knowledge Utilization, Diffusion, Implementation, Transfer, and Translation: Implications for Evaluation - New Directions for Evaluation 124: 82-83.

53 WES Board, "Delivery of Weather and Environmental Services (OPG 2B2), Priorities and Challenges for 2009-10," March 31, 2009, p. 5.

54 EC IM&IT Plan, CIOB, "Simply the Best," Version 1.08, December 11, 2008, p. 10.

55 Davison, "Knowledge Translation: Implications for Evaluation," p. 79-82.

56 MSC, MRD, Computing Requirements for Regional Science Divisions and Associated Labs, November 2009.

57 OAG, 2008 December Report: Chapter 2--Managing Severe Weather Warnings.

58 EKOS Research Associates Inc., National WES Products and Services Survey 2007, Final Report, April 2007, p. 19.

59 Ibid., p. 83.

60 The ensuing description of existing priority‑setting and decision‑making mechanisms was developed using information from several sources, including key informant interviews as well as terms of reference and records of decisions for the various committees listed.

61 WES Board, Proposed 2010-11 PAA Governance S.O. 2.0 "Canadians are equipped to make informed decisions on changing weather, water and climate conditions," revised September 30, 2009.

62 NWPs (numerical weather predictions) are produced by inputting weather data into mathematical models, whereas non-NWPs are produced using other scientific means.

63 MSC, TTAC - Terms of Reference, 2010.

64 Friday, Jr. (Panel Chair) et al., Independent Review, p. 18.

65 Ibid.

66 MSC, Strategic Plan 2003-2012, Research and Development Program, p. iv.

67 Kerry et al., Benchmarking Study, p. 22.

68 OAG, 2008 December Report: Chapter 2--Managing Severe Weather Warnings, p. 8.

69 EC, Audit of Governance of Specialized IT Resources, 2010, p. 5-7.

70 EC Weather and Environmental Services Board, Weather and Environmental Services, 2010-11 Departmental Priorities, April 2010, (Presentation deck).

71 Sources include interview findings and the following two documents: MSC - QMS Office, WES Quality Management System (QMS), March 18, 2008 (Presentation deck), slides 3 and 4 [labelled as draft at bottom of slides]; Environment Canada. 2007-2008 Report on Plans and Priorities, p. 35.

72 EC, Measuring Environment Canada’s Research and Development Performance, 2009, p. 22.

73 Sources include interview findings and the following document: MSC, Performance Management, Joanne Volk, Weather and Environmental Services Board Quality Management System: Management Review (Presentation deck), slide 4, May 26, 2009. 

74 WES Board, PAA, PMF, QMS, MRSS. Beyond the Alphabet Soup: Information Session for Employees and Managers of the Weather and Environmental Services (WES) Program (Presentation deck), slide 18 (labelled as draft at bottom of slides), March 2010.

75 Ibid., slide 21.

76 MSC, Performance Management, Joanne Volk, Weather and Environmental Services Board Quality Management System: Management Review, (Presentation deck), slide 4. 

77 MSC, "Forecasts for Canadians, Foresight for Canada," p. 11.

78 Kerry et al., Benchmarking Study, p. 6.

79 Ibid., p. 22.

80 MSC, The Meteorological Service of Canada’s People Plan 2009-2012, June 5, 2009.

81 MSC, "Forecasts for Canadians, Foresight for Canada," p. 11-12.

82 Kerry et al., Benchmarking Study, p. 22.

83 OAG, 2008 December Report: Chapter 2--Managing Severe Weather Warnings.

84 EC, 2007-2008 Report on Plans and Priorities, p. 39.

85 Kerry et al., Benchmarking Study, p. 22.

86 MSC, "Forecasts for Canadians, Foresight for Canada," p. 21.

87 Kerry et al., Benchmarking Study, p. 21.

88 Ibid., p. 21. Note: Amounts are presented in U.S. dollars.

89 Ibid., p. 21.

90 Ibid., p. 20.

91 Ibid., p. 20.

92 Ibid., p. 23.
