This page has been archived on the Web

Information identified as archived is provided for reference, research or recordkeeping purposes. It is not subject to the Government of Canada Web Standards and has not been altered or updated since it was archived.

2006-2008 Literature Review Archives - Policy/Mitigation

Anderson, K. and A. Bows. 2008. Reframing the climate change challenge in light of post-2000 emission trends. Philosophical Transactions of the Royal Society A. doi:10.1098/rsta.2008.0138. 20 pp.
Emissions reductions required to stabilize atmospheric concentrations of carbon dioxide equivalents at levels that might limit warming to 2-3°C are shown to be extremely unlikely. The authors advise planning to adapt to at least 4°C of warming.
Two researchers with the U.K. Tyndall Centre for Climate Change Research have published a paper that contributes to the discussion and assessment of anthropogenic emission paths leading to different atmospheric GHG stabilization levels. Anderson and Bows undertake two sets of analyses. In the first analysis, they evaluate possible CO2 emission paths from energy and industry sources to meet stabilization at 450 ppm CO2-eq within constraints imposed by CO2 emissions from deforestation, current and near-term emission trends for both CO2 and non-CO2 gases, and by allowable cumulative emissions to meet stabilization at 450 ppm. (Any eventual stabilization concentration is determined primarily by the amount of accumulated anthropogenic emissions up to the time of stabilization.) The cumulative emissions totals used in this analysis were from the IPCC Fourth Assessment Report and took into account carbon cycle feedbacks, which lower the allowable emissions relative to the case with no carbon-cycle feedbacks. In addition to stabilization at 450 ppm, they analyzed a second suite of emission paths with prescribed annual CO2 emission reductions from energy and industrial sources beginning in 2020, leading to stabilization of atmospheric concentrations of CO2-eq in the range of 550-650 ppm. The same constraints regarding emissions from deforestation, emissions of non-CO2 gases, and current CO2 energy/industry emissions were employed in this analysis as well. The results suggest that a complete decarbonization of the economy by sometime in the second half of the century is required to stabilize atmospheric CO2-eq concentrations at 450 ppm. Even stabilizing at 550 ppm would require annual emission reductions that the authors state are without historical precedent.
They conclude that stabilization much below 650 ppm CO2-eq is improbable and that adaptation planning should be guided by this, rather than by the likely temperature increases associated with lower stabilization levels. Stabilization at 650 ppm CO2-eq would lead to an equilibrium temperature rise of about 4°C (using the IPCC best estimate for climate sensitivity). The conclusions of Anderson and Bows regarding the mitigation challenge of meeting low stabilization goals are similar to those of the IPCC. It is also worth noting that modeling and estimation of carbon cycle feedbacks is an area of active research. The estimates in the IPCC AR4 for allowable cumulative emissions to meet stabilization at 450 ppm, taking carbon cycle feedbacks into account, should be considered preliminary.
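The central constraint in the analysis above is that the eventual stabilization concentration is set mainly by cumulative emissions, so the start year and rate of emission reductions jointly fix the total. A minimal sketch of that budget logic, using entirely hypothetical emission values rather than the paper's scenarios:

```python
# Sketch (hypothetical numbers): cumulative CO2 emitted when emissions
# grow until a chosen start year, then decline at a constant annual rate.
# Illustrates why a later start and a slower decline both inflate the
# cumulative total that determines the stabilization concentration.

def cumulative_emissions(e0, growth, start_year, decline, horizon=2100, year0=2008):
    """Sum annual emissions (PgC) from year0 through horizon.

    e0      -- emissions in year0 (PgC/yr), an assumed value
    growth  -- fractional growth per year before reductions begin
    decline -- fractional reduction per year after start_year
    """
    total, e = 0.0, e0
    for year in range(year0, horizon + 1):
        total += e
        e *= (1 + growth) if year < start_year else (1 - decline)
    return total

# Delaying the peak from 2015 to 2030 raises the cumulative total, so a
# later start demands a faster decline to stay within the same budget.
early = cumulative_emissions(10.0, 0.02, 2015, 0.03)
late = cumulative_emissions(10.0, 0.02, 2030, 0.03)
print(round(early), round(late))  # late > early
```

The same function makes it easy to ask what decline rate compensates for a given delay, which is the trade-off both this paper and Mignone et al. (below in this archive) explore.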

Baettig, M.B., M. Wild and D. Imboden. 2007. A climate change index: where climate change may be most prominent in the 21st century. Geophysical Research Letters, Vol. 34, L01701, doi:10.1029/2006GL028159.
A climate change index (CCI) is derived that provides a single measure of the strength of future climate change relative to today's natural variability. Canada and other high latitude regions show relatively high values for the CCI, but when a country's Human Development Index is considered along with its CCI, it is in the tropics that the countries most vulnerable to climate change are found.
The authors of this study have developed an aggregate measure for the strength of future climate change relative to natural variability. Their index is composed of annual and seasonal temperature and precipitation indicators which each measure some change in extreme events (e.g. change in occurrence of 1 in 20 year hot/wet/dry event). The method is based on the assumption that changes in extremes are likely to have the strongest impacts on environmental and social systems. The index is calculated based on gridded global temperature and precipitation projections from 3 GCMs (ECHAM5, HadCM3 and CGCM2) using both moderate (SRES B2) and high (SRES A2) emission scenarios. The results indicate that climate will change most strongly relative to natural variability in the high latitudes and the tropics. When the CCI is averaged for each country, and then compared to the Human Development Index - a measure of socio-economic development - the vulnerability of many tropical countries to future climate change is made evident.
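The core of the CCI is expressing each projected change relative to present-day natural variability and then aggregating the indicators into one number. A rough sketch of that aggregation, with made-up values and an assumed equal weighting (the paper's exact indicator definitions, based on 1-in-20-year event frequencies, differ):

```python
# Sketch of the CCI idea (illustrative values, assumed equal weighting):
# express each projected change relative to today's natural variability
# and average the per-indicator ratios into a single index.

def climate_change_index(changes, variabilities):
    """Mean of per-indicator signal-to-variability ratios.

    changes       -- projected changes (e.g. delta-T, delta-P by season)
    variabilities -- present-day natural variability of each quantity
    """
    ratios = [abs(c) / v for c, v in zip(changes, variabilities)]
    return sum(ratios) / len(ratios)

# High latitudes: large warming against modest variability -> high CCI.
arctic = climate_change_index([4.0, 0.3], [1.5, 0.4])
# Tropics: smaller warming, but very low natural variability -> also high.
tropics = climate_change_index([2.0, 0.2], [0.6, 0.15])
print(round(arctic, 2), round(tropics, 2))
```

The tropical case illustrates the paper's key point: even modest absolute changes register strongly where natural variability is small.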

Crutzen, P.J. 2006. Albedo Enhancement by stratospheric sulphur injections: A contribution to resolve a policy dilemma? Climatic Change 77: 211-219.
In a recent editorial essay in the journal Climatic Change, renowned chemist Paul Crutzen contemplates the use of geoengineering to resolve earth's warming issues. According to Crutzen, winner of the 1995 Nobel Prize in chemistry for his work on ozone depletion, artificially enhancing the earth's albedo through stratospheric sulphur injections could help cool the climate and counterbalance the warming caused by increased CO2 emissions. Based on the 0.5°C cooling observed following the Mount Pinatubo eruption in 1991, researchers have calculated that the cooling effect generated by 5.3 Tg of sulphur would be sufficient to counteract the increased 4 W/m² greenhouse warming from CO2 doubling. Crutzen is well aware of the various dilemmas - technical, political and ethical - associated with such a project, but nonetheless advocates that the idea be considered and tested in small-scale atmospheric tests. Crutzen cautions that stratospheric sulphur injections should be a last resort to the problem of global warming, only to be used if all other options to reduce greenhouse gases have been exhausted. Crutzen's paper generated a significant number of editorial comments from within the scientific community; although opinion was mixed, most agreed that the idea should not be ignored, while the consensus was that priority should be placed on treating the cause, not the symptoms.
Related Editorial comments: Bengtsson, L. 2006. Geo-engineering to confine climate change: Is it at all feasible? Climatic Change 77: 229-234; Cicerone, R.J. 2006. Geoengineering: Encouraging research and overseeing implementation. Climatic Change 77: 221-226; Kiehl, J.T. 2006. Geoengineering climate change: Treating the symptom over the cause? Climatic Change 77: 227-228; Lawrence, M.G. 2006. The Geoengineering dilemma: To speak or not to speak. Climatic Change 77: 245-248; MacCracken, M.C. 2006. Geoengineering: Worthy of cautious evaluation? Climatic Change 77: 235-243.

Danielsen, F., H. Beukema, N.D. Burgess et al. 2008. Biofuel Plantations on Forested Lands: Double Jeopardy for Biodiversity and Climate, Conservation Biology, DOI: 10.1111/j.1523-1739.2008.01096.x.
Replacing high-carbon and high-biodiversity forest or peatland with biofuel plantations, in an effort to reduce the use of fossil fuels, will accelerate both climate change and biodiversity loss.
An international group of scientists from seven countries recently published the most comprehensive analysis yet of the impact of oil-palm plantations on greenhouse gas emissions and biodiversity in south-east Asia. Using published figures, they assessed changes in carbon stocks when various natural landscapes were converted to oil-palm plantations. They also considered GHG emissions throughout the process of producing and transporting the biofuel and from its ultimate combustion. These carbon stock changes were then compared with the amount of fossil-fuel carbon emissions avoided through use of the biofuel. The results showed that only when oil-palm plantations replaced degraded land were there benefits from replacing fossil fuel use with biofuel. Planting oil palms on degraded grassland would lead to a net removal of carbon within 10 years. However, it would take 75-93 years for the carbon emissions saved through use of biofuel to compensate for the carbon lost through tropical forestland conversion by logging (75 years) or burning (93 years). In the case where the original habitat was peatland, 692 years would be required to compensate for carbon lost through land conversion. The authors also compared the biodiversity of animal and plant species between forests and oil-palm plantations. They found that species richness of birds, lizards, mammals and plants is always lower in oil-palm plantations than in forest. Major components of the forest vegetation were completely absent from plantations and there was no sign of regeneration. These findings suggest that replacing high-carbon and high-biodiversity forest or peatland with oil-palm monocultures in an effort to reduce the use of fossil fuels will accelerate both climate change and biodiversity loss.
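The 75-692 year figures above follow from a simple "carbon payback" calculation: the carbon released by land conversion divided by the net fossil carbon the biofuel displaces each year. A sketch with illustrative numbers (not the paper's data):

```python
# Sketch of the carbon payback logic behind the 75-692 year figures.
# The carbon-debt and annual-saving values below are illustrative
# assumptions, not figures from the paper.

def payback_years(carbon_debt, annual_saving):
    """Years for annual fossil-fuel savings to repay conversion losses.

    carbon_debt   -- carbon released by land conversion (t C/ha)
    annual_saving -- net fossil carbon displaced by the biofuel (t C/ha/yr)
    """
    return carbon_debt / annual_saving

# Degraded grassland stores little carbon, so the debt is small (or
# negative: planting adds carbon) and is repaid within years; peatland
# releases so much stored carbon that repayment takes centuries.
print(payback_years(150, 2))   # forest-like debt: 75.0 years
print(payback_years(1400, 2))  # peat-like debt: 700.0 years
```

This ratio is why the same plantation can be a climate benefit on degraded land and a large net source on converted peatland.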

Doniger, D.D., A.V. Herzog, D.A. Lashof. 2006. An Ambitious, Centrist Approach to Global Warming Legislation. Science 314: 764-765.
While it is not within our normal scope to review policy papers, we did want to bring to your attention a short piece published in the November 3, 2006 edition of the journal Science. Doniger, Herzog and Lashof discuss an approach to climate change legislation that shows how delaying emissions reductions for 20 years will be more costly than beginning promptly. They also discuss how this could be implemented through legislation and the use of an emissions trading program in the United States.

Hansen, J. et al., (2007), Dangerous human-made interference with climate: a GISS modelE study, Atmos. Chem. Phys., 7, 2287-2312.
A well-publicized paper by NASA scientist James Hansen and a suite of co-authors makes headlines with news that the Earth is approaching a dangerous tipping point. The paper itself is a synthesis of new modeling work combined with an analysis of existing science on the impacts of climate change. The news coverage was a little misleading in that the paper does not present any new science for identifying critical levels of climate change. However, Hansen and co-authors present well-reasoned arguments, supported by science, for limiting global warming to about 1°C above year-2000 levels.
The objective of the United Nations Framework Convention on Climate Change (UNFCCC) is to stabilize greenhouse gas (GHG) concentrations at a level that would avoid dangerous anthropogenic interference with climate. What would constitute 'dangerous' climate change has been the subject of much debate. In a recent paper, a large team of authors from multiple institutes, led by James Hansen of NASA's Goddard Institute for Space Studies, investigates this issue, presenting a "mini-synthesis" of their own and other scientists' work on future projections and impacts of climate change, and on emission reduction scenarios to limit global warming. The future climate is simulated by the atmospheric climate model GISS modelE, coupled with an ocean model, for three IPCC emission scenarios corresponding to the near-maximum (A2), minimum (B1) and midrange (A1B) cases, and for two alternative scenarios: one that keeps further global warming under 1°C and one that keeps it to about 2°C. The global warming in all the IPCC scenarios exceeds 1°C by 2100. The authors focus their discussion of the impacts of climate change on three specific cases: Arctic climate change, tropical storm intensification, and ice sheet and methane hydrate stability. They conclude that in all three cases, limiting future warming to 1°C above year-2000 levels significantly reduces the risks of triggering dangerous anthropogenic climate change. The authors also note that, judging from recent Earth history, if the warming is less than 1°C, strong positive feedbacks are not unleashed, but if global warming gets well outside this range, positive feedbacks could set in motion climate changes far outside the range of recent experience. They conclude that a CO2 level exceeding about 450 ppm is 'dangerous' but that limiting atmospheric CO2 to this level, in combination with decreases in non-CO2 forcings, could be sufficient to avoid dangerous climate change.

Harvey, L.D.D. 2007. Allowable CO2 concentrations under the United Nations Framework Convention on Climate Change as a function of the climate sensitivity probability distribution function. Environmental Research Letters Vol 2, doi:10.1088/1748-9326/2/1/014001.
University of Toronto researcher Danny Harvey, in a new study published in the British scientific journal Environmental Research Letters, suggests that, if the world is to avoid dangerous human interference with the climate system, atmospheric CO2 concentrations much above current levels may not be allowable. In fact, if we assume that the critical threshold for avoiding such danger is a temperature rise of between 1.5°C and 2.5°C above pre-industrial levels and that the risk of exceeding this must be kept below 10%, it may already be too late to avoid danger.
In his study, Dr. Harvey uses a series of probability analyses to estimate, under varying assumptions, at what critical thresholds of atmospheric CO2 concentrations the risks of danger would become unacceptable. Under the most stringent assumptions - an acceptable risk of 1% of exceeding a critical danger threshold set at 1.5°C warming, with non-CO2 greenhouse gas concentrations stabilized at today's level - CO2 concentrations would need to be limited to between 270 ppm (if climate sensitivity is as high as 8°C per CO2 doubling) and 330 ppm (low climate sensitivity of 2°C/doubling). Since our current concentration is ~380 ppm, such stringent conditions are already unattainable. Under the most generous assumptions used in the study - a 10% risk of exceeding a critical threshold of 2.5°C warming - the permissible range increases to 310 ppm (high sensitivity) to 590 ppm (low sensitivity). Harvey concludes that current CO2 concentrations already represent or are close to representing unacceptable risk of dangerous climate change.
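The dependence on climate sensitivity in these ranges comes from the standard relation that equilibrium warming scales with the logarithm of the CO2 concentration ratio. A CO2-only sketch of that relation (Harvey's published limits are lower than this calculation because non-CO2 gases add forcing and because a safety margin keeps the risk of exceedance below 1-10%):

```python
import math

# Sketch of the underlying relation: equilibrium warming
# dT = S * log2(C / C0), where S is the climate sensitivity (warming
# per CO2 doubling) and C0 the pre-industrial concentration (~280 ppm).
# Inverting gives the CO2 level whose equilibrium warming equals a
# chosen threshold. This CO2-only calculation overstates the allowable
# concentration relative to Harvey's results, which also account for
# non-CO2 forcing and for keeping exceedance risk low.

def allowable_co2(threshold_c, sensitivity_c, c0=280.0):
    """CO2 (ppm) whose equilibrium warming equals threshold_c (deg C)."""
    return c0 * 2 ** (threshold_c / sensitivity_c)

# Higher sensitivity -> lower allowable concentration, which is why the
# paper's ranges run from high-sensitivity to low-sensitivity limits.
print(round(allowable_co2(2.5, 2.0)))  # low sensitivity
print(round(allowable_co2(2.5, 8.0)))  # high sensitivity
```

The asymmetry matters for policy: if sensitivity turns out to be high, the allowable concentration falls below levels already reached.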

Kurz, W.A., Stinson, G., Rampley, G. et al., 2008. Risk of natural disturbances makes contribution of Canada's forests to the global carbon cycle highly uncertain. PNAS 105:1551-1555.
A new study into the future role of fire and insect disturbances in Canada's boreal forests concludes that it is highly likely that these forests will become a significant net source of carbon dioxide during the first Kyoto Protocol reporting period of 2008-2012. This has important implications for forest management programs aimed at reducing net emissions of greenhouse gases into the atmosphere.
In 2007, Canadian Forest Service scientists published a scientific paper indicating that fire and insect disturbances are likely to cause Canadian forests to become a net source of atmospheric CO2 in the coming decades. The same team of researchers, led by Werner Kurz, has now published a sequel that looks more closely at the related implications for emissions in the first Kyoto Protocol (KP) reporting period of 2008-2012, and beyond to 2022. They use the latest version of the Carbon Budget Model of the Canadian Forest Sector (CBM-CFS3) to simulate a broad range of plausible future scenarios for net sources and sinks of greenhouse gases related to Canadian forests, using a Monte Carlo-type approach. Probability distributions were then developed from the resulting 5000 estimates of nation-wide emissions for 2008-2012. All 5000 estimates project Canadian forests to be a net source in the first reporting period. The magnitude of this source varies between 30 and 245 Mt of CO2. At the upper end, this is equal to about 30% of Canada's current anthropogenic emissions of all greenhouse gases. The model simulations indicate that Canada's forests were still a net sink for CO2 as recently as 2001. However, by 2002, the mountain pine beetle infestation in western Canada had changed this into a net source. This infestation, expected to peak in 2009, will have related effects on emissions through decay, wildfire and accelerated harvesting for several decades. Meanwhile, another outbreak of spruce budworm infestation is expected soon in eastern Canada. The authors note that current KP guidelines for emission reporting fail to provide incentives for developing good forest management practices that enhance sinks if a country already has significant natural sources of greenhouse gases. Since this is the case for Canada, it has elected not to include results of good forest management practices in its KP reporting of emissions.
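The Monte Carlo approach described above amounts to drawing disturbance levels at random for each run, computing the resulting net carbon balance, and building a distribution over many runs. A minimal sketch of that structure (the disturbance distributions below are illustrative stand-ins, not CBM-CFS3 outputs):

```python
import random

# Minimal sketch of a Monte Carlo disturbance ensemble (illustrative
# uniform distributions, not the CBM-CFS3 model): draw fire and insect
# disturbance levels for each run, compute the net forest carbon
# balance, and examine the distribution of outcomes across 5000 runs.

random.seed(42)  # reproducible ensemble

def simulate_net_source():
    """Net CO2 source (Mt) for 2008-2012 under one random disturbance draw."""
    growth_sink = random.uniform(100, 160)  # uptake by forest growth
    fire = random.uniform(20, 160)          # fire emissions
    insects = random.uniform(150, 260)      # beetle/budworm-driven losses
    return fire + insects - growth_sink     # positive = net source

runs = [simulate_net_source() for _ in range(5000)]
share_net_source = sum(r > 0 for r in runs) / len(runs)
print(round(min(runs)), round(max(runs)), share_net_source)
```

With the assumed ranges here, every draw is a net source, mirroring the paper's qualitative result that all 5000 simulated futures showed Canadian forests as a source in 2008-2012.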

McLeman, R. and Smit, B. 2006. Migration as an adaptation to climate change. Climatic Change, 76:31-53.
The authors have developed a conceptual model in which theories of vulnerability, adaptive capacity and exposure to climatic changes combine with migration theory. In response to a climatic stimulus, the authors posit community-level adaptation where community resources allow, in addition to household-level responses like migration or cooperation. The 1930s Oklahoma-to-California migration of many families is examined to see if it fits their conceptual model (which it does). Importantly, the authors point out the importance of different kinds of capital - economic, social, physical - in determining whether families chose to migrate, and to where. As an example, social capital in the form of friends and family in California modified the response of some Oklahomans to climate change, as did their ownership of capital like land, cars and farm equipment (or lack thereof). The case study is said to show the need to qualify sweeping predictions of mass migration in response to climate change, because migration responses are likely to be tempered by numerous physical and social factors. The study also made clear the maladaptive effect of some depression-era Oklahoma agricultural policies, showing the need for coherence between different policy responses by governments.

Mignone, B.K., R.H. Socolow, J.L. Sarmiento, and M. Oppenheimer. 2008. Atmospheric stabilization and the timing of carbon mitigation. Climatic Change 88(3-4):251-265. doi:10.1007/s10584-007-9391-8.
With suitably intensive emission reductions in the future, some postponement of mitigation can occur while still realizing a targeted atmospheric CO2 concentration. However, delays of more than two or three decades would preclude achieving the target of not exceeding a CO2 doubling from pre-industrial (about 550 ppm).
Previous work has shown that, no matter what threshold CO2 concentration target (atmospheric outcome) might be selected as the basis for international climate policy, there is more than one emission scenario that would lead to achievement of the target. Some scenarios, for example, would allow postponement of emission reductions in the short term with consequent greater rates of emission reduction required in the future, while others would do the opposite. The purpose of this paper, published in the journal Climatic Change, was to examine further the issue of timing of emission reductions. The authors select a stabilization target of 550 ppm CO2 as the basis for their analysis. In one set of scenarios, they fixed the rate of future emissions decline at 1%/year. In these scenarios, the achievement of the stabilization target is thus determined by the number of years beyond today that emission reductions are postponed. In the second set of analyses, the rate of emissions decline is varied (between 0.5% and 3%/year) in order to show how near-term postponement and future increases in the intensity of mitigation can be traded off to achieve the same atmospheric goal. In all scenarios, emissions increase from present values by 0.2 Pg C/year each year until the mitigation scenario forces emission rates to fall. They show that, with a fixed rate of emission decline, immediate emission reductions allow achievement of a target of about 475 ppm CO2. However, each additional year of delay increases the achievable target by about 9 ppm. Thus, stabilization below a pre-industrial doubling (~550 ppm) will require that mitigation begin within a decade. When the emission rate is flexible, they find that future intensification of mitigation can offset postponement. However, even in these cases, delays of more than two or three decades permanently remove the option to stabilize atmospheric CO2 concentrations below 550 ppm.
The results suggest to the authors that a practical transition path to a stabilization target in the range of 550 ppm CO2 can allow, at most, one or two decades of delay.
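The "within a decade" conclusion for the fixed-decline scenarios follows directly from the two figures quoted above: immediate mitigation achieves about 475 ppm, and each year of delay adds about 9 ppm to the achievable target. Worked out:

```python
# Arithmetic from the paper's fixed-decline scenarios: immediate
# mitigation achieves ~475 ppm, and each year of postponement raises
# the achievable stabilization target by ~9 ppm. Solving for 550 ppm
# recovers the "begin within a decade" conclusion.

IMMEDIATE_TARGET = 475.0   # ppm, mitigation starting now
PPM_PER_YEAR_DELAY = 9.0   # ppm added per year of postponement

def achievable_target(delay_years):
    """Lowest stabilization target (ppm) reachable after a given delay."""
    return IMMEDIATE_TARGET + PPM_PER_YEAR_DELAY * delay_years

def max_delay(target_ppm):
    """Longest postponement (years) still consistent with a target."""
    return (target_ppm - IMMEDIATE_TARGET) / PPM_PER_YEAR_DELAY

print(max_delay(550))  # ~8.3 years: mitigation must begin within a decade
```

The linear relation is of course an approximation the paper derives for its particular 1%/year decline scenarios, but it makes the cost of each year of delay concrete.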

Nusbaumer, J and K. Matsumoto. 2008. Climate and carbon cycle changes under the overshoot scenario. Global and Planetary Change. 62:164-172.
Overshoot scenarios could potentially have 'dangerous' consequences as early as 50 years after the peak of emissions. These consequences are larger when the rate of emissions growth is faster and the overshoot peak is higher, no matter by how much emissions are reduced after the peak.
Some modeling studies investigating alternate pathways for reducing GHG emissions have employed "overshoot scenarios" - scenarios in which the CO2 concentration in the atmosphere temporarily exceeds some pre-defined "dangerous" threshold before being reduced to non-dangerous levels. In a recent study, two American scientists quantified the change in several climatic and environmental variables under a variety of overshoot scenarios, using a global climate model of intermediate complexity coupled with a carbon model to allow the representation of carbon-climate feedbacks. The control scenario is based on the IPCC WRE S650 scenario, in which atmospheric emissions peak in 2056 and CO2 concentrations increase until stabilization at 650 ppm in the year 2200. The overshoot scenarios also eventually stabilized atmospheric CO2 at 650 ppm and peaked emissions in 2056, but differed in the rate of emissions growth before peaking, the amplitude of the emissions peak (all of which caused concentrations to overshoot the 650 ppm target), and in peak duration. In all cases (control and overshoot scenarios), cumulative emissions at the end of the experiment (year 2300) were about the same. The results of their experiments show that over the long term, atmospheric CO2 concentration, ocean and surface temperature, air-sea CO2 flux, surface ocean pH, and large-scale ocean overturning are nearly the same for all scenarios. This is due largely to the fact that the climate in all scenarios is afforded enough time (more than 200 years) to approach equilibrium. However, on a shorter time scale, the transient changes in the atmospheric and oceanic variables were significantly larger for all overshoot scenarios compared to the control scenario. The magnitude of the changes was related to the peak emissions level - the higher the overshoot, the larger the consequences.
Thus, the rate of emissions between now and the time of peak emissions (2056 in this case), and the magnitude of the peak itself, are most important, although in the long term cumulative emissions also matter. For the authors, this implies that our perspective on time is important in the debate over overshoot scenarios and the inter-generational equity they attempt to achieve.

Prabhakar, S., Srinivasan, A. and R. Shaw. 2008. Climate change and local level disaster risk reduction planning: need, opportunities and challenges. Mitig Adapt Strateg Glob Change, DOI 10.1007/s11027-008-9147-4.
Strategic thinking is needed to incorporate climate change into disaster risk reduction planning.
Disaster risk management planning involves understanding hazards, vulnerabilities and potential losses, and developing appropriate preparedness and mitigation strategies to reduce those losses. Climate change on a global scale moves slowly, but at the local level it may have a significant impact on hazards such as severe weather. This paper therefore argues that climate change will bring new risks to existing disaster risk reduction planning and management approaches. The authors propose an approach that brings together different stakeholders to develop case studies that reflect long-term local impacts of climate change, and to build the capacity of local stakeholders to make decisions under uncertainty. They also propose a scheme to bring together experts from the climate, disaster and policy communities to start a dialogue on understanding the wider aspects of long-term risk reduction at the local level. Appropriate tools and techniques will have to be developed to help people identify and deal with the impacts of climate change in their area. With this knowledge, local players will be able to design climate-proof disaster risk reduction plans and understand vulnerabilities in their locations. The scheme recognizes that learning, and involving community groups in disaster risk reduction planning, are necessary to address the uncertainty of climate change impacts.

Robock, A., L. Oman and G.L. Stenchikov. 2008. Regional climate responses to geoengineering with tropical and Arctic SO2 injections. Journal of Geophysical Research, Vol. 113, D16101, doi:10.1029/2008JD010050, 15 pp.
A global climate model is used to investigate the consequences of injecting sulphate aerosols into the stratosphere. Global warming can be reduced, but regional precipitation patterns are altered with implications for food and water supplies. Very rapid warming occurs if the geoengineering initiative is halted, highlighting one of the risks of such an approach in the absence of parallel reductions in greenhouse gas emissions.
Geoengineering solutions to global warming are under consideration in light of the continued growth in emissions of greenhouse gases. One category of geoengineering approaches involves reducing the amount of incoming solar radiation as a means to induce cooling of the climate system. Of the possibilities here, injection of aerosols into the stratosphere has received particular attention of late. This study by Robock et al. reports on the first comprehensive GCM experiment to evaluate the transient climate system response to stratospheric injection of sulphate aerosols. Forty-year experiments were run involving either climate change alone (using the IPCC A1B scenario) or climate change in combination with geoengineering. For the geoengineering runs, SO2 was injected at a constant rate (which varied between experiments) for 20 years, into either the tropical or Arctic stratosphere. Emissions were then shut off and the experiments were continued for another 20 years to measure the ongoing response of the climate system. The results show that enhancing stratospheric aerosol concentrations does reduce global warming, with greater reductions for larger SO2 injections. Arctic SO2 injections have a smaller impact than tropical injections, due in part to the shorter lifetime of Arctic aerosols. Therefore, if enough aerosols could be put into the stratosphere on a continuous basis, they would cool the planet and even reverse global warming. However, global warming resumes very rapidly as soon as the geoengineering experiment halts, such that the rate of warming exceeds that projected under the A1B scenario. This is a reminder of the risks of relying on such initiatives to mitigate global warming. The study also investigated impacts on other aspects of climate, at regional as well as global scales. Of significance are the findings that tropical injection of sulphate aerosols led to changes in the African and Asian monsoons, with serious implications for food and water supplies in these areas.
That Arctic geoengineering had repercussions well beyond the Arctic is a result that does not support proposals to use Arctic geoengineering to limit the melting of summer sea ice.

Searchinger, T., R. Heimlich, R.A. Houghton et al. 2008. Use of US croplands for biofuels increases greenhouse gases through emissions from land use change. Published online February 7, 2008; doi:10.1126/science.1151861 (Science Express Reports); Fargione, J., J. Hill, D. Tilman et al. 2008. Land clearing and the biofuel carbon debt. Published online February 7, 2008; doi:10.1126/science.1152747 (Science Express Reports).
Two new studies show that greenhouse gas emissions from land use change involved in biomass production can far exceed the emissions reductions achieved when using this biomass for energy instead of fossil fuels. However, net emissions can be reduced if the biofuel comes from crop wastes or is grown on abandoned farmland.
One of the greenhouse gas emission reduction policies being implemented by many Annex I countries (particularly the USA and the European Union) is that of switching from fossil fuels to biofuels. In theory, this could reduce net CO2 emissions because the carbon contained in the biomass used to generate biofuels comes from atmospheric carbon recently taken up by plants through photosynthesis. However, recent studies have suggested that the net CO2 emissions from generation of energy from biomass are very dependent on the method of biomass production. New studies undertaken independently by two teams of American researchers, published in Science, further add to this debate. Both conclude that net life-cycle emissions from biofuels, when the biomass has been produced on croplands newly converted from native ecosystems like rainforests, peatlands, savannas or grasslands, may be significantly greater than those that would have been released by combustion of the fossil fuels they replace. This is primarily because of the CO2 released from the native soils during conversion and subsequent cropping. For example, worldwide land use change to produce corn-based ethanol could double net greenhouse gas emissions over the next 30 years relative to those from the fossil fuels it replaces. In some tropical regions, the increase would be much higher. Even energy from switchgrass, if grown on productive US cornlands, would increase net emissions. On the other hand, net life-cycle emissions from biomass produced from crop wastes or grown on abandoned agricultural lands can be significantly less than those from the fossil fuels they replace. The authors caution that policies for using biofuels to reduce net greenhouse gas emissions must take these land use change factors into account.

Sterman, J.D. and Sweeney, L.B. 2007. Understanding public complacency about climate change: Adults' mental models of climate change violate conservation of matter. Climatic Change 80:213-238.
An experiment using graduate students at MIT showed that the majority of subjects failed to apply basic laws of physics when asked to match CO2 emission trajectories to different scenarios for stabilization of atmospheric CO2. Participants thus mistakenly expected rapid stabilization of atmospheric CO2 in response to emission reductions. If such beliefs are commonly held, they help explain the 'wait-and-see' approach to climate change favoured by many people.
Surveys have shown that most Americans favour a wait-and-see policy towards the reduction of GHG emissions despite agreeing that climate change poses serious risks. A recent American study, involving MIT graduate students, sought to explain this contradiction. Various atmospheric CO2 stabilization scenarios were presented to the subjects, who were asked to describe the emission trajectories required to achieve them. Results showed that the majority of subjects used a pattern-matching approach, whereby a decrease in emissions automatically leads to decreased atmospheric CO2 concentrations and a subsequent reduction in global mean temperature. In doing so, however, they ignored basic laws of physics. Consequently, the participants in the study concluded erroneously that a reduction in emissions would lead to a rapid drop in CO2 concentrations and mean global temperature, ignoring the long delays in the response of the climate system. Participants also believed that stabilization of atmospheric CO2 was possible as long as emissions were decreasing, even if emissions into the atmosphere exceeded removal rates. Since these beliefs support wait-and-see policies, the authors believe that similar beliefs are likely widely held by the American public.

Tilmes, S., R. Müller and R. Salawitch. 2008. The sensitivity of polar ozone depletion to proposed geoengineering schemes. Science Express, 24 April 2008. doi:10.1126/science.1153966.
Study shows that continuous injection of sulphur into the stratosphere to offset global warming would cause a drastic increase in the extent of Arctic ozone depletion during the next century and a delay of between 30 and 70 years in the expected recovery of the Antarctic ozone hole.
A number of geoengineering schemes have been proposed as options for alleviating the consequences of global warming. The consequences of one such scheme - continuous injection of sulphur into the stratosphere - were investigated in terms of the impact on polar ozone depletion. In this paper, the impact of artificially enhanced sulphate aerosol on future chemical ozone depletion is quantified using empirical relationships between ozone depletion and the potential for chlorine activation. The authors consider two sulphate aerosol injection scenarios beginning in the year 2010, with yearly injection of sulphate into the stratosphere: 2 TgS/year of large aerosols and 1.5 TgS/year of small aerosols (which are more efficient at cooling). They also considered a no-injection scenario (which used the sulphate content of the stratosphere in the year 2000). The risk of future polar ozone loss under these scenarios was assessed for selected meteorological conditions: a recent very cold Arctic winter, a moderately cold Arctic winter, and a typical Antarctic winter. The results show that injection of sulphur into the stratosphere would lead to a drastically thinner ozone layer in the Arctic, especially during very cold winters and with injection of the smaller aerosols. For the Antarctic, the primary effect would be to delay the recovery of the ozone hole to conditions that prevailed in 1980 (the reference year) until at least the last decade of this century. However, other important effects are not considered in this study, such as the potential strengthening of the polar vortex caused by enhanced stratospheric aerosol, which could lead to even greater ozone loss than estimated, or earlier stratospheric warming at the end of Arctic winters, which could lessen the ozone loss. In conclusion, the authors suggest that comprehensive chemistry-climate model simulations are needed to fully evaluate the impact of geoengineering on stratospheric ozone.

Van Vuuren, D.P., M. Meinshausen, G.K. Plattner et al. 2008. Temperature increase of 21st century mitigation scenarios. PNAS 105(40): 15258-15262.
New estimates of 21st century global warming for a range of mitigation scenarios show that even with the most stringent stabilization scenarios, an average minimum warming of ~ 1.4°C is projected. This value is above the previously estimated "committed warming" based on climate system inertia alone.
Progress in developing multigas mitigation scenarios now allows an assessment of the impact of climate policies on the range of projected warming. In this paper, the authors compile a set of baseline and associated mitigation scenarios, from a group of Integrated Assessment Models, which address emissions of all major greenhouse gases (CO2, CH4, N2O and halocarbons), aerosols (SO2) and tropospheric ozone precursors (CO, NOx, VOCs). Atmospheric greenhouse gas concentrations (reported as equivalent CO2 concentration), radiative forcing and temperature increase for the 21st century were then simulated using two reduced-complexity climate models (MAGICC and Bern2.5CC). The choice of models allowed an exploration of the sensitivity of the results to different assumptions about climate sensitivity and carbon cycle feedbacks. In the baseline (no climate policy) scenarios, GHG emissions by year 2100 range from about 70% to almost 250% of year 2000 levels, which is broadly consistent with the IPCC-SRES range. The mitigation scenarios follow a pattern in which global emissions peak between 2020 and 2040, at a maximum of 50% above current levels. The mitigation scenarios were classified into categories according to their radiative forcing target, with most falling within two categories: stabilization at 4.5 W/m2 relative to preindustrial conditions, and at < 3.5 W/m2 (the lowest scenarios). The emission reductions in 2050 vary considerably as a function of the stabilization target. For the 4.5 W/m2 target, year 2050 emissions are 2-30% lower than year 2000 emissions, but for the category of lowest scenarios, emissions are 20-60% lower than in year 2000. Projected temperature changes by year 2100 for the full range of mitigation scenarios show an increase of 0.5-4.4°C over 1990 levels, or 0.3-3.4°C less than in the no-policy cases.
This range captures differences in the stringency of mitigation policy as well as uncertainty in our understanding of the climate system. The most stringent stabilization scenarios produced an average warming of 1.4°C (with a full range of 0.5-2.8°C). This value, the authors say, can be considered a more realistic estimate of committed warming, since it reflects technological and economic inertia. It is substantially above the previously estimated committed warming based on climate system inertia alone (0.2-1.0°C), which assumed no further mitigation constraints. These results reconfirm that, although ambitious mitigation efforts can significantly reduce global warming, adaptation measures will be needed in addition to mitigation to reduce the impact of the residual warming.

Wilbanks, T.J. and J. Sathaye. 2007. Integrating mitigation and adaptation as responses to climate change: a synthesis. Mitigation and Adaptation Strategies for Global Change 12: 957-962.
Integrating mitigation and adaptation into a single portfolio of climate change response options is now well recognized. Preliminary findings about the nature of such portfolios include the need to emphasize options that offer synergies, recognition of the potential for unintended indirect consequences, and consideration of capacities for implementation.
A special issue of the journal Mitigation and Adaptation Strategies for Global Change (volume 12) focuses on the practical reality of integrating mitigation and adaptation in climate change response policy. A summary paper by Wilbanks and Sathaye draws on the findings of the issue's 16 peer-reviewed articles to offer some preliminary conclusions about appropriate mitigation and adaptation portfolios. First, it is important to distinguish options that offer complementarities and synergies (e.g., tree planting and other biomass sink preservation/conservation, adaptation options that also reduce emissions) from those that have secondary climate change impacts (e.g., if climate change stimulates disease vector transmission, a response might be to add filters and space conditioning to protect against those vectors, which could require more energy). Several articles point to the opportunity of integrating sustainable development pathways into this portfolio of mitigation and adaptation options. Finally, the need to consider not just what to do but also the capacity for implementation is discussed, an area made more difficult by the differing geographic scales of these options - mitigation policies are often discussed and implemented at a global or national scale, whereas adaptation is more relevant and necessary at a local scale. This series of papers advances the discussion of integrating mitigation and adaptation options from a theoretical level to a practical one.
