Good news for the earth’s climate system?
Written by RealClimate
Monday, 29 March 2010 13:34

Guest Commentary by Jim Bouldin (UC Davis). How much additional carbon dioxide will be released to, or removed from, the atmosphere by the oceans and the biosphere in response to global warming over the next century? That is an important question, and David Frank and his Swiss coworkers at WSL have just published an interesting new approach to answering it. They empirically estimate the distribution of gamma, the temperature-induced carbon dioxide feedback to the climate system, given the current state of knowledge of reconstructed temperature, and carbon dioxide concentration, over the last millennium. It is a macro-scale approach to constraining this parameter; it does not attempt to refine our knowledge about carbon dioxide flux pathways, rates or mechanisms. Regardless of general approach or specific results, I like studies like this. They bring together results from actually or potentially disparate data inputs and methods, which can be hard to keep track of, into a systematic framework. By organizing, they help to clarify, and for that there is much to be said.

Gamma has units of ppmv per ºC. It is thus the inverse of climate sensitivity, where CO2 is the forcing and T is the response. Carbon dioxide can, of course, act as both a forcing and a (relatively slow) feedback; slow at least when compared to faster feedbacks like water vapor and cloud changes. Estimates of the traditional climate sensitivity, e.g. Charney et al. (1979), are thus not affected by the study. Estimates of more broadly defined sensitivities that include slower feedbacks (e.g. Lunt et al. (2010), Pagani et al. (2010)) could be, however.
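Spelled out in symbols (my own shorthand, not notation from the paper), the two quantities are slopes running in opposite causal directions:

```latex
% Gamma measures the CO2 response to warming; classical sensitivity measures
% the warming response to CO2 forcing (notation mine, not the paper's).
\gamma \;\approx\; \frac{\Delta \mathrm{CO_2}}{\Delta T}
\quad \left[\mathrm{ppmv\ per\ ^{\circ}C}\right],
\qquad
S \;\approx\; \frac{\Delta T}{\Delta F_{\mathrm{CO_2}}}
\quad \left[{}^{\circ}\mathrm{C\ per\ unit\ CO_2\ forcing}\right].
```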

Existing estimates of gamma come primarily from analyses of coupled climate-carbon cycle (C4) models (analyzed in Friedlingstein et al., 2006), and a small number of empirical studies. The latter are based on a limited set of assumptions regarding historic temperatures and appropriate methods, while the models display a wide range of sensitivities depending on assumptions inherent to each. Values of gamma are typically positive in these studies (i.e. increased T => increased CO2).

To estimate gamma, the authors use an experimental (“ensemble”) calibration approach, analyzing the time courses of reconstructed Northern Hemisphere T estimates and ice core CO2 levels from 1050 to 1800 AD. This period represents a time when both high-resolution T and CO2 estimates exist, and in which the confounding effects of other possible causes of CO2 fluxes are minimized, especially the massive anthropogenic input since 1800. That input could completely swamp the temperature signal; the authors’ choice is thus designed to maximize the likelihood of detecting the T signal on CO2. The T estimates are taken from the recalibration of nine proxy-based studies from the last decade, and the CO2 from three Antarctic ice cores. Northern Hemisphere T estimates are used because their proxy sample sizes (largely dendro-based) are far higher than in the Southern Hemisphere. However, the results are considered globally applicable, due to the very strong correlation between hemispheric and global T values in the instrumental record (their Figure S3, r = 0.96, HadCRUT basis), and also between ice core and global mean atmospheric CO2.

The authors systematically varied both the proxy T data sources and the methodological variables that influence gamma, and then examined the distribution of the nearly 230,000 resulting values. The varying data sources include the nine T reconstructions (Fig. 1), while the varying methods include things like the statistical smoothing method, and the time intervals used both to calibrate the proxy T record against the instrumental record and to estimate gamma.


Figure 1. The nine temperature reconstructions (a), and three ice core CO2 records (b), used in the study.

Some other variables were fixed, most notably the calibration method relating the proxy and instrumental temperatures (via equalization of the mean and variance for each, over the chosen calibration interval). The authors note that this approach is not only among the mathematically simplest, but also among the best at retaining the full variance (Lee et al., 2008), and hence the amplitude, of the historic T record. This is important, given the inherent uncertainty in obtaining a T signal, even with the above-mentioned considerations regarding the analysis period chosen. They chose the time lag, ranging up to +/- 80 years, which maximized the correlation between T and CO2. This was to account for the inherent uncertainty in the time scale, and even the direction of causation, of the various physical processes involved. They also estimated the results that would be produced from ten C4 models analyzed by Friedlingstein et al. (2006), over the same range of temperatures (but shorter time periods).
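To make the recipe concrete, here is a minimal sketch, in Python, of how a single ensemble member might be computed; it is my own illustration, not the authors' code, and the input arrays are hypothetical annual series on a common time axis.

```python
# One ensemble member: pick the lag (within +/- 80 years) that maximizes the
# T-CO2 correlation, then take the regression slope of CO2 on reconstructed T
# as one estimate of gamma (ppm per deg C). Data and names are placeholders.
import numpy as np

def aligned(temp, co2, lag):
    """Pair temp[i] with co2[i + lag]; a positive lag means CO2 lags T."""
    if lag >= 0:
        return temp[:len(temp) - lag], co2[lag:]
    return temp[-lag:], co2[:len(co2) + lag]

def estimate_gamma(temp, co2, max_lag=80):
    """Return (gamma, best_lag) for one choice of reconstruction and method."""
    best_lag = max(range(-max_lag, max_lag + 1),
                   key=lambda k: np.corrcoef(*aligned(temp, co2, k))[0, 1])
    t, c = aligned(temp, co2, best_lag)
    gamma = np.polyfit(t, c, 1)[0]   # slope of CO2 (ppm) against T (deg C)
    return gamma, best_lag
```

The full ensemble would then repeat this over every combination of reconstruction, ice core, smoothing choice, and calibration and analysis interval, yielding the distribution of values discussed next.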

So what did they find?

In the highlighted result of the work, the authors estimate the mean and median of gamma to be 10.2 and 7.7 ppm/ºC respectively, but, as indicated by the difference between the two, with a long tail to the right (Fig. 2). The previous empirical estimates, by contrast, come in much higher, at about 40 ppm/ºC. The choice of the proxy reconstruction used, and the target time period analyzed, had the largest effect on the estimates. The estimates from the ten C4 models were higher on average; it is about twice as likely that the empirical estimates fall in the model estimates’ lower quartile as in the upper. Still, six of the ten models evaluated produced results very close to the empirical estimates, and the models’ range of estimates does not exclude those from the empirical methods.
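As a side note, the gap between the reported mean and median is just what a long right tail produces; a toy illustration with synthetic numbers (not the paper's data):

```python
# Illustrative only: in a right-skewed distribution like that of Fig. 2,
# the long tail pulls the mean above the median. Synthetic stand-in values.
import numpy as np

rng = np.random.default_rng(0)
toy_gammas = rng.lognormal(mean=2.0, sigma=0.7, size=230_000)  # toy ensemble
print(f"mean {toy_gammas.mean():.1f}, median {np.median(toy_gammas):.1f} ppm/degC")
```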


Figure 2. Distribution of gamma. Red values are from 1050-1550, blue from 1550-1800.

Are these results cause for optimism regarding the future? Well the problem with knowing the future, to flip the famous Niels Bohr quote, is that it involves prediction.

The question is hard to answer. Empirically oriented studies are inherently limited in applicability to the range of conditions they evaluate. As most of the source reconstructions used in the study show, there is no time period between 1050 and 1800, including medieval times, that equals the global temperature state we are now in; most of that period is not even close. We are in a no-analogue state with respect to mechanistic, global-scale understanding of the inter-relationship of the carbon cycle and temperature, at least for the last two or three million years. And no-analogue states are generally not a real comfortable place to be, either scientifically or societally.

Still, based on these low estimates of gamma, the authors suggest that surprises over the next century may be unlikely. The estimates are supported by the fact that more than half of the C4-based (model) results were quite close (within a couple of ppm) to the median values obtained from the empirical analysis, although the authors clearly state that the shorter time periods the models were originally run over make apples-to-apples comparisons with the empirical results tenuous. Still, this result may be evidence that the carbon cycle components of these models have, individually or collectively, captured the essential physics and biology needed to make them useful for predictions into the multi-decadal future. Also, some pre-1800, temperature-independent CO2 fluxes could have contributed to the observed CO2 variation in the ice cores, which would tend to exaggerate the empirically estimated values. The authors did attempt to control for the effects of land use change, but noted that modeled land use estimates going back 1000 years are inherently uncertain. Choosing the time lag that maximizes the T to CO2 correlation could also bias the estimates high.

On the other hand, arguments could also be made that the estimates are low. Figure 2 shows that the authors also performed their empirical analyses within two sub-intervals (1050-1550, and 1550-1800). Not only did the mean and variance differ significantly between the two (mean/s.d. of 4.3/3.5 versus 16.1/12.5 respectively), but the R squared values of the many regressions were generally much higher in the late period than in the early (their Figure S6). Given that the proxy sample size for all temperature reconstructions generally drops fairly drastically over the past millennium, especially before their 1550 dividing line, it seems at least reasonably plausible that the estimates from the later interval are more realistic. The long tail–the possibility of much higher values of gamma–also comes mainly from the later time interval, so values of gamma from say 20 to 60 ppm/ºC (e.g. Cox and Jones, 2008) certainly cannot be excluded.

But this wrangling over likely values may well be somewhat moot, given the real-world situation. Even if mean estimates as high as, say, 20 ppm/ºC are more realistic, this feedback rate still does not compare to the rate of increase in CO2 resulting from fossil fuel burning, which at recent rates would exceed that amount within one to two decades.
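For a rough sense of that comparison, using round numbers of my own rather than figures from the paper:

```python
# Rough arithmetic behind "one to two decades" (assumed round numbers, not
# values from the paper): even a high-end feedback of ~20 ppm (roughly
# 20 ppm/degC times ~1 degC of warming) is matched by an assumed recent
# fossil-fuel-driven CO2 growth rate of ~2 ppm per year within about a decade.
feedback_ppm = 20.0        # assumed: high-end gamma x ~1 deg C of warming
growth_ppm_per_year = 2.0  # assumed: approximate recent atmospheric CO2 growth
print(feedback_ppm / growth_ppm_per_year, "years")   # -> 10.0
```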

I found some other results of this study interesting. One such involved the analysis of time lags. The authors found that in 98.5% of their regressions, CO2 lagged temperature. There will undoubtedly be those who interpret this as evidence that CO2 cannot be a driver of temperature, a common misinterpretation of the ice core record. Rather, these results from the past millennium support the usual interpretation of the ice core record over the later Pleistocene, in which CO2 acts as a feedback to temperature changes initiated by orbital forcings (see e.g. the recent paper by Ganopolski and Roche (2009)).

The study also points up the need, once again, to further constrain the carbon cycle budget. The fact that a pre-1800 time period had to be used to try to detect a signal indicates that this type of analysis is not likely to be sensitive enough to figure out how, or even if, gamma is changing in the future. The only way around that problem is via tighter constraints on the various pools and fluxes of the carbon cycle, especially those related to the terrestrial component. There is much work to be done there.


References

Charney, J.G., et al. Carbon Dioxide and Climate: A Scientific Assessment. National Academy of Sciences, Washington, DC (1979).

Cox, P. & Jones, C. Climate change – illuminating the modern dance of climate and CO2. Science 321, 1642-1644 (2008).

Frank, D. C. et al. Ensemble reconstruction constraints on the global carbon cycle sensitivity to climate. Nature 463, 527-530 (2010).

Friedlingstein, P. et al. Climate-carbon cycle feedback analysis: results from the C4MIP model intercomparison. J. Clim. 19, 3337-3353 (2006).

Ganopolski, A. & Roche, D.M. On the nature of lead-lag relationships during glacial-interglacial climate transitions. Quaternary Science Reviews 28, 3361-3378 (2009).

Lee, T., Zwiers, F. & Tsao, M. Evaluation of proxy-based millennial reconstruction methods. Clim. Dyn. 31, 263-281 (2008).

Lunt, D.J., Haywood, A.M., Schmidt, G.A., Salzmann, U., Valdes, P.J. & Dowsett, H.J. Earth system sensitivity inferred from Pliocene modeling and data. Nature Geosci. 3, 60-64 (2010).

Pagani, M., Liu, Z., LaRiviere, J. & Ravelo, A.C. High Earth-system climate sensitivity determined from Pliocene carbon dioxide concentrations. Nature Geosci. 3, 27-30 (2010).

