Tuesday, August 18, 2009

It only seems like "warming"...

In a separate entry, I discussed how the politics of government funding have largely created the “global warming” crisis. In this entry I’ll discuss the technical merits of anthropogenic (human-caused) global warming.

…and here is the thing: Starting in the late 1300’s, the Earth went into a substantial period of cooling that lasted until the early 1800’s. This cold period has been called “the Little Ice Age” and was responsible, among other things, for the collapse of the thriving Viking colony on Greenland, which had been established in a warmer time when southern Greenland really was “green”. The cause of the Little Ice Age has not been established conclusively, but in the early 1800’s the climate began to warm again, and now, in the early 21st century, global temperatures have returned to levels last seen in the Middle Ages. I’ll repeat that…global temperatures have just now returned to levels that were considered “normal” during the Middle Ages.

Proponents of anthropogenic global warming are fond of showing plots of global temperatures from the early 1800’s up through the present, as these show an alarming “hockey stick” increase in average global temperatures. I’ll discuss temperature trends in the last half of the 20th century presently, but if these same temperature graphs are extended back 1000 years instead of 100, the data look far less alarming. The temperature increase during the 19th and 20th centuries is seen to be merely the planet’s recovery from the so-called Little Ice Age. The 1000-year record also shows that the recent increase is essentially symmetrical with the temperature decrease of the 13th and 14th centuries, i.e., the Earth warmed up at about the same rate as it cooled down. A credible argument for anthropogenic global warming would have to demonstrate warming above and beyond this recovery from an extended period of cooling. This simply has not been done, and the existence of the Little Ice Age has not even entered into the popular discussion of global warming.
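(For the numerically inclined, the sketch below shows the kind of rate comparison I’m describing: fit a linear trend to the cooling window and another to the warming window, and compare the slopes. The temperature series in it is entirely synthetic, invented just to illustrate the mechanics; it is not an actual reconstruction, and the window boundaries are placeholders.)

```python
import numpy as np

# Synthetic 1000-year temperature anomaly series (invented for illustration;
# NOT a real reconstruction): a medieval plateau, a decline into a cold
# period, a cold plateau, and a comparable recovery beginning ~1800.
years = np.arange(1000, 2001)
anomaly = np.where(years < 1300, 0.0,                             # medieval plateau
           np.where(years < 1500, -0.6 * (years - 1300) / 200.0,  # cooling into the LIA
            np.where(years < 1800, -0.6,                          # cold plateau
             -0.6 + 0.6 * (years - 1800) / 200.0)))               # recovery since ~1800
anomaly = anomaly + np.random.default_rng(0).normal(0.0, 0.05, years.size)  # proxy noise

def trend_per_century(yrs, vals):
    """Least-squares slope of a temperature window, in degrees C per century."""
    slope, _ = np.polyfit(yrs, vals, 1)
    return slope * 100.0

cool_mask = (years >= 1300) & (years < 1500)
warm_mask = years >= 1800
print(f"cooling trend 1300-1500: {trend_per_century(years[cool_mask], anomaly[cool_mask]):+.2f} C/century")
print(f"warming trend 1800-2000: {trend_per_century(years[warm_mask], anomaly[warm_mask]):+.2f} C/century")
```

On a series like this the two fitted slopes come out roughly equal and opposite, which is exactly the “warmed up at about the same rate as it cooled down” comparison made above.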

The 1000-year temperature record, deduced from a combination of pollen counts in ice cores, tree ring growth patterns, and historical observations, has been criticized on the grounds that accurate thermometers have only been generally available for the past few hundred years. However, agrarian societies, highly sensitive to the growing season, were almost fanatical about recording the dates of the first and last frosts, amounts of snow and rainfall, and any unusual weather conditions. When these dates are referenced to the lunar and solar calendars, an accurate picture of temperature trends can be assembled. And since these observations are consistent with data from independent sources, i.e., pollen samples and tree ring data, the 1000-year record can be viewed as entirely credible.
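(Here is a rough sketch of how such independent records can be cross-checked and combined. Every number in it is invented for illustration; the point is only the standardize-compare-average mechanics, not any real proxy dataset.)

```python
import numpy as np

rng = np.random.default_rng(1)
n_years = 200

# Invented proxy records on different native scales (illustration only):
common = rng.normal(0.0, 1.0, n_years).cumsum() * 0.1                 # shared climate signal
frost_dates = 120 - 5 * common + rng.normal(0, 2, n_years)            # day-of-year of last frost
ring_widths = 1.2 + 0.3 * common + rng.normal(0, 0.1, n_years)        # tree ring width, mm
pollen_counts = 50 + 10 * common + rng.normal(0, 4, n_years)          # grains per sample

def standardize(x):
    """Convert a proxy record to zero-mean, unit-variance units."""
    return (x - x.mean()) / x.std()

# Frost dates move opposite to temperature (a later last frost means a colder
# spring), so that record's sign is flipped before combining.
proxies = np.vstack([-standardize(frost_dates),
                     standardize(ring_widths),
                     standardize(pollen_counts)])

# Pairwise correlations: high values mean the independent records agree.
print("proxy-proxy correlations:\n", np.corrcoef(proxies).round(2))

# A composite temperature index is just the mean of the standardized proxies.
composite = proxies.mean(axis=0)
print("composite index, first five years:", composite[:5].round(2))
```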

One of the more alarmist claims of proponents of anthropogenic global warming is that nine of the hottest years in the United States during the 20th century occurred in the 1990’s. These claims were based primarily on work published by NASA’s James Hansen, and they would be alarming indeed if true. Hansen’s conclusions were based on temperatures measured at weather stations across the United States, which were then averaged using an algorithm that accounted for differences in altitude and local climate. However, analysis of Hansen’s data after he made his hyperbolic claims revealed some glaring errors in his calculation methods. One of the more egregious of these was a failure to account for the increasing urbanization of the United States over the course of the 20th century. Simply put, a weather station sitting in a cow pasture in the 1930’s would read substantially lower temperatures than that same weather station sitting in a suburban parking lot in the 1960’s. When these analysis errors were corrected, it was found that the hottest years of the 20th century occurred during the 1930’s and that the early 21st century has been characterized by a slight cooling trend...again consistent with a climate in which global temperatures have leveled off after recovering from an extended period of cooling.
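(To make the urbanization issue concrete, here is a toy version of the kind of correction at stake, with made-up stations and made-up coefficients. The 6.5 C-per-kilometer lapse rate used for the altitude adjustment is the standard atmospheric value, but the urban-heat-island offset is a placeholder I’ve assumed for illustration, not a published figure, and this is not Hansen’s actual algorithm.)

```python
# Toy station-averaging sketch with made-up data. The altitude correction uses
# the standard atmospheric lapse rate; the urban-heat-island offset is a
# placeholder coefficient invented for illustration.

LAPSE_RATE_C_PER_M = 0.0065   # ~6.5 C of cooling per km of altitude (standard value)
UHI_OFFSET_C = 1.0            # assumed warm bias of a fully urbanized site (placeholder)

# (name, measured temp C, altitude m, urban fraction 0..1) -- invented stations
stations = [
    ("pasture", 14.2, 200, 0.0),   # rural reference
    ("suburb",  15.1, 180, 0.5),   # partly built up
    ("parking", 16.0, 150, 1.0),   # fully urbanized
]

def adjusted_temp(temp_c, altitude_m, urban_frac):
    """Reduce a reading to sea level, then strip the assumed urban warm bias."""
    sea_level = temp_c + LAPSE_RATE_C_PER_M * altitude_m
    return sea_level - UHI_OFFSET_C * urban_frac

raw_mean = sum(t for _, t, _, _ in stations) / len(stations)
adj_mean = sum(adjusted_temp(t, alt, u) for _, t, alt, u in stations) / len(stations)

print(f"raw mean:      {raw_mean:.2f} C")
print(f"adjusted mean: {adj_mean:.2f} C")
```

The uncorrected parking-lot station pulls the raw average warm; removing the assumed urban offset is what changes the relative ranking of warm years.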

If the increase in global temperature from the early 1800’s to the present is recognized as a recovery from an extended cooling period, then all of the global warming alarmism gets put in perspective. The supposedly dire consequences of anthropogenic global warming – melting ice packs, rising sea levels, plant and animal extinctions, droughts, floods, violent hurricanes, etc., etc., etc. – can be viewed, not as apocalyptic, end-of-life-as-we-know-it disasters, but as the consequences of an extremely complex interaction of fluctuating solar radiation, cyclic “wobbling” of the Earth on its axis, and changes in atmospheric and oceanic circulation patterns. Had all of these supposedly impending disasters not been blamed on the convenient catch-all of “global warming”, the billions of dollars and thousands of man-years spent on “proving” that our impending “doom” was a result of human activities might actually have been used to study the real mechanisms of climate change. Had that been the case, not only would we have a much better understanding of the natural world we inhabit, but planners in the southwest US might have useful tools for predicting the duration of the current drought, maritime nations might have tools for planning shipping routes in the high Arctic, and agriculturalists might have more accurate knowledge of growing seasons…the list of benefits of serious climate research is a long one. Unfortunately for all of us, these resources have been squandered on a failed effort to use government-funded science to drive an agenda.

1 comment:

  1. Heretic! How dare you question the one true religion; you’re as bad as Galileo! Recant or face cleansing!
