Cooking the Science

The conflict between those who view humanity as an enemy of nature and those who view man as nature’s steward is culminating in the dispute over climate change. At issue in this debate is not whether we are experiencing climate change; we are, and we always have been. Rather, it is the urgent assertion that mankind is causing adverse changes to our climate, so adverse that our very survival is threatened. Humans have only been trying to measure temperature on a consistent basis since about 1850, during which time we think the world may have warmed by about 0.6 degrees centigrade, plus or minus a margin of error of 0.2 degrees. Most of the climate community has agreed since 1988 that global mean temperatures have increased about one degree Fahrenheit, or roughly one-half degree centigrade, over the past century. The majority of this increase occurred from 1919 to 1940; temperatures then decreased between 1940 and the early ‘70s, increased again until the ‘90s, and have remained essentially flat since 1998. The fact is, these temperature fluctuations are nothing new. Scientists have long understood that the earth’s climate naturally experiences cyclical, even dramatic, changes. In addition, when you stop to consider the many variables involved in trying to pinpoint the earth’s mean temperature, it becomes clear that a precise figure is very difficult to achieve even where exact measurements are available. The land heats and cools differently than the oceans or the atmosphere, and measurements are distributed very unevenly relative to land and water area, location, and season. Simply consider your local forecast: temperatures vary between weather stations that are only a few blocks apart. The bottom line: determining an accurate mean global temperature is a complex and difficult challenge that still hasn’t been solved.
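The averaging problem can be made concrete with a short sketch. Everything below is purely illustrative, not any agency’s actual method: a handful of made-up station readings gives one answer when naively averaged, and a different answer when each reading is weighted by the surface area its latitude band represents (a common cosine-of-latitude weighting).

```python
import numpy as np

# Hypothetical station readings, synthetic numbers for illustration only.
latitudes = np.array([60.0, 45.0, 10.0, -30.0])   # degrees
readings  = np.array([-5.0,  8.0, 26.0,  18.0])   # deg C

# A naive mean treats every station as representing an equal share of the globe.
naive_mean = readings.mean()

# Area weighting: grid cells near the poles cover less surface area, so
# weight each reading by the cosine of its latitude.
weights = np.cos(np.radians(latitudes))
weighted_mean = np.average(readings, weights=weights)

print(naive_mean, weighted_mean)  # the two "global means" disagree
```

The two results differ by several degrees for the same four readings, which illustrates (in miniature) why the choice of averaging scheme matters when the quantity being debated moves by fractions of a degree.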
So, how did these long-recognized natural changes in climate and temperature suddenly transform into apocalyptic conditions that threaten to “send our entire planet into a tail-spin of epic destruction,” according to Al Gore? Michael Mann is a climatologist whose name you may not know, but you are probably familiar with his famous conclusion: that the present temperature increase is “likely to have been the largest of any century during the past 1,000 years” and that the “1990s was the warmest decade and 1998 the warmest year” of the millennium. How could Mann make such claims when we have only consistently collected temperature readings since the latter half of the 19th century? In the absence of direct measurements, Mann reconstructed temperatures in the Northern Hemisphere over the last 1,000 years using what are known as “proxy” methods. Essentially, scientists compare proxy data, such as recent isotope readings from snow and tree rings, to local instrumental temperatures. This present-day comparison then serves as a sort of scale against which scientists can examine similar proxy sources to infer temperatures in the distant past. Such proxy reconstructions are indirect inferences of temperature and are obviously less accurate than data collected by direct methods such as a thermometer. Remember, the climate change debate centers on temperature variations of less than 1 degree centigrade, with a margin of error of plus or minus 0.2 degrees, even using the most sophisticated measuring devices. Mann published his findings in 1998, producing the infamous “hockey stick” graph, which shows cooler temperatures throughout the last 1,000 years until the final decade of the 20th century, at which point temperatures seem to rise “dramatically.” If you watched Al Gore’s 2006 film, An Inconvenient Truth, you saw this graphic featured prominently. In simple visual terms, it was very compelling.
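The calibrate-then-invert logic of a proxy reconstruction can be sketched in a few lines. This is a toy model, not Mann’s actual data or procedure: the “tree-ring” numbers and the linear proxy-temperature relationship are synthetic assumptions made purely to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Instrumental era": years where both thermometer readings and proxy
# values (e.g. tree-ring widths) are available. All values synthetic.
temps_known = np.linspace(-0.3, 0.3, 50)                      # deg C anomalies
proxy_known = 2.0 * temps_known + 1.0 + rng.normal(0, 0.05, 50)

# Calibration: fit proxy = a*T + b over the overlap period.
a, b = np.polyfit(temps_known, proxy_known, 1)

# "Pre-instrumental era": only proxy values survive, no thermometers.
proxy_old = np.array([0.7, 0.9, 1.1])

# Inversion: use the fitted relationship to infer past temperatures.
temps_reconstructed = (proxy_old - b) / a
print(temps_reconstructed)
```

Note that every reconstructed value inherits both the noise in the proxy and the uncertainty in the fitted scale, which is why such inferences carry wider error bars than direct thermometer readings.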
Mann’s findings were arguably the single most influential study in swaying the public debate, and in 2001 they became the official view of the Intergovernmental Panel on Climate Change (IPCC), the UN body organizing the worldwide effort to combat global warming. However, Dr. Edward Wegman, a professor at the Center for Computational Statistics at George Mason University, chair of the National Academy of Sciences’ Committee on Applied and Theoretical Statistics, and a board member of the American Statistical Association, was asked by the Energy and Commerce Committee of the U.S. House of Representatives to evaluate the statistical validity of Michael Mann’s findings. Wegman conducted his third-party review by assembling an expert panel of statisticians, including outside statisticians and the Board of the American Statistical Association. At its conclusion, the Wegman review repudiated Mann’s work, saying: “Our committee believes that the assessments that the decade of the 1990s was the hottest decade in a millennium and that 1998 was the hottest year in a millennium cannot be supported. … The paucity of data in the more remote past makes the hottest-in-a-millennium claims essentially unverifiable.” Dr. Wegman is considered the world’s preeminent expert on “computational statistics” (he actually coined the term). Wegman found that Mann made a basic error that “may be easily overlooked by someone not trained in statistical methodology.” When Wegman corrected Mann’s statistical mistakes, the “hockey stick” disappeared. However, even if Mann’s findings were correct, the natural world has tolerated fluctuations in mean temperature of more than one degree, so the current changes are still within the range of natural variation. The fact is, there is no reason to believe that slightly lower temperatures are somehow preferable to slightly higher temperatures; there is no known “optimal” global temperature.
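The kind of centering choice at issue in the statistical critique can be demonstrated on synthetic data. The sketch below is a toy illustration, not Mann’s code or data: it contrasts centering each proxy series on its full length with “short-centering” on only the modern calibration window before principal component analysis. All parameters (70 series, 600 years, a 100-year window, AR(1) noise) are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(42)
n_series, n_years, calib = 70, 600, 100  # proxies, years, calibration window

# Trendless red noise (AR(1)) stand-ins for proxy series.
def ar1(phi, n):
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

data = np.column_stack([ar1(0.6, n_years) for _ in range(n_series)])

# Conventional centering: subtract each series' full-period mean.
full_centered = data - data.mean(axis=0)

# Short-centering: subtract only the mean of the last `calib` years.
short_centered = data - data[-calib:].mean(axis=0)

# Leading principal component under each centering choice.
def pc1(m):
    u, s, vt = np.linalg.svd(m, full_matrices=False)
    return u[:, 0] * s[0]

pc1_full, pc1_short = pc1(full_centered), pc1(short_centered)
# Short-centering preferentially weights series whose calibration-period mean
# happens to deviate from their long-run mean, which tends to produce a
# "hockey stick" shaped leading component even from trendless noise.
```

The point of the demonstration is that the two centering conventions are not interchangeable: the short-centered decomposition systematically favors series with an end-of-record excursion, so the shape of the leading component partly reflects the method rather than the data.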
Furthermore, Wegman pointed out that “there is no evidence that Dr. Mann or any of the other authors in paleoclimate studies have had significant interactions with mainstream statisticians.” In other words, Wegman believes that much of the climate science that has been done should be taken with a grain of salt: although the studies may have been peer reviewed, the reviewers were often unqualified in statistics. What is most telling is the climatology community’s reaction to Wegman’s revelation. Many ultimately agreed that Mann’s “hockey stick” graph was wrong but nonetheless maintained that his conclusions were correct. In response, Wegman later testified before the Energy and Commerce Committee, saying: “I am baffled by the claim that the incorrect method doesn’t matter because the answer is correct anyway. Method Wrong + Answer Correct = Bad Science.” Science which begins with and stubbornly clings to its philosophical presuppositions is not science; it is false religion. Lesslie Newbigin, noted theologian and author, makes this point: “For modern Western peoples—nature has taken the place of God as the ultimate reality with which we have to deal.” The creation of an epic battle against catastrophic climate change provides such people with a tangible means of “dealing” with their new god, expressing their “worship,” and finding a sense of meaning and purpose. In support of these goals, it should not be surprising to see them attempt to make the science fit their belief system even when presented with contradictory evidence. This is where the “faith” of these “catastrophic climate change” believers comes into play. Next week we will explore the question of causation.

[Editor’s Note: This is the fourth in a six-part series. Click on the links below to read the other articles.]
Part One: Creation and the Human Equation
Part Two: The Population Bomb & Other Fairy Tales
Part Three: Malthusian Dreams
Part Five: Sun May be Causing Global Warming—Seriously?


S. Michael Craven

