Friday, April 04, 2008

Global Warming Curve Upside Down? UN Not Sure


Average global temperatures in 2008 are forecast to be lower than in previous years, thanks to the cooling effect of the ocean current in the Pacific, U.N. meteorologists say.... The World Meteorological Organisation's secretary-general, Michel Jarraud, said it was likely that La Nina, an abnormal cooling of sea surface temperatures in the Pacific Ocean, would continue into the summer.... A small number of scientists question whether this means global warming has peaked and the Earth has proved more resilient to greenhouse gases than predicted, but Jarraud insists this is not the case and notes that 2008 temperatures would still be well above average for the century.... "When you look at climate change you should not look at any particular year," he told the BBC. "You should look at trends over a pretty long period and the trend of temperature globally is still very much indicative of warming...." "La Nina is part of what we call 'variability'...."
"UN Forecasters: Global Temperatures to Decrease," FoxNews.com, 4/4/2008

Oh, that dog of variability, that cursed beast of contingency! For ten years, the trend of global temperatures has been slightly down, not up. That's not a very long time, but it corresponds almost exactly to the period of rising hype over global warming.

The problem, as Michael Crichton has described it (see State of Fear), is the use of calculated models as a substitute for evidence, what was once derisively called "spreadsheet knowledge" in the early PC days. The problem with models is easy to describe: if you omit a key variable, what in the real world is called a contingency, you will calculate utterly false results.
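
For the curious, here is the kind of thing I mean, reduced to a few lines of Python. This is a toy invented purely for illustration, owing nothing to any real model; every number in it is made up. It projects food supply against population demand, and a single omitted variable flips the forecast from certain famine to no famine at all.

```python
# A toy sketch, not the Club of Rome's model or anyone else's: project food
# supply against population demand, and watch one omitted variable flip the
# forecast. All starting values and growth rates here are invented.

def years_until_shortfall(pop_growth=0.02, yield_growth=0.0, horizon=100):
    """Return the first year demand exceeds supply, or None if it never does."""
    population, food = 1.0, 1.2          # arbitrary starting units
    for year in range(1, horizon + 1):
        population *= 1 + pop_growth     # compounding population growth
        food *= 1 + yield_growth         # productivity gains, if modeled at all
        if population > food:
            return year
    return None

# Omit productivity growth and famine looks certain within a decade:
print(years_until_shortfall())                   # -> 10
# Include it (a Borlaug-sized contingency) and the crisis never arrives:
print(years_until_shortfall(yield_growth=0.03))  # -> None
```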

A classic example of this was the model commissioned by the Club of Rome in 1970 and published as The Limits to Growth in 1972. It calculated the demise of the human species by the 1980s: we were all going to devolve into starvelings in a universal war of attrition over diminishing resources, especially diminishing food. Evidently, the Club of Rome had never heard of Norman Borlaug. Food production more than doubled by the time we were all supposed to starve to death. That was one key variable missed. Population growth, especially in the northern hemisphere and South America, sharply declined. That was another. Put the two together, and all that remained of the Club of Rome's model by 1985 was the artifact of an era, something one might put in an exhibit at the Woodstock Museum that Senator Clinton would like the government to subsidize.

Before computer modeling attained the status of experimental evidence, such speculations were usually reserved for a certain class of science fiction. Antiquarians will point to some of the more spectacularly errant ideas realized in those novels and short stories, a staple of the paper pulps of the 1930s, 1940s and 1950s. One idea that arose as computers were developed in the 1940s was that the day would come when a few individuals with supercomputers would control the world. The writers who followed that premise had no way of knowing that more computing power than was available to the entire world in 1947 could be had in a $299 sale item at Wal-Mart in 2008. This kind of science fiction is almost always wrong, because the future is far too difficult to predict, especially in a free market economy.

It was easy to debunk this kind of fiction when the stories were bound in covers that looked like softcore pornography. (Old issues are now art collectibles on eBay.) But after the startling realism of 2001: A Space Odyssey (Stanley Kubrick, dir., 1968), the silly-symphony approach to science fiction movies, even in so-called serious ones like Metropolis, gave way to an increasingly popular perception that science fiction might have a thing or two to say about things to come, as it were. That perception is very big today, but it needs closer examination. Take Blade Runner.

As 2001: A Space Odyssey did after 1968, Blade Runner became an iconic film after 1982, an artifact not only of its maker's imagination (and that of Philip K. Dick) but of a burgeoning political paradigm. Human beings were hurtling into a future where the environment would be poisoned by the very technology that allowed us to advance; only the rich would have havens of safety; only the rich would ever see sunlight. Beautifully designed, presented with convincing noir realism, the film's setting and background became how many people saw an impending, doomed world.

Blade Runner was released twenty-six years ago. The closest environments today to the one depicted in the movie are found in China and in the wreck of the Soviet Union, not in the West. Environmental cleanups in the Americas and Europe have been so dramatic that New York, where one could feel the soot falling out of the sky in 1968, now seems like the windswept port city it has always been. Even the air over Los Angeles, a city thicker with cars than ever, has been transformed. An area of New Jersey west of the city, where the waterways and swamps were green with toxins in 1970, is now a protected wildlife preserve. The weirs that held industrial waste are gone, the waste with them. There are more trees in the Appalachians now than there were in the 1700s. And far from populations reduced to serfdom as in the movie, we have more problems with overconsumption in 2008 than with workers slaving in fiery mills. Those mills are gone. The new ones are startlingly clean, the workers well paid. The Bradbury-style hotels of New York have either been torn down or renovated as condominiums. The hulks of tenements torched for insurance in the Bronx in the 1970s have been rebuilt or replaced. The movie's utterly convincing vision of a dystopian future world, while it may have encouraged us to do better, was as wrong in its forecast as 2001: A Space Odyssey.

But both are fiction. Fiction, onscreen or on the page, is an art form. Its plausibility rests not on how true it is but on how well it's done. Writers and filmmakers who propose futures are almost always wrong; they're creating art, and it's too complicated to imagine all the possible variables of the next fifty to one thousand years. But surely the immensely powerful software and processors of today's computers, networked by the millions, can outdo a single man or filmmaking team. That must have been the temptation that drew scientists away from their initial scorn for computer calculation as a substitute for fieldwork and paperwork.

Take the weather. For five decades, the US National Weather Service has asked for more and more powerful computational assets, and it has generally gotten them. Weather, a natural system, is a big problem; ask any citizen of New Orleans. The trouble is, we've known for almost four decades that there are limits on the computation of weather patterns. Forty-eight hours is about the most we can usefully forecast; after that, you might as well flip coins. (Compare any local forecaster's week-ahead predictions with what actually happens.) A lobbyist pitching for more computational power for the weather service would be lying if he or she claimed that more computers would allow deeper projections into the future. They won't. Lorenz proved that in the early 1960s. Weather is a chaotic system, relatively stable but never the same. What more computer power will buy, however, and this is why the money is worth spending, is richer detailing of those forty-eight hours. There, the difference between 1960 and 2008 is astonishing.
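
Those who would rather see Lorenz's point than take it on faith can run this minimal Python sketch of his 1963 convection system. The parameters are the classic ones; the step size, horizon, and initial conditions are mine, chosen purely for illustration. Two starting states differing by one part in a million end up as unrelated "forecasts."

```python
# A minimal sketch of Lorenz's 1963 result: two runs of his convection
# equations, started one part in a million apart, integrated with a crude
# Euler step. Classic parameters; step size chosen only for illustration.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz-63 equations by one Euler step."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)          # "today's atmosphere"
b = (1.000001, 1.0, 1.0)     # the same, measured a hair differently

for step in range(1, 3001):  # thirty units of model time
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 1000 == 0:
        print(f"t = {step * 0.01:4.0f}  separation in x: {abs(a[0] - b[0]):.6f}")

# The microscopic initial error grows until the two trajectories bear no
# relation to each other. More computing power does not remove that growth;
# it only sharpens the detail inside the short window where prediction holds.
```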

But the weather, like all other global patterns, is still a chaotic system, not a neat problem for a spreadsheet or for more complicated software. As such, even with the vast computational power of the National Weather Service in 2005, the citizens of New Orleans didn't know (a) that the storm would head into Lake Pontchartrain and defeat the levee system, or (b) that the city's political leaders would completely fail to meet the test the storm presented.

The same applies to any chaotic system. Such systems fall into similar patterns as they repeat their processes, but they never do exactly the same thing twice. It's the difference between a hurricane striking harmlessly on a vacant stretch of east Texas and one scoring a fatal direct hit on Galveston. Based on science that is now over four decades old, it's easy to aver the following: presuming to project a global system like temperature patterns a hundred years into the future is as arrogantly presumptuous as the Club of Rome's prediction that we'd all starve to death by 1985.

The trouble is that computation in the abstract, working without the messy interference of contingency, such as physical evidence, is in its own way a kind of fiction factory, where, instead of taking a year to develop and shoot a feature film, a new fiction can be produced with each variable introduced to the software. As Crichton has pointed out with considerable force, politics can introduce its own variables into so-called objective calculation. We can (and do) deny that certain physical evidence is relevant because the evidence is politically unacceptable. And we're still left with the fact that a computational model is just that, a model, not the real thing. Take exactly the same model, introduce a random element (as real events always do), and we might find it just as easy to forecast an ice age. Oh, but wait: that was done before. Those who read history, or who were there, will remember the reputations put on the line in the 1970s to predict global cooling.
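
As a caricature of that last point, invented from whole cloth and owing nothing to any real climate code: hand the same trivial "model" a small random term, and it will cheerfully project a century of warming on some runs and an ice age on others.

```python
# A caricature, not a climate model: the same code with a random annual term
# 'projects' a century of warming on some runs and of cooling on others.

import random

def century_projection(seed, drift=0.0, noise=0.1):
    """Sum 100 years of invented annual anomalies: drift plus random weather."""
    rng = random.Random(seed)
    return sum(drift + rng.gauss(0, noise) for _ in range(100))

for seed in range(10):
    trend = century_projection(seed)
    verdict = "warming" if trend > 0 else "ice age"
    print(f"run {seed}: {trend:+.2f} degrees -> {verdict}")
```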

Luther
