The Climate Snow Job
An East Coast blizzard howling, global temperatures peaking, the desert Southwest flooding, drought-stricken California drying up—surely there’s a common thread tying together this “extreme” weather.
There is.
But it has little to do with what recent headlines have been saying about the hottest year ever.
It is called business as usual.
Surface temperatures are indeed increasing slightly:
They’ve been going up, in fits and starts, for more than 150 years, or since a miserably cold and pestilential period known as the Little Ice Age.
Before carbon dioxide from economic activity could have warmed us up, temperatures rose three-quarters of a degree Fahrenheit between 1910 and World War II.
They then cooled down a bit, only to warm again from the mid-1970s to the late ’90s, about the same amount as earlier in the century.
Whether temperatures have warmed much since then depends on what you look at.
Until last June, most scientists acknowledged that warming had peaked in the late 1990s and had since plateaued in a “hiatus.”
There are about 60 different explanations for this in the refereed literature.
That changed last summer, when the National Oceanic and Atmospheric Administration (NOAA) decided to overhaul its data, throwing out satellite-sensed sea-surface temperatures since the late 1970s and instead relying on, among other sources, readings taken from the cooling-water-intake tubes of oceangoing vessels.
The scientific literature is replete with articles about the large measurement errors that accrue in these data, owing to the fact that a ship’s infrastructure conducts heat and absorbs a tremendous amount of the sun’s energy, and that vessels’ intake tubes sit at different ocean depths.
...There are two real concerns about warming, neither of which has anything to do with the El Niño-enhanced recent peak.
How much more is the world likely to warm as civilization continues to exhale carbon dioxide, and does warming make the weather more “extreme,” which means more costly?
Instead of relying on debatable surface-temperature information, consider readings in the free atmosphere (technically, the lower troposphere) taken by two independent sensors: satellite sounders and weather balloons.
As has been shown repeatedly by University of Alabama climate scientist John Christy, since late 1978 (when the satellite record begins),
the rate of warming in the satellite-sensed data is barely a third of what it was supposed to have been, according to the large family of global climate models now in existence.
Balloon data, averaged over the four extant data sets, shows the same.
It is therefore probably prudent to cut the modeled temperature forecasts for the rest of this century by 50%.
Doing so would mean that the world, without any political effort at all, won’t reach by 2100 the dreaded 3.6 degrees Fahrenheit of warming that the United Nations regards as the climate apocalypse.
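To spell out the arithmetic behind that claim, here is a minimal back-of-the-envelope sketch, assuming only the figures already cited in this column: the 50% cut proposed above and the United Nations’ 3.6-degree-Fahrenheit (2-degree-Celsius) threshold. The symbols $\Delta T_{\text{model}}$ and $\Delta T_{\text{adjusted}}$ are illustrative labels, not values drawn from any particular model.

$$
\Delta T_{\text{adjusted}} \;\approx\; 0.5 \times \Delta T_{\text{model}},
\qquad\text{so}\qquad
\Delta T_{\text{adjusted}} < 3.6\,^{\circ}\mathrm{F}
\;\Longleftrightarrow\;
\Delta T_{\text{model}} < 7.2\,^{\circ}\mathrm{F}.
$$

In other words, under this adjustment any modeled end-of-century projection below roughly 7.2 degrees Fahrenheit would come in under the 3.6-degree threshold.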
The notion that world-wide weather is becoming more extreme is just that: a notion, or a testable hypothesis.
As data from the world’s biggest reinsurer, Munich Re, and University of Colorado environmental-studies professor Roger Pielke Jr. have shown, weather-related losses haven’t increased at all over the past quarter-century.
In fact, the trend, while not statistically significant, is downward.
Last year showed the second-smallest weather-related losses, as a share of gross world product (GWP), in the entire record.
Without El Niño, temperatures in 2015 would have been typical of the post-1998 regime.
And, even with El Niño, the effect those temperatures had on the global economy was de minimis.