Monday, September 1, 2014

New paper finds the last interglacial was warmer than today, not simulated by climate models

A new paper published in Climate of the Past compares temperature reconstructions of the last interglacial period [131,000-114,000 years ago] to climate model simulations and finds climate models significantly underestimated global temperatures of the last interglacial by ~0.67C on an annual basis and by ~1.1C during the warmest month. 

This implies that climate models are unable to fully simulate natural global warming, and the error of the underestimation is about the same as the 0.7C global warming since the end of the Little Ice Age in ~1850. Thus, by comparison with the last interglacial period, the possibility that present-day temperatures are entirely the result of natural processes cannot be ruled out.

Further, during the last interglacial, Greenland temperatures were naturally up to 8C higher and sea levels up to 43 feet higher than today. And, during another interglacial, all of Greenland and West Antarctica melted & sea levels were 79 feet higher. Since this low-CO2 global warming occurred entirely naturally, there is no evidence that global warming during the present interglacial is unnatural or man-made. 

Temperatures during the last interglacial period ~120,000 years ago were higher than during the present interglacial period.

First column is the warmest single period simulated by climate models, second column is the warmest period from a compilation of temperature reconstructions.
Clim. Past, 10, 1633-1644, 2014

P. Bakker1,2 and H. Renssen1

1Earth and Climate Cluster, Department of Earth Sciences, VU University Amsterdam, 1081HV Amsterdam, the Netherlands
2now at: College of Earth, Ocean and Atmospheric Sciences, Oregon State University, Corvallis, Oregon, USA

Abstract. The timing of the last interglacial (LIG) thermal maximum across the globe remains to be precisely assessed. Because of difficulties in establishing a common temporal framework between records from different palaeoclimatic archives retrieved from various places around the globe, it has not yet been possible to reconstruct spatio-temporal variations in the occurrence of the maximum warmth across the globe. Instead, snapshot reconstructions of warmest LIG conditions have been presented, which have an underlying assumption that maximum warmth occurred synchronously everywhere. Although known to be an oversimplification, the impact of this assumption on temperature estimates has yet to be assessed. We use the LIG temperature evolutions simulated by nine different climate models to investigate whether the assumption of synchronicity results in a sizeable overestimation of the LIG thermal maximum. We find that for annual temperatures, the overestimation is small, strongly model-dependent (global mean 0.4 ± 0.3 °C) and cannot explain the recently published 0.67 °C difference between simulated and reconstructed annual mean temperatures during the LIG thermal maximum. However, if one takes into consideration that temperature proxies are possibly biased towards summer, the overestimation of the LIG thermal maximum based on warmest month temperatures is non-negligible with a global mean of 1.1 ± 0.4 °C. 
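The synchronicity effect the abstract describes can be illustrated with a toy calculation on synthetic data (all numbers below are hypothetical, chosen only to show the mechanism): averaging each site's own warmest value, as a "snapshot" reconstruction implicitly does, always yields at least as warm an estimate as the warmest moment of the true global-mean series.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy example: 20 hypothetical proxy sites, 100 time steps of LIG temperature
# anomalies. Each site peaks at a different (asynchronous) time.
n_sites, n_times = 20, 100
peak_times = rng.integers(0, n_times, n_sites)
t = np.arange(n_times)
# Gaussian-shaped warm event at each site (peak amplitude 2 C) plus noise
temps = 2.0 * np.exp(-((t[None, :] - peak_times[:, None]) / 15.0) ** 2) \
        + 0.3 * rng.standard_normal((n_sites, n_times))

# "Snapshot" reconstruction: assume maximum warmth was synchronous everywhere,
# i.e. average each site's own warmest value.
snapshot_estimate = temps.max(axis=1).mean()

# True thermal maximum: the warmest value of the global-mean time series.
true_maximum = temps.mean(axis=0).max()

print(f"synchronous-snapshot estimate: {snapshot_estimate:.2f} C")
print(f"true global-mean maximum:      {true_maximum:.2f} C")
print(f"overestimate:                  {snapshot_estimate - true_maximum:.2f} C")
```

The snapshot estimate can never fall below the true maximum (the mean of per-site maxima is always at least the maximum of the mean), which is why the assumption of synchronicity biases reconstructions warm; how large the bias is depends on how asynchronous the peaks are, which is what the paper quantifies.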

The Economist claims heat sinks & bottom of ocean will become hotter than surface

An article published last week in The Economist proclaims, "the mystery of the pause in global warming may have been solved. The answer seems to lie at the bottom of the sea."

No, sadly, the mystery of the "pause" and Trenberth's "missing heat" has not been solved, nor the question of whether the "missing heat" ever existed in the first place. The article claims "People with a grasp of the law of conservation of energy are, however, sceptical in their turn of these positions and doubt that the pause is such good news," which is a false assumption, because the only 'evidence' that there is any "missing heat" is the falsified output of climate models, which cannot provide 'data' or evidence.

Further, the article goes on to make astonishingly erroneous statements that violate physics/thermodynamics, including claims that man-made CO2 heat sinks to the bottom of the ocean and that once the ocean depths become "hotter than the surface," "global warming will resume." Thus, The Economist has become a denier of convection.

Other embarrassing basic physics/thermodynamics errors that fail grade-school science include:

1) the assumption that the tail can wag the dog and that the atmosphere, with 1/1000 of the heat capacity and thermal inertia of the oceans, can significantly heat the oceans

2) that an atmosphere which has not warmed in 18 years can warm the oceans, and do so undetected in the upper layers of the ocean
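The 1/1000 heat-capacity figure in point 1) can be checked with a back-of-envelope calculation using rounded textbook values (an illustrative sketch, not a precise energy budget):

```python
# Back-of-envelope comparison of ocean vs atmosphere heat capacity,
# using rounded textbook values.
m_atm = 5.1e18        # mass of the atmosphere, kg
cp_air = 1005.0       # specific heat of air at constant pressure, J/(kg K)
m_ocean = 1.4e21      # mass of the ocean, kg
cp_water = 3990.0     # specific heat of seawater, J/(kg K)

heat_cap_atm = m_atm * cp_air        # ~5e21 J/K
heat_cap_ocean = m_ocean * cp_water  # ~6e24 J/K
ratio = heat_cap_ocean / heat_cap_atm
print(f"ocean/atmosphere heat-capacity ratio: ~{ratio:.0f}")

# Equivalently: the heat that warms the whole atmosphere by 1 C would warm
# the whole ocean by only ~1/1000 of a degree.
print(f"1 C of atmospheric warming = ~{1.0 / ratio:.4f} C of ocean warming")
```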

I had a subscription to the Economist for many years, but dropped it because they jumped on the fake-science CAGW bandwagon. 

Related tweets/comments:

In the article
Economist – Oceans and the climate: Davy Jones’ heat locker []
the reporter made a factual scientific error. They wrote at the end of the article
“The process of sequestration [of heat] must reverse itself at some point, since otherwise the ocean depths would end up hotter than the surface—an unsustainable outcome. And when it does, global warming will resume.”
The deeper ocean is very cold and involves an enormous amount of mass. There is no way that the ocean depths could become hotter due to the sequestration of heat from added CO2 in the atmosphere!
“No matter how warm the surface of the ocean gets, the ocean’s huge volume and deep basins keep temperatures at the bottom of the ocean at only slightly above freezing.”
P Gosselin (@NoTricksZone)

9/1/14, 11:27 AM
@newscientist Rubbish...atmos warms 1°...heat absorbed by oceans leads to 0.01° rise in ocean temp...heat exits ocean only if atmos cools

Davy Jones’s heat locker

The mystery of the pause in global warming may have been solved. The answer seems to lie at the bottom of the sea

Aug 23rd 2014 The Economist [with added comments and links by HS]

OVER the past few years one of the biggest questions in climate science has been why, since the turn of the century, average surface-air temperatures on Earth have not risen, even though the concentration in the atmosphere of heat-trapping carbon dioxide has continued to go up. This “pause” in global warming has been seized on by those sceptical that humanity needs to act to curb greenhouse-gas emissions or even (in the case of some extreme sceptics) who think that man-made global warming itself is a fantasy. People with a grasp of the law of conservation of energy are, however, sceptical in their turn of these positions and doubt that the pause is such good news. They would rather understand where the missing heat has gone, and why—and thus whether the pause can be expected to continue.

The most likely explanation is that it is hiding in the oceans, which store nine times as much of the sun’s heat as do the atmosphere and land combined. [Actually, the ocean stores 1000 times as much heat as the atmosphere, and the tail does not wag the dog] But until this week, descriptions of how the sea might do this have largely come from computer models. Now, thanks to a study published in Science by Chen Xianyao of the Ocean University of China, Qingdao, and Ka-Kit Tung of the University of Washington, Seattle, there are data [not according to Josh Willis of JPL who says there are no such "robust" data in the "real ocean" as opposed to the modeled ocean].

Dr Chen and Dr Tung have shown where exactly in the sea the missing heat is lurking. As the left-hand chart below shows, over the past decade and a bit the ocean depths have been warming faster than the surface. This period corresponds perfectly with the pause, and contrasts with the last two decades of the 20th century, when the surface was warming faster than the deep. The authors calculate that, between 1999 and 2012, 69 zettajoules of heat (that is, 69 x 10^21 joules—a huge amount of energy) have been sequestered in the oceans between 300 metres and 1,500 metres down. If it had not been so sequestered, they think, there would have been no pause in warming at the surface [why can't climate scientists ever admit that the other obvious possibility is that there never was any "missing heat" to sequester?].
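For scale, a rough conversion of the claimed 69 zettajoules into a mean temperature change of the 300-1,500 metre layer (using rounded, assumed values for ocean area, seawater density and specific heat) gives only a few hundredths of a degree:

```python
# Rough check of what 69 zettajoules implies for the 300-1500 m ocean layer.
# All values below are rounded assumptions for illustration.
E = 69e21             # J, heat sequestered 1999-2012 per Chen & Tung
area = 3.6e14         # m^2, approximate global ocean surface area
thickness = 1200.0    # m, the 300-1500 m layer
rho = 1025.0          # kg/m^3, seawater density
cp = 3990.0           # J/(kg K), seawater specific heat

mass = area * thickness * rho
dT = E / (mass * cp)
print(f"implied layer-mean warming: {dT:.3f} C over 1999-2012")
```

A layer-mean change of a few hundredths of a degree over 13 years is very difficult to measure directly, which is central to the dispute over whether the "missing heat" exists at all.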

Hidden depths

The two researchers draw this conclusion from observations collected by 3,000 floats launched by Argo, an international scientific collaboration. These measure the temperature and salinity of the top 2,000 metres of the world’s oceans. In general, their readings match the models’ predictions. But one of the specifics is weird.

Most workers in the field have assumed the Pacific Ocean would be the biggest heat sink, since it is the largest body of water. A study published in Nature in 2013 by Yu Kosaka and Shang-Ping Xie of the Scripps Institution of Oceanography, in San Diego, argued that cooling in the eastern Pacific explained most of the difference between actual temperatures and models of the climate that predict continuous warming. Dr Chen’s and Dr Tung’s research, though, suggests it is the Atlantic (see middle chart) and the Southern Ocean that are doing the sequestering. The Pacific (right-hand chart), and also the Indian Ocean, contribute nothing this way—for surface and deepwater temperatures in both have risen in parallel since 1999.

This has an intriguing implication. Because the Pacific has previously been thought of as the world’s main heat sink, fluctuations affecting it are considered among the most important influences upon the climate. During episodes called El Niño, for example, warm water from its west sloshes eastward over the cooler surface layer there, warming the atmosphere. Kevin Trenberth of America’s National Centre for Atmospheric Research has suggested that a strong Niño could produce a jump in surface-air temperatures and herald the end of the pause. Earlier this summer, a strong Niño was indeed forecast, though the chances of this happening seem to have receded recently.

But if Dr Chen and Dr Tung are right, then the fluctuations in the Atlantic may be more important. In this ocean, saltier tropical water tends to move towards the poles (surface water at the tropics is especially saline because of greater evaporation). As it travels it cools and sinks, carrying its heat into the depths—but not before melting polar ice, which makes the surface water less dense, fresh water being lighter than brine. This fresher water has the effect of slowing the poleward movement of tropical water, moderating heat sequestration. It is not clear precisely how this mechanism is changing so as to send heat farther into the depths. But changing it presumably is.

Understanding that variation is the next task. The process of sequestration must reverse itself at some point, since otherwise the ocean depths would end up hotter than the surface—an unsustainable outcome. And when it does, global warming will resume.

Correction: This article originally stated that 69 zettajoules of heat was 69 x 10^11 joules. In fact it is 69 x 10^21 joules. Sorry.

Pravda exposes the errors and frauds of global warming science

Ironically, Pravda has published today an article about the fundamental "errors and frauds of global warming science." One of the most important points mentioned is perhaps the most fundamental flaw of CAGW theory: equating radiation with heat transfer, even though radiation ≠ heat due to both the first and second laws of thermodynamics. Radiation from a cold body cannot increase the heat content of a hot body; to do so would require an impossible decrease in entropy and violate the 2nd law of thermodynamics [and also the 1st law]. The false belief of many climate scientists that radiation = heat transfer is one of the most fundamental flaws of CAGW theory.

Note: there is indeed a ~33K "greenhouse effect" due to gravity/mass/pressure/lapse rate, but it is not significantly affected by man-made CO2. 

Errors and frauds of global warming science


By Gary Novak [plus links added by HS]

Modern global warming science began in 1979 with the publication of Charney et al in response to a request from a U.S. governmental office to create a study group for answering questions about global warming. Charney et al modeled atmospheric effects and drew the conclusion that the average earth temperature would increase by about 3°C upon doubling the amount of carbon dioxide in the air.

Charney et al did not have a known mechanism for global warming to base their modeling on. Their publication was total fakery stating deliberate absurdities, such as modeling "horizontal diffusive heat exchange," which doesn't exist.

In 1984 and 1988, Hansen et al did similar modeling but added a concept for heat produced by carbon dioxide, which they derived from assumed history. Over the previous century, a temperature increase of 0.6°C was assumed to have been caused by an increase in CO2 of 100 parts per million in the atmosphere. Their modeling then had the purpose of determining secondary effects, primarily caused by an assumed increase in water vapor. In other words, a primary effect was based upon the historical record, while secondary effects were modeled.

This is the approach taken to this day, while refinements are developed. There were major problems in using history for the primary effect. Firstly, the historical effect included secondary effects which could not be separated out, and no attempt was made to do so. This means the assumed primary effect included secondary effects. Secondly, there was no place for other effects in attributing the entire history to CO2.

Therefore, an attempt to determine the primary effect was made by Myhre et al in 1998 (4) by using radiative transfer equations. Those equations only show the rate of depletion of radiation as the concentration of a gas increases. They say nothing about heat. An impossibly complex analysis would be required to evaluate the resulting heat, but no such analysis was mentioned in the publication by Myhre et al. Even worse, Myhre et al added more atmospheric modeling in determining the primary effect including the effects of clouds.

These publications cannot be viewed as honest. They lack a consistent logic and fabricate conclusions with no scientific method of arriving at such conclusions. Furthermore, these publications are not science as the acquisition of evidence, since modeling is the projection of assumptions with no method of acquiring evidence. Modeling may be a tool for sociologists and politicians but has no place in science. Science attempts to verify through reproducible evidence, while modeling is nothing but an expression of opinions with no new evidence being acquired.

Even after Myhre et al supposedly determined the primary effect (said to be 5.35 times the natural log of the final carbon dioxide concentration divided by the prior concentration, a three-component fudge factor), there was no known mechanism for carbon dioxide (or any greenhouse gas) creating global warming.
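The "fudge factor" referred to is the Myhre et al. (1998) simplified expression ΔF = 5.35 ln(C/C0) W/m². A minimal computation shows where the widely quoted ~3.7 W/m² forcing figure for a CO2 doubling comes from:

```python
import math

def co2_forcing(C, C0):
    """Radiative forcing (W/m^2) from the Myhre et al. (1998) logarithmic fit:
    delta_F = 5.35 * ln(C / C0)."""
    return 5.35 * math.log(C / C0)

# For any doubling of CO2 (e.g. 280 ppm -> 560 ppm) the fit gives ~3.7 W/m^2:
dF_2x = co2_forcing(560.0, 280.0)
print(f"forcing for 2xCO2: {dF_2x:.2f} W/m^2")
```

Note the logarithm: each successive doubling adds the same forcing, which is why the formula alone says nothing about the resulting temperature change without further assumptions about feedbacks.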

In 2001, three years after Myhre et al's publication, the IPCC described the mechanism this way: "Carbon dioxide absorbs infrared radiation in the middle of its 15 mm [sic] band to the extent that radiation in the middle of this band cannot escape unimpeded: this absorption is saturated. This, however, is not the case for the band's wings. It is because of these effects of partial saturation..."

Saturation means all available radiation gets used up. Heinz Hug stated in his publication that saturation occurs within 10 meters at the center of the absorption curve for the 15µm band. On the shoulders of the absorption curve are molecules with stretched bonds, causing them to absorb at slightly altered wavelengths. It is supposedly these molecules which do the heating for greenhouse gases, because they do not use up all available radiation, and therefore more of the gas absorbs more radiation.

Scientists said that 5% of the CO2 molecules were effective on the shoulders for creating global warming. This roughly means that radiation would travel 20 times farther before being absorbed. But 20 times 10 meters is only 200 meters. Air mixes in such a short distance, which means there is no temperature change. Absorbing radiation in 200 meters is no different than absorbing it in 10 meters. In other words, the 5% claim was nothing but a fake statement for rationalizing. The shamelessness and gall of making up this subject on whim and then claiming it is science is unprecedented. Real scientists are not that way.
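The "travels 20 times farther" arithmetic follows from exponential (Beer-Lambert) attenuation, taking Hug's ~10 m band-centre figure and the 5% shoulder fraction as given assumptions:

```python
import math

# Beer-Lambert attenuation: I(z) = I0 * exp(-z / L), where L is the
# absorption length. Assumptions taken from the text above: ~10 m absorption
# length at the band centre (Hug) and 5% of molecules effective on the shoulders.
L_centre = 10.0                # m, absorption length at band centre
fraction_effective = 0.05      # shoulder molecules as a fraction of the total

# Absorption length scales inversely with the absorbing fraction:
L_shoulder = L_centre / fraction_effective
print(f"shoulder absorption length: {L_shoulder:.0f} m")

def transmitted(z, L):
    """Fraction of radiation surviving a path of length z metres."""
    return math.exp(-z / L)

print(f"transmitted after 200 m on the shoulder: {transmitted(200.0, L_shoulder):.3f}")
```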

Since this mechanism would not stand up to criticism, scientists changed their mind about the mechanism a few years ago and said the real mechanism occurs about 9 kilometers up in the atmosphere. (The normal atmosphere, troposphere, goes up about 17 km average.) Trivial rationalizations were used, mainly that the absorption bands get narrower at lower air pressure, so they [allegedly] don't overlap with water vapor.

There are two major problems with the analysis for 9 km up. One, there is not much space left for adding heat. And two, the temperature increase required for radiating the heat back down to the surface is at least 24°C up there for each 1°C increase near the surface, not accounting for oceans. Oceans will absorb the heat for centuries or millennia, which means 70% of the heat disappears during human influences. So the total would need to be 80°C at 9 km up to create the claimed 1°C near ground level. No temperature increase has been detected at 9 km up due to carbon dioxide [the missing "hot spot"].

Notice that the fakes didn't have a mechanism and didn't know where it was occurring 30 years after the first models were constructed in 1979 (said to be only off by 15%) and 10 years after the fudge factor was contrived for pinning down the primary effect, which the mechanism is supposed to represent. How could they get the primary effect (fudge factor) without knowing whether it was occurring at ground level or 9 km up?

Why do nonscientists assume it is self-evident that greenhouse gases create global warming, when scientists cannot describe a mechanism? Extreme over-simplification appears to be the reason. They assume that absorbing radiation is producing heat. Guess what. A jar of pickles absorbs radiation but it doesn't heat the kitchen. Total heat effects are complex, and they equilibrate.

What really happens is that the planet is cooled by radiation which goes around greenhouse gases, not through them.[I disagree in part, some IR is lost directly to space through the 'atmospheric window' but also greenhouse gases enhance cooling of the troposphere, tropopause, stratosphere, mesosphere, and thermosphere by increasing radiative surface area to space] Cooling results in an equilibrium temperature which is independent of how heat gets into the atmosphere. It means greenhouse gases have no influence upon the temperature of the planet.

The amount of carbon dioxide in the atmosphere is so low that all biology is on the verge of becoming extinct due to a shortage of CO2 which is needed for photosynthesis. There was twenty times as much CO2 in the atmosphere when modern photosynthesis evolved. Oceans continuously absorb CO2 and convert it into calcium carbonate and limestone. The calcium never runs out, and the pH of the oceans never drops below 8.1 for this reason. It's the pH which calcium carbonate buffers at. If not, why hasn't four billion years been long enough to get there?

Gary Novak

New paper finds a non-linear relationship between sunspots and global temperatures

A new paper by Dr. Nicola Scafetta published in Physica A: Statistical Mechanics and its Applications rebuts the assertion that sunspots and global temperatures are not related. Dr. Scafetta instead finds a non-linear relation between sunspots and temperatures that "can be recognized only using specific techniques of analysis that take into account non-linearity and filtering of the multiple climate change contributions." He notes that "Multiple evidences suggest that global temperatures and sunspot numbers are quite related to each other at multiple time scales. Thus, they are characterized by cyclical fractional models. However, solar and climatic indexes are related to each other through complex and non-linear processes." A simple linear model based upon the "sunspot integral" and ocean oscillations explains 95% of climate change over the past 400 years.
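As a sketch of what a "sunspot integral" means in practice, one can accumulate the departure of the sunspot number from an assumed equilibrium value (the series and the equilibrium value below are synthetic, for illustration only):

```python
import numpy as np

# Sketch of the "sunspot integral" idea: the running sum of the sunspot
# number's departure from a chosen equilibrium value. The sunspot series
# here is a fake ~11-year sinusoidal cycle; the equilibrium value of 40
# is a hypothetical "break-even" activity level, not a published figure.
years = np.arange(1850, 2015)
sn = 80 + 60 * np.sin(2 * np.pi * (years - 1850) / 11.0)
equilibrium = 40.0

# Positive contributions when activity exceeds equilibrium, negative below it;
# the integral therefore rises during persistently active solar epochs.
sunspot_integral = np.cumsum(sn - equilibrium)
print(f"final value of the integral: {sunspot_integral[-1]:.0f}")
```

The choice of equilibrium value matters: it sets where the integral flattens versus rises, which is one reason such models are sensitive to their calibration.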

Dr. Scafetta's model [black line] based upon solar and anthropogenic forcing is performing much better than the IPCC models [green band] which dismiss the role of the Sun in climate change

Global temperatures and sunspot numbers. Are they related? Yes, but non-linearly. A reply to Gil-Alana et al. (2014)

Gil-Alana et al. claimed that the sunspots and Earth’s temperature are unrelated.
I show that Gil-Alana et al.’s claims are based on a number of misunderstandings.
The global surface temperature does not present any “zero” frequency “pole” or “singularity”.
The temperature signature is made of oscillations plus an anthropogenic component.
Appropriate solar proxy models demonstrate the existence of a significant sun-climate relation.


Recently Gil-Alana et al. (2014) compared the sunspot number record and the temperature record and found that they differ: the sunspot number record is characterized by a dominant 11-year cycle while the temperature record appears to be characterized by a “singularity” or “pole” in the spectral density function at the “zero” frequency. Consequently, they claimed that the two records are characterized by substantially different statistical fractional models and rejected the hypothesis that the Sun influences significantly global temperatures. I will show that: (1) the “singularity” or “pole” in the spectral density function of the global surface temperature at the “zero” frequency does not exist—the observed pattern derives from the post-1880 warming trend of the temperature signal and is a typical misinterpretation that discrete power spectra of non-stationary signals can suggest; (2) appropriate continuous periodograms clarify the issue and also show a signature of the 11-year solar cycle, which since 1850 has an average period of about 10.4 years, and of many other natural oscillations; (3) the solar signature in the surface temperature record can be recognized only using specific techniques of analysis that take into account non-linearity and filtering of the multiple climate change contributions; (4) the post-1880 temperature warming trend cannot be compared or studied against the sunspot record and its 11-year cycle, but requires solar proxy models showing short and long scale oscillations plus the contribution of anthropogenic forcings, as done in the literature. Multiple evidences suggest that global temperatures and sunspot numbers are quite related to each other at multiple time scales. Thus, they are characterized by cyclical fractional models. However, solar and climatic indexes are related to each other through complex and non-linear processes.
Finally, I show that the prediction of a semi-empirical model for the global surface temperature based on astronomical oscillations and anthropogenic forcing proposed by Scafetta since 2009 has, up to date, been successful.
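Point (1) of the abstract, that a warming trend in a finite record can masquerade as a zero-frequency "pole" in a discrete power spectrum, can be demonstrated with synthetic data (a sketch of the general effect, not the paper's actual method):

```python
import numpy as np

# Synthetic series: a warming-like linear trend plus an 11-unit "solar" cycle.
n = 1024
t = np.arange(n)
trend = 0.005 * t
cycle = 0.2 * np.sin(2 * np.pi * t / 11.0)

def power_spectrum(x):
    """Simple discrete power spectrum of a demeaned series."""
    x = x - x.mean()
    return np.abs(np.fft.rfft(x)) ** 2

p_trend = power_spectrum(trend + cycle)
low_bin = p_trend[1]                        # lowest nonzero frequency
cycle_bin = p_trend[int(round(n / 11.0))]   # bin nearest the 11-unit cycle
print(f"power at lowest frequency: {low_bin:.1f}")
print(f"power at the 11-unit cycle: {cycle_bin:.1f}")

# Removing the linear trend eliminates the apparent zero-frequency "pole":
fit = np.polyval(np.polyfit(t, trend + cycle, 1), t)
p_detrended = power_spectrum(trend + cycle - fit)
print(f"after detrending, lowest-frequency power: {p_detrended[1]:.1f}")
```

With the trend present, the lowest-frequency bin dominates the spectrum by orders of magnitude, exactly the pattern that can be misread as a spectral singularity; after detrending, the 11-unit cycle stands out instead.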

New paper finds another non-hockey-stick in Russian sub-Arctic, cooling over 4,500 years

A paper published today in Global and Planetary Change finds another non-hockey-stick in the Russian sub-Arctic with reconstructed temperatures showing a cooling trend over the past 4,500 years since the Holocene Climate Optimum. The paper adds to over 1,000 other worldwide non-hockey-sticks published in the scientific literature.

Proxy temperatures [2nd graph from left] show a decreasing trend from 4,500 years before the present [BP] to the end of the record in the 20th century


Especially in combination with other proxies, the oxygen isotope composition of diatom silica (δ18Odiatom) from lake sediments is useful for interpreting past climate conditions. This paper presents the first oxygen isotope data of fossil diatoms from Kamchatka, Russia, derived from sediment cores from Two-Yurts Lake (TYL). For reconstructing late Holocene climate change, palaeolimnological investigations also included diatom, pollen and chironomid analysis.
The most recent diatom sample (δ18Odiatom = + 23.3‰) corresponds well with the present day isotopic composition of the TYL water (mean δ18O = -14.8‰) displaying a reasonable isotope fractionation in the system silica-water. Nonetheless, the TYL δ18Odiatom record is mainly controlled by changes in the isotopic composition of the lake water. TYL is considered as a dynamic system triggered by differential environmental changes closely linked with lake-internal hydrological factors.
The diatom silica isotope record displays large variations in δ18Odiatom from + 27.3‰ to + 23.3‰ from about ~ 4.5 kyrs BP until today. A continuous depletion in δ18Odiatom of 4.0‰ is observed in the past 4.5 kyrs, which is in good accordance with other hemispheric environmental changes (i.e. a summer insolation-driven Mid- to Late Holocene cooling). The overall cooling trend is superimposed by regional hydrological and atmospheric-oceanic changes. These are related to the interplay between Siberian High and Aleutian Low as well as to the ice dynamics in the Sea of Okhotsk. Additionally, combined δ18Odiatom and chironomid interpretations provide new information on changes related to meltwater input to lakes. Hence, this diatom isotope study provides further insight into hydrology and climate dynamics of this remote, rarely investigated area.
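The "reasonable isotope fractionation" mentioned in the abstract is simply the offset between the diatom silica and lake-water δ18O values quoted there:

```python
# Offset between diatom silica and lake-water oxygen isotope ratios,
# using the two values quoted in the abstract (permil = parts per thousand):
d18O_diatom = 23.3     # permil, most recent diatom sample
d18O_water = -14.8     # permil, mean Two-Yurts Lake water
fractionation = d18O_diatom - d18O_water
print(f"apparent silica-water fractionation: {fractionation:.1f} permil")
```

An offset of roughly +38‰ is broadly in the range expected for silica-water fractionation at cold lake temperatures, which appears to be what the authors mean by "reasonable".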

New UN propaganda "Bad weather for 2050 TV forecast"

A new climate propaganda video from the UN WMO claims to be a first edition of "weather reports from the future" predicting "floods, storms and searing heat from Arizona to Zambia within four decades, as part of a United Nations [propaganda] campaign on Monday to draw attention to a U.N. summit this month on fighting global warming."

Dr. Roger Pielke's button that he has to push far too often

Meanwhile, the UN IPCC reports find no established links between global warming and extreme weather, floods, storms... and the IPCC climate models upon which these "weather reports from the future" are based have already been falsified at confidence levels of 98%+ due to exaggerating global warming by a factor of 2-4 times. 

Bad weather for 2050 as TV forecasters imagine climate change

* Floods, heatwaves, droughts part of normal weather in 2050
* Campaign to draw attention to Sept. 23 U.N. summit
OSLO, Sept 1 (Reuters) - Imaginary television weather forecasts predicted floods, storms and searing heat from Arizona to Zambia within four decades, as part of a United Nations campaign on Monday to draw attention to a U.N. summit this month on fighting global warming.
"Miami South Beach is under water," one forecaster says in a first edition of "weather reports from the future", a series set in 2050 and produced by companies including Japan's NHK, the U.S. Weather Channel and ARD in Germany.
The U.N.'s World Meteorological Organization (WMO), which invited well-known television presenters to make videos to be issued before a U.N. summit on Sept. 23, said the scenarios were "imaginary but realistic" for a warming world.
A Zambian forecaster, for instance, describes a severe heatwave and an American presenter says: "the mega-drought in Arizona has claimed another casualty".
Some, however, show extreme change. One Bulgarian presenter shows a red map with temperatures of 50 degrees C (122 Fahrenheit) - far above the temperature record for the country of 45.2C (113F) recorded in 1916.
"Climate change is affecting the weather everywhere. It makes it more extreme and disturbs established patterns. That means more disasters; more uncertainty," U.N. Secretary-General Ban Ki-moon said in a statement.
Ban has asked world leaders to make "bold pledges" to fight climate change at the meeting in New York. The summit is meant as a step towards a deal by almost 200 nations, due by the end of 2015, to slow global warming.
A U.N. report last year concluded that it is at least 95 percent probable that human activities, rather than natural variations in the climate, are the main cause of global warming since 1950.
A 2011 survey by George Mason University, however, found that TV meteorologists were less likely than most climate scientists to reckon that human activity is the main cause of warming.
For a link to the WMO weather reports, click on:

New paper links solar activity to center of mass of solar system

A new paper by Dr. Willie Soon et al published in New Astronomy finds solar activity "corresponds remarkably well" with the Sun's orbital movement around the barycenter [center of mass] of the solar system. The authors "find that the maximum variations of [the Sun's specific potential energy storage] correspond remarkably well with the occurrences of well-documented Grand Minima (GM) solar events throughout the available proxy solar magnetic activity records for the past 1000 yr."
The paper finds "Grand Minima are related to the Sun's closest approaches to the barycenter," and predicts another Grand Minimum in solar activity will occur around 2150 AD. Several other researchers have linked changes in solar activity to the Sun's orbital position relative to the barycenter of the solar system, which along with multiple solar amplification mechanisms may possibly one day lead to a "unified grand theory of Earth's climate."

Computation of the Sun's orbit relative to the center of mass [barycenter] of the solar system

Solar system barycenter 

Sun's potential energy [PE] is related to its position relative to the barycenter of the solar system, as shown in 2nd graph, and corresponds "remarkably well" with the minima of reconstructed sunspot numbers [SN] shown in bottom graph
Predicted solar Grand Minima based upon the theory


A physical model of Sun–Planets Interaction is described.
Solar activity Grand Minima (GM) are related to the Sun’s closest approaches to barycenter.
There are several candidate GM events in the next 1000 yr.


We numerically integrate the Sun’s orbital movement around the barycenter of the solar system under the persistent perturbation of the planets from the epoch J2000.0, backward for about one millennium, and forward for another millennium to 3000 AD. Under the Sun–Planets Interaction (SPI) framework and interpretation of Wolff and Patrone (2010), we calculated the corresponding variations of the most important storage of the specific potential energy (PE) within the Sun that could be released by the exchanges between two rotating, fluid-mass elements that conserve its angular momentum. This energy comes about as a result of the roto-translational dynamics of the cell around the solar system barycenter. We find that the maximum variations of this PE storage correspond remarkably well with the occurrences of well-documented Grand Minima (GM) solar events throughout the available proxy solar magnetic activity records for the past 1000 yr. It is also clear that the maximum changes in PE precede the GM events in that we can identify precursor warnings to the imminent weakening of solar activity for an extended period. The dynamical explanation of these PE minima is connected to the minima of the Sun’s position relative to the barycenter as well as the significant amount of time the Sun’s inertial motion revolving near and close to the barycenter. We presented our calculation of PE forward by another 1000 yr until 3000 AD. If the assumption of the solar activity minima corresponding to PE minima is correct, then we can identify quite a few significant future solar activity Grand Minima events with a clustering of PE minima pulses starting at around 2150 AD, 2310 AD, 2500 AD, 2700 AD and 2850 AD.
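For a sense of scale, a two-body sketch (Sun plus Jupiter alone, circular orbit, rounded constants) shows why the Sun's excursions about the solar-system barycenter are comparable to its own radius; with the other giant planets included the offset can reach roughly two solar radii:

```python
# Toy barycenter calculation: with Jupiter alone, the Sun and Jupiter orbit
# their common center of mass. Rounded constants, for scale only; the paper
# itself integrates the full multi-planet problem.
M_sun = 1.989e30      # kg, solar mass
M_jup = 1.898e27      # kg, Jupiter mass
a_jup = 7.785e11      # m, Jupiter's semi-major axis
R_sun = 6.957e8       # m, solar radius

# Distance of the two-body barycenter from the Sun's center:
d = a_jup * M_jup / (M_sun + M_jup)
print(f"Sun-barycenter distance (Jupiter only): {d / R_sun:.2f} solar radii")
```

Because Jupiter alone displaces the barycenter to just outside the solar surface, the alignments and misalignments of the giant planets make the Sun's barycentric path a complex looping curve, which is the quantity the authors integrate over two millennia.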

From the Annals of Failed Global Warming Predictions: Global cooling

WUWT has a post today

"Great moments in climate prediction: ‘World will warm faster than predicted in next five years, study warns’"

That now failed 2009 headline is from Duncan Clark in the Guardian.
See the humorous and 100% accurate rebuttal at WUWT.

The authors of this alleged "skeptic silencing" paper were Judith Lean [of "Judithgate" fame] and David Rind of NASA GISS, who very confidently predicted global temperatures would warm 0.15C plus or minus 0.03C [i.e. warm by 0.12C to 0.18C] during the five-year period 2009 to 2014.

The trends in the following observational datasets show that over the five years 2009-2014 the globe instead cooled by:
HadCRUT4 surface data: -0.044C
RSS satellite data: -0.09C
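Trends like those above are ordinary least-squares slopes through monthly anomaly series, scaled to the length of the window. A minimal sketch of the computation (the anomaly values in the example are synthetic placeholders, not actual HadCRUT4 or RSS data):

```python
import random

def trend_over_period(anomalies, months_per_year=12):
    """OLS slope in degrees C per year for a monthly anomaly series."""
    n = len(anomalies)
    t = [i / months_per_year for i in range(n)]  # time in years
    t_mean = sum(t) / n
    a_mean = sum(anomalies) / n
    cov = sum((ti - t_mean) * (ai - a_mean) for ti, ai in zip(t, anomalies))
    var = sum((ti - t_mean) ** 2 for ti in t)
    return cov / var

# Example with synthetic data: a slight cooling trend buried in noise.
random.seed(0)
series = [0.30 - 0.01 * (i / 12) + random.gauss(0, 0.05) for i in range(60)]
slope = trend_over_period(series)  # degrees C per year
total_change = slope * 5           # net change over the 5-year window
```

Note that over windows as short as five years the noise term dominates, which is why such trend figures carry wide uncertainty bands.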

Further, the paper predicted there will be a "pause" in warming due to low solar activity for the subsequent five years, 2014-2019, with a temperature change of only 0.03C +/- 0.01C.

Ironically, this paper, which the Guardian rag claimed in 2009 would "silence global warming skeptics," was not only wrong about the predicted anthropogenic warming, but also [possibly correctly] predicted that low solar activity will lead to a "pause" in warming from 2014-2019.

Graph from Lean & Rind paper with predicted warming from the NASA GISS climate model shown in top graph to 2014 [labeled as "A"] and to 2019 [labeled as "B"]. The globe actually cooled 2009-2014. Lean & Rind predicted a "pause" in warming will occur 2014-2019 due to low solar activity. 

Saturday, August 30, 2014

New paper finds climate change is explained by...fractals

A Mandelbrot multifractal which sorta kinda looks like the blade of a hockey stick, followed by a "pause"

A paper published today in Theoretical and Applied Climatology finds the global monthly temperature anomalies over the past 162 years, 1850-2012, are "surprisingly" well described by a simple mathematical model of fractals with multiple exponents, so-called "multifractals." Multifractals can be used to describe complex nonlinear phenomena in the real world, including chaos:

"A multifractal system is a generalization of a fractal system in which a single exponent (the fractal dimension) is not enough to describe its dynamics; instead, a continuous spectrum of exponents (the so-called singularity spectrum) is needed.[1] 
Multifractal systems are common in nature, especially geophysics. They include fully developed turbulence, stock market time series, real world scenes, the Sun’s magnetic field time series, heartbeat dynamics, human gait, and natural luminosity time series. Models have been proposed in various contexts ranging from turbulence in fluid dynamics to internet traffic, finance, image modeling, texture synthesis, meteorology, geophysics and more. The origin of multifractality in sequential (time series) data has been attributed to mathematical convergence effects related to the central limit theorem that have as foci of convergence the family of statistical distributions known as the Tweedie exponential dispersion models,[2] as well as the geometric Tweedie models.[3] The first convergence effect yields monofractal sequences and the second convergence effect is responsible for variation in the fractal dimension of the monofractal sequences.[4]
From a practical perspective, multifractal analysis uses the mathematical basis of multifractal theory to investigate datasets, often in conjunction with other methods of fractal analysis and lacunarity analysis. The technique entails distorting datasets extracted from patterns to generate multifractal spectra that illustrate how scaling varies over the dataset. The techniques of multifractal analysis have been applied in a variety of practical situations such as predicting earthquakes and interpreting medical images."
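The multifractal analysis in the paper builds on detrended fluctuation analysis (DFA): cumulate the series into a "profile," detrend it within windows of varying size s, and read the scaling exponent off the log-log slope of the fluctuation function F(s). A minimal pure-Python sketch of the monofractal (q = 2) case — full MF-DFA repeats the fluctuation step over a range of q values to obtain the generalized Hurst exponent h(q):

```python
import math
import random

def dfa_exponent(series, scales=(8, 16, 32, 64, 128)):
    """Estimate the DFA scaling exponent with linear detrending (DFA-1)."""
    mean = sum(series) / len(series)
    profile, total = [], 0.0
    for x in series:
        total += x - mean
        profile.append(total)              # cumulative sum ("profile")
    log_s, log_f = [], []
    for s in scales:
        n_seg = len(profile) // s
        if n_seg < 2:
            continue
        var_sum = 0.0
        for seg in range(n_seg):
            y = profile[seg * s:(seg + 1) * s]
            t = list(range(s))
            # least-squares linear detrend within the segment
            t_mean = (s - 1) / 2
            y_mean = sum(y) / s
            cov = sum((ti - t_mean) * (yi - y_mean) for ti, yi in zip(t, y))
            var_t = sum((ti - t_mean) ** 2 for ti in t)
            slope = cov / var_t
            var_sum += sum((yi - (y_mean + slope * (ti - t_mean))) ** 2
                           for ti, yi in zip(t, y)) / s
        log_s.append(math.log(s))
        log_f.append(0.5 * math.log(var_sum / n_seg))  # log F(s)
    # the DFA exponent is the slope of log F(s) versus log s
    n = len(log_s)
    sm, fm = sum(log_s) / n, sum(log_f) / n
    num = sum((a - sm) * (b - fm) for a, b in zip(log_s, log_f))
    den = sum((a - sm) ** 2 for a in log_s)
    return num / den

# White noise has no long-term memory, so its exponent should sit near 0.5;
# long-term-correlated series like the temperature anomalies score above 0.5.
random.seed(1)
noise = [random.gauss(0, 1) for _ in range(4096)]
```

An exponent well above 0.5 is the "long-term memory" signature the paper reports for the global anomaly record.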

According to the IPCC, only man-made CO2 can possibly explain the global temperature record since 1950. However, IPCC models are unable to model natural variability including ocean oscillations, solar amplification mechanisms, and internal variability, and thus these factors cannot be excluded as possible causes. The fractal model as described in this study might be a potential way to model natural internal variability of the climate system, and suggests that internal variability alone could account for climate change since 1850, without any contribution from man-made CO2. 

Could multifractals be another cause for the "pause?"

Multifractal characterization of global temperature anomalies

The global monthly temperature anomaly time series for the period 1850–2012 has been investigated in terms of multifractal detrended fluctuation analysis (MF-DFA). Various multifractal observables, such as the generalized Hurst exponent, the multifractal exponent, and the singularity spectrum, are extracted and are fitted to a generalized binomial multifractal model consisting of only two free parameters. The results of this analysis give a clear indication of the presence of long-term memory in the global temperature anomaly time series which causes a multifractal pattern in the data. We investigate the possible other source(s) of multifractality in the series by random shuffling as well as by surrogating the original series and find that the probability density function also contributes to the observed multifractal pattern along with the long-memory effect. Surprisingly, the temperature anomaly time series are well described by the two-parameter multifractal binomial model.
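The "two free parameters" refer to the generalized binomial multifractal model, whose generalized Hurst exponent has the closed form h(q) = 1/q − ln(a^q + b^q)/(q ln 2). A minimal sketch (the parameter values a and b in the example are illustrative, not the values fitted in the paper):

```python
import math

def generalized_hurst(q, a, b):
    """h(q) for the generalized binomial multifractal model with parameters a, b."""
    if abs(q) < 1e-9:
        # q -> 0 limit, obtained by expanding ln(a^q + b^q) around q = 0
        return -(math.log(a) + math.log(b)) / (2 * math.log(2))
    return 1.0 / q - math.log(a ** q + b ** q) / (q * math.log(2))

# For a == b the model collapses to a monofractal: h(q) is constant in q.
# For a != b (e.g. a = 0.6, b = 0.4), h(q) decreases with q, and the spread
# h(-q) - h(q) quantifies the strength of the multifractality.
```

Fitting just these two parameters to the measured h(q) curve is what the abstract calls "surprisingly" good agreement with the 162-year anomaly record.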