Statistics is thus substituted for physics, is used to assign causality, and is made to pose material time series as temperature. In a classic of scientific non-sequiturs, the entire field of consensus proxy paleo-thermometry has decided that correlation equals causation, and also that correlation in the present proves causation in the past. This is physics by fiat. (Patrick Frank)
Causality in the past is inferred from correlation in the present
The hockey stick graph rests on the premise that a close resemblance to thermometer measurements in the period 1902-1980 (that is, a high correlation with the thermometer signal in the calibration period) means the proxy has behaved like a thermometer (cause and effect) throughout its whole life; in other words, that there is causality in the temperature - ring width relationship. In the same way, a proxy that does not resemble the thermometer values in the calibration period is deemed not to behave like a thermometer.
That is, at best, this graph is based on the inference of causality from a correlation: causality in past centuries is deduced from correlation in the present.
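The circularity is easy to demonstrate with synthetic data. Here is a minimal sketch (illustrative only; the pool size, threshold and target shape are my own choices, not anything from MBH): screen a large pool of pure-noise "proxies" by their calibration-period correlation with a rising target, and the survivors will "track the thermometer" by construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, n_calib, n_proxies = 200, 50, 1000
proxies = rng.standard_normal((n_proxies, n_years))   # pure white noise
target = np.linspace(0.0, 1.0, n_calib)               # rising "thermometer"

# Correlate each proxy with the target over the calibration window only
calib = proxies[:, -n_calib:]
r = np.array([np.corrcoef(p, target)[0, 1] for p in calib])

# Keep the proxies that "pass" screening; flip signs so all correlate positively
mask = np.abs(r) > 0.3
selected = proxies[mask] * np.sign(r[mask])[:, None]
mean_selected = selected.mean(axis=0)

# The survivors' average tracks the thermometer in the calibration window,
# even though every input was noise with no temperature information at all
r_calib = np.corrcoef(mean_selected[-n_calib:], target)[0, 1]
print(len(selected), round(r_calib, 2))
```

A few percent of the noise series pass the screen by chance, and their sign-aligned average correlates strongly with the target in the calibration window: the "thermometer-like behaviour" was selected for, not discovered.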
Why do the authors of the graph believe that the proxies they collected carry temperature information and are not just noise? Following exactly the same procedure and using the same data, could they reconstruct rainfall, the CO2 concentration in the atmosphere, the light received by the tree, or any other variable that comes to mind? They only need reliable present-day data for that variable and they can produce those reconstructions. And if they get a hockey stick, one possibility is that they have successfully reconstructed that signal in the past. But another possibility is that they are playing with noise, and that depending on the specific noise they use, they get a different outcome.
The final rise in the graph doesn’t come from the data: it is created by the mathematical processing. And the shape of the handle is determined by the shape of those proxies that are important in the reconstruction of the twentieth century.
This is very important: the proxies used in the creation of the graph didn’t say that the temperature had risen in the last century. This is easy to check: average them. It is the processing, i.e. the MBH algorithm, that forces the final rise in the reconstruction. The computer program used in MBH98/99 seeks the best way to linearly combine the 112 proxies to regenerate the shape of the thermometer temperatures in the 20th century. And that shape can almost always be achieved if you have an ample set of signals. The uptick is not information that has been extracted from the proxies: it is a shape that is imprinted by the MBH algorithm on the final result (whenever the data allows it). The uptick of the MBH hockey stick in the 20th century is not real.
I find these ideas really important, so let me insist a little. Say I have 110 proxies that are pure random noise, one that rises in the twentieth century and one that declines in the twentieth century (note that the one that goes down will be converted into a rise, since the sign of the correlation with the thermometer measurements is ignored in MBH). If I average the proxies, I will conclude that my signals are essentially noise, with no visible pattern and no special trend in the twentieth century. But the processing in MBH98/99 doesn’t average proxies: it seeks the best way to combine the 112 proxies to generate a rise in the twentieth century that resembles the instrumental measurements as closely as possible. It is the mathematical processing that generates a rise in the twentieth century, but not because that rise is deduced from the proxies: the rise exists in the graph because the algorithm itself forces the graph to have that shape.
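The averaging-versus-fitting contrast can be sketched in a few lines (a deliberate simplification, not the actual MBH regression): 110 noise proxies plus one rising and one declining series. The plain average shows no 20th-century signal, while a least-squares combination fitted to the thermometer reproduces the rise almost perfectly, because with 112 weights and only 80 calibration points the system is underdetermined and a near-exact fit is guaranteed. The sign of the declining proxy is absorbed automatically by its negative weight.

```python
import numpy as np

rng = np.random.default_rng(1)
n_years, n_calib = 600, 80
trend = np.linspace(0.0, 2.0, n_calib)

proxies = rng.standard_normal((112, n_years))     # 110 rows stay pure noise
proxies[110, -n_calib:] += trend                  # one rises in the 20th century
proxies[111, -n_calib:] -= trend                  # one declines
thermometer = trend + 0.1 * rng.standard_normal(n_calib)

# Plain average: the two trends cancel and the noise dominates
avg = proxies.mean(axis=0)
r_avg = np.corrcoef(avg[-n_calib:], thermometer)[0, 1]

# Best linear combination over the calibration window: 112 unknowns,
# 80 equations, so the least-squares fit is essentially perfect
w, *_ = np.linalg.lstsq(proxies[:, -n_calib:].T, thermometer, rcond=None)
fit = proxies.T @ w
r_fit = np.corrcoef(fit[-n_calib:], thermometer)[0, 1]
print(round(r_avg, 2), round(r_fit, 2))
```

The averaged proxies are uncorrelated with the thermometer; the fitted combination matches it almost exactly. The rise in the fit comes from the optimization, not from any consensus among the proxies.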
The proof that this rise is not real is that, as Stephen McIntyre and Ross McKitrick demonstrated, the MBH98/99 code generates a graph with a rise in the 20th century even when red noise (low-pass filtered noise) is used instead of the original proxies. And the rest of the graph is basically horizontal because the specific proxies that are most successful in helping to reconstruct the shape of the calibration period happen to be flat in the remaining centuries. The Gaspé cedars and the NOAMER PC1 stand out in this regard, although the latter shouldn’t have had a hockey stick shape: it acquired one thanks to the use of a decentered pseudo-PCA instead of a conventional PCA.
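The decentering effect can be reproduced in miniature (a sketch inspired by the McIntyre and McKitrick demonstration, with my own parameter choices; the series counts echo the NOAMER network but the noise model here is plain AR(1)): feed persistent red noise into a PCA centered only on the calibration window, as MBH did, and compare the leading PC with that of a conventionally centered PCA.

```python
import numpy as np

rng = np.random.default_rng(2)
n_years, n_calib, n_series, rho = 581, 79, 70, 0.9

# AR(1) red noise: x[t] = rho * x[t-1] + e[t]
noise = rng.standard_normal((n_series, n_years))
series = np.zeros_like(noise)
for t in range(1, n_years):
    series[:, t] = rho * series[:, t - 1] + noise[:, t]

def pc1(data, mean_window):
    # Subtract each series' mean over mean_window only, then take the
    # leading right singular vector (the first PC's time profile)
    centered = data - data[:, mean_window].mean(axis=1, keepdims=True)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]

def hockey_index(pc):
    # How far the calibration-era mean sits from the long-term mean
    return abs(pc[-n_calib:].mean() - pc.mean()) / pc.std()

hsi_decentered = hockey_index(pc1(series, slice(-n_calib, None)))  # MBH-style
hsi_centered = hockey_index(pc1(series, slice(None)))              # conventional
print(round(hsi_decentered, 2), round(hsi_centered, 2))
```

Centering on the calibration window alone rewards whichever noise series happen to deviate there, so the first PC of pure red noise acquires a blade in the calibration era that the conventionally centered PC1 lacks.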
Many noise signals + some signals shaped like a hockey stick => The hockey stick shape is guaranteed by the algorithm. The MBH algorithm is a hockey stick generating machine.
we conclude unequivocally that the evidence for a “long-handled” hockey stick (where the shaft of the hockey stick extends to the year 1000 AD) is lacking in the data. (McShane and Wyner)
The graph didn’t even pass the quality control
The graph did not pass the verification step, and the full verification statistics were not published in the article. Emulations of the hockey stick graph by other authors showed that the verification R² was very low, i.e. the reconstruction lacked statistical significance. The hockey stick has no scientific validity, but this fact was hidden.
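Why the verification statistic matters can be seen with hypothetical data (none of this is the actual MBH data or code): with many noise proxies and a short calibration window, the calibration fit is near-perfect while the R² over a held-out verification period collapses.

```python
import numpy as np

rng = np.random.default_rng(3)
n_verif, n_calib = 40, 60
proxies = rng.standard_normal((112, n_verif + n_calib))   # pure noise
temperature = np.linspace(0.0, 1.0, n_verif + n_calib)    # rising record

calib_t = temperature[n_verif:]   # later segment, used for calibration
verif_t = temperature[:n_verif]   # earlier segment, held out for verification

# Calibrate: best linear combination of 112 noise proxies over 60 points
w, *_ = np.linalg.lstsq(proxies[:, n_verif:].T, calib_t, rcond=None)

def r2(y, yhat):
    return 1.0 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

r2_calib = r2(calib_t, proxies[:, n_verif:].T @ w)
r2_verif = r2(verif_t, proxies[:, :n_verif].T @ w)
print(round(r2_calib, 2), round(r2_verif, 2))
```

The calibration R² is essentially 1 because the fit is underdetermined, yet the same weights predict nothing in the verification period. A high calibration fit together with a near-zero (or negative) verification R² is precisely the signature of a reconstruction built on noise, which is why the verification statistic is the quality control that matters.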
But we are not done: in the next posts we will learn about “the trick to hide the decline”. Or perhaps we should say “the tricks to hide the decline”, because it’s not just one trick.