They’ve got incredible supercomputers to play with, yet our climate scientists still build their global anomaly charts from only a handful of stations (1,000 or so if the guys at Climate Monitor are not mistaken, or one on average for every land area the size of Bangladesh).
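The Bangladesh comparison checks out on the back of an envelope. A minimal sketch, using rough public figures for land areas rather than any actual station inventory:

```python
# Back-of-the-envelope check of the "one station per Bangladesh" claim.
# The figures below are approximate public values, not from any
# station inventory or dataset cited in this post.
LAND_AREA_KM2 = 149_000_000   # Earth's land surface, roughly
STATIONS = 1_000              # the station count quoted above
BANGLADESH_KM2 = 148_000      # area of Bangladesh, roughly

area_per_station = LAND_AREA_KM2 / STATIONS
ratio = area_per_station / BANGLADESH_KM2

print(f"{area_per_station:,.0f} km² per station")  # ~149,000 km²
print(f"≈ {ratio:.2f} Bangladeshes per station")   # ≈ 1
```

One station per Bangladesh-sized patch of land, near enough.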
You’d think world-rescuing climate specialists would have computers and bandwidth enough to handle data from a million stations by now, or a billion even. But they say they don’t even need that.
Why not? Because they have discovered that 1,200-km grids are ok. Why are they ok? Because
temperature anomaly patterns tend to be large scale
How so? Here’s the root of it all, from 1987:
The 1200-km limit is the distance at which the average correlation coefficient of temperature variations falls to 0.5 at middle and high latitudes and 0.33 at low latitudes
Who would believe that it all depends on a correlation coefficient of…0.5?