From 1645 to 1715 the Sun entered an extended quiet period known as the Maunder minimum. During that time almost no sunspot activity was observed on the Sun. It was also a period of abnormally cold winters in Europe, known as the Little Ice Age. A similar but weaker period of solar quiet from 1796 to 1820, known as the Dalton minimum, overlaps with the childhood of Charles Dickens, whose books ingrained the idea of a “White Christmas” in English society. These events support the idea that sunspot activity could have an effect on global temperatures. But a new look at the Little Ice Age finds that the connection isn’t quite so clear.
The existence of the Maunder minimum is well established. Sunspot observations of the time were quite good, and they agree with indirect measures of solar activity such as the levels of cosmogenic isotopes like beryllium-10 found in ice cores from the period, which are higher during solar minima because a quieter Sun allows more cosmic rays to reach Earth. But what about those record cold temperatures?
A measure that’s often used is the number of frost fairs held in London. These were winter festivals held when the Thames froze over. There’s a higher number of recorded frost fairs during the Maunder minimum than at other times. Add to that historical reports of bitter cold, and you have an image of an unusually cold few decades. But when a team looked at the details of these historical records, things got a bit fuzzier. For one thing, there were years when the Thames froze over but no frost fair was held. Things like widespread disease, food shortages, and cultural or religious opposition could heavily influence whether frost fairs were held. Likewise, reports of unusual winter cold often describe a brief cold snap rather than an extended winter. Then there are reports of summer temperatures that would seem to contradict the “ice age” idea. Reports from both London and Paris tell of an unbearably hot summer in 1701, right in the middle of the Maunder minimum. If sunspots really had an ice age effect, both summers and winters should be unusually cool.
Fortunately we do have some objective temperature data. The Central England Temperature (CET) record has tracked monthly temperatures since 1659, and daily temperatures since 1772. It is the oldest set of instrument-based temperature measurements we have. We also have indirect measurements such as isotope ratios, tree rings and ice cores for the period. What we find is that temperatures during the period decreased by about 0.5 °C. The colder temperatures were also centered on Northern Europe, and global temperatures were not affected in the same way. So it was a real and noticeable effect, and it agrees with reports of the time, but it is hardly large enough to make the “little ice age” label very accurate.
It is still possible that solar activity does have some correlation with global temperatures, but it isn’t as strong or direct an effect as is implied by the legend of the Little Ice Age. And that’s perhaps the most important takeaway from this work. Personal experience of a cold winter or hot summer can be extremely compelling when talking about an issue like global climate change. It’s hard to accept global warming when you’re in the middle of a cold February. That’s why objective data matters, and that’s why the science of global climate change works.
Paper: Mike Lockwood et al., “Frost fairs, sunspots and the Little Ice Age,” Astronomy & Geophysics (2017)