Bad Star, No Biscuit?

I imagine everyone has probably heard about the CRU hack by now. Given that it concerns private correspondence, I shan’t link to any of it – if you really want to know, Google is your friend. I also shan’t go into excessive detail about the whole affair, except to observe that the fruit-loop portion of the blogosphere have predictably gone fruit-loopy over it. There’s been particular nit-wittery over the use of the term ‘trick’, which in this case merely represented a technique for plotting data. In the same sense, using an IDL routine to create a .ps file is also a ‘trick’. So is doing a bar chart in Excel. It doesn’t imply any intent to deceive.

The skeptics, of course, don’t see it that way. Mind you, climate change ‘skepticism’ is somewhat misnamed. Climate change is the only thing a lot of these people show any skepticism toward. They often show none toward, say, Ronald Reagan’s economic policies, or Big Oil, or any possible downside to super-corporate commercialism. And their skepticism of climate change has something distinctly fanatical about it. Rather than being skepticism as such, it seems to be based more on a quasi-religious conviction that their beliefs couldn’t possibly be wrong.

You see, in this ‘debate’ the so-called ‘skeptics’ are the ones with all the money. Most scientific institutions are run on close to a shoe-string. (Here at my place of work, we frequently run out of Fairy Liquid in the office kitchen, because the budget is often that tight – or so we’re told, anyway.) Instrument time is expensive, telescope time is expensive, conferences cost … the money all goes. By contrast, some climate-change skeptics have corporate donors with deep pockets. Some others have personal fortunes. Some particularly-noxious individuals get large amounts of money from American religious think-tanks. (There’s a strain of ‘thought’ – I use the term loosely – in Dominionist Christianity that we have a moral requirement to burn oil – God has given us enough oil to last until the Rapture comes along and hoovers up all the Tru-Xians, therefore we shouldn’t even consider environmental degradation or resource depletion. What I think of this is, frankly, unprintable. It’s depressing how much money there is in wilful ignorance.)

Anyway, my point is, we in the reality-based community have a hard time competing with these well-oiled PR machines. We have nothing like their funds, their media presence or even as much time to spare on the issue. We can only begin to keep up because the factual evidence we have is so overwhelming. Also, that evidence is steadily getting better. Even as recently as five years ago, there was still reasonable scientific doubt over the scale and cause of climate change. That doubt is gone now – quite simply, new research has filled in most of the gaps.

One of the longest-running controversies, of course, is the role played by the Sun.

On an intuitive level it’s obvious enough. Walk outside on a sunny day and you’ll notice the extra warmth straight away. The Sun is the primary energy source for life on Earth, and changes in sunlight intensity caused by axial tilt drive the seasons. (Total number of daylight hours is of course also a significant factor in seasonal variation, although perhaps not as strong as you might think.) However, year-on-year, seasonal variations are pretty stable. We have some miserable summers, but we get good ones too. (Or at least, the parts of the world that aren’t the UK do.)

Of course, if the Sun’s output weren’t stable, that variability would drive the climate even more strongly than axial tilt does. However, the evidence seems to be that, in fact, our Sun is quite well-behaved.

But what about sunspots? one might ask. Well, here it gets complicated. Sunspots are darker regions on the solar surface; they’re darker because they’re cooler than the surrounding areas. (Luminosity is proportional to the fourth power of the temperature – double T and L jumps by a factor of 16!) If you saw one close up, it would actually be quite bright. They only look ‘dark’ due to the contrast. (Sunspots are typically around ~4,000 K, compared to ~5,500 K for the rest of the Sun. From that, a straightforward T_eff argument suggests that any given bit of sunspot should be only about 28% as bright as the surrounding ‘un-spotty’ Sun.) So, this would tend to suggest that a lot of sunspots would reduce solar luminosity. And a fall in energy reaching the Earth’s atmosphere should also cool it.
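If you want to check that 28% figure for yourself, it’s a one-line back-of-the-envelope calculation. Here’s a quick sketch in Python, using the rough temperatures quoted above (ballpark values, not precise measurements):

# Rough Stefan-Boltzmann estimate: how bright is a sunspot compared with the
# surrounding photosphere? Surface brightness scales as T^4, so the ratio is
# simply (T_spot / T_sun)^4. Temperatures are the approximate values quoted
# in the text, not precise measurements.
T_spot = 4000.0  # approximate sunspot temperature, in kelvin
T_sun = 5500.0   # approximate temperature of the 'un-spotty' photosphere, in kelvin

brightness_ratio = (T_spot / T_sun) ** 4
print(f"A sunspot is roughly {brightness_ratio:.0%} as bright as the normal surface")
# prints: A sunspot is roughly 28% as bright as the normal surface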

The Sun has an 11-year sunspot cycle, with troughs and peaks. We see lots of spots at peaks and few to none at minima. So, if sunspots are reducing the amount of heat reaching the Earth, then we should see a regular 11-year cycle in terrestrial temperatures.

We don’t.

So, what’s going on? It turns out there’s a catch – a ‘trick’, if you will, that the Sun plays on us. This trick comes in the form of faculae. Faculae are bright, patchy features that appear around sunspots and active regions. They are hotter than the Sun’s normal surface temperature, which means they are brighter than the rest of the Sun. In fact, sunspots usually come with enough faculae to compensate for their own dimming – usually slightly more than enough – leading to the rather-surprising result that the Sun is actually slightly more luminous at sunspot maximum!

In fact, the estimated luminosity change associated with the sunspot cycle is tiny, around ~0.095%. This will be entirely swamped by variations within the Earth’s atmosphere and also by the thermal time-lag of the oceans. (Water is a pain to heat up, as anyone who’s ever been kept waiting for a cuppa by a kettle will attest. But this also means water is a pain to cool down, too. Basically, a change this small would take a lot longer than 11 years to produce a visible effect on the Earth, due to the enormous thermal mass of the oceans.) Also, most of this variation is in the ultraviolet, which is largely blocked by our atmosphere anyway. Very little of it will seep down to the troposphere, which is the part of the atmosphere where our weather happens.

So, sunspots are out. But what about longer cycles?

Well, there is some evidence for a ~300-year long-term cycle. Part of this comes from the Maunder Minimum, between 1645 and 1715, when next to no sunspots were observed. (Note: this was genuinely due to a lack of sunspots, not a lack of observations. European and Chinese records actually agree on this one.) It also coincided with a period of bitterly cold winters in the Northern Hemisphere. There are accounts of the Thames freezing solid on a regular basis. On the face of it, you’d think the case was nailed, right?

Maybe not.

There are a couple of problems. First of all, although a period without sunspots might reduce the average luminosity a little (faculae, remember), this isn’t necessarily that significant. The dip in flux received at the Earth between sunspot maximum and sunspot minimum is about 1.3 W/m². If we assume a really deep, long sunspot minimum, then perhaps a dip of 2–3 W/m² is possible. But the flux would still be at least ~1363 W/m² (it’s normally around 1366 W/m²). We’re hardly talking a radical drop here, so it’s not clear to me that the Maunder Minimum can be described as the Sun ‘all but going out’ (to quote one particularly-uninformed commentator).
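In case the fractions aren’t obvious, here’s the same arithmetic spelled out, again as a quick Python sketch. The fluxes are the round numbers quoted above, and the 2–3 W/m² ‘deep minimum’ figure is the speculative upper end mentioned in the text, not a measurement:

# How big is the dip, as a fraction of the total flux at the top of the atmosphere?
TSI_NORMAL = 1366.0     # typical total solar irradiance, W/m^2
CYCLE_DIP = 1.3         # max-to-min dip over a normal 11-year cycle, W/m^2
DEEP_MINIMUM_DIP = 3.0  # speculative upper end for a long, deep minimum, W/m^2

print(f"Normal cycle dip:       {CYCLE_DIP / TSI_NORMAL:.3%}")         # ~0.095%
print(f"Deep minimum dip:       {DEEP_MINIMUM_DIP / TSI_NORMAL:.2%}")  # ~0.22%
print(f"Flux at a deep minimum: {TSI_NORMAL - DEEP_MINIMUM_DIP:.0f} W/m^2")  # ~1363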

The second problem is this business of the Thames freezing. I’m led to understand that perhaps this isn’t as spectacular as it sounds. Apparently, it tended to freeze around the bridges. Bridge designs in the 17th and 18th Centuries were cruder and bulkier than modern ones, and they slowed the water down a lot more. And of course, slower-moving water builds up ice a lot more easily. Also, it’s not absolutely unknown for ice to form on the Thames even in the modern day. Okay, it doesn’t freeze over, but if you go on a day trip to London in January or February, you will see some patches.

There is one killer thing with the Maunder Minimum suggestions, though: global temperatures. Quite simply, although Europe and America appear to have been colder, it’s not absolutely clear that the rest of the world was. A solar variation would tend to produce a global variation. Granted, atmospheric and ocean phenomena will blur the picture, as we discussed earlier, but the absence of any clear signal from the rest of the world tends to undermine the model all the same.

On the basis of all of this, it seems hard to sustain the simplistic argument of ‘bad sun, no biscuit’. While it certainly remains possible that the Maunder Minimum played a role in the Little Ice Age, it doesn’t appear to have been the only factor. Also, there is little clear evidence of a long-term variation in solar luminosity. (Well, on the extremely long term of course we run into red giant expansion … but 5 billion years down the line is too far ahead to be a serious concern.)

My point is that although the ‘it’s the Sun’s fault’ argument is one of the more-rational ‘climate-skeptical’ suggestions, it’s still wrong.
