In my last post, I made the point that techno-optimists, such as Ray Kurzweil, see technological change transforming economies through the exponential growth of productivity as the present century progresses. Critically, the analysis of Kurzweil and his fellow travellers makes no mention of societal costs, the so-called externalities in the language of economics. Each innovation or invention is treated as self-contained: it overcomes a particular problem without creating any secondary problems in another part of the system.
Unfortunately, this tunnel vision of the benefits of technology does, on many occasions, not correspond to the actual historical record. One technology I have in mind is Thomas Midgley Jr.'s creation of a compound known as chlorofluorocarbon (CFC-12), better known as Freon. CFCs are a classic Kurzweil-type solution to a particular problem, in this case the need for a substitute for the highly poisonous gases used up until the 1930s for refrigeration. At the time of their creation, and for many years afterwards, CFCs were believed to be inert and totally harmless to human health. In reality, as the CFCs accumulated in the upper atmosphere, they led to the creation of the Antarctic ozone hole. The journalist and author Dianne Dumanoski, in her book "The End of the Long Summer", described the ozone hole phenomenon as the most important single event of the 20th century, even eclipsing Neil Armstrong's first steps on the moon, since it symbolised "the arrival of a new and ominous epoch when human activity began to disrupt the essential but invisible planetary systems that sustain a dynamic, living Earth." Even more telling, the environmental historian J.R. McNeill described Midgley himself as having "had more impact on the atmosphere than any other single organism in earth's history."
In the lower atmosphere, ozone (O3) is a noxious gas, since humans breathe it in directly; but in the stratosphere, 10-50 kilometres above the earth's surface, ozone plays a positive role, since it screens the earth from harmful ultraviolet radiation. Specifically, it cuts out virtually 100% of the highly energetic and harmful UVc radiation, around 90% of the slightly less energetic UVb, and 50% of UVa. If ozone were not present in the high atmosphere, ultraviolet radiation would, in effect, sterilise the earth, and life on land as we know it would cease to exist.
If you look at the chart above, you see the unit of measurement is the Dobson unit, one hundredth of a millimetre of pure ozone at zero degrees Celsius and standard pressure at the earth's surface. Normally, a column of air will contain around 300 Dobson units (3 millimetres) of pure ozone. By convention, when ozone is depleted below 220 Dobson units (2.2 millimetres) it is classed as an ozone 'hole', since no values below this level were recorded prior to 1979 (in other words, CFCs pushed ozone levels below the historical norm).
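The Dobson unit arithmetic above can be restated as a minimal Python sketch; the constants simply encode the definitions in the text, and the 150 DU example reading is hypothetical:

```python
# Dobson units: 1 DU corresponds to one hundredth of a millimetre of pure
# ozone at 0 degrees C and standard pressure, if the whole column were
# compressed to the earth's surface.

DU_TO_MM = 0.01          # 1 Dobson unit = 0.01 mm of pure ozone
HOLE_THRESHOLD_DU = 220  # readings below this are classed as an ozone 'hole'

def column_thickness_mm(dobson_units):
    """Thickness of the ozone column in millimetres of pure ozone."""
    return dobson_units * DU_TO_MM

def is_ozone_hole(dobson_units):
    """True when the column falls below the 220 DU convention."""
    return dobson_units < HOLE_THRESHOLD_DU

print(column_thickness_mm(300))  # a normal column: 3.0 mm
print(is_ozone_hole(150))        # a deep springtime reading: True
```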
The atmospheric physicists Herman and Newman from NASA have an interesting commentary on the Skin Cancer Foundation site (here) on what could have happened if the depletion of ozone had not been discovered and the Montreal Protocol of 1987 limiting CFC production had not come into effect.
What might have happened if we had done nothing about CFCs? In the 1970s, prior to discovery of the ozone problem, CFC production was increasing 7-10 percent per year. Using the same computer models that predict the future recovery, we estimated that CFC emissions would have increased by three percent per year after 1974. By 2060, the levels of stratospheric chlorine would have been 16 times above 1980 levels. Average global ozone levels would have decreased by two thirds. The UV index in the northern mid-latitudes would have increased to a value near 30 for midsummer noon conditions. The average mid-summer UV index value now is about 10 in these regions. Typically, it takes about 15 minutes for a fair-skinned person to develop perceptible sunburn in mid-summer. In this theoretical world (“world avoided”) it would have taken less than five minutes to develop a perceptible burn.
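The quoted growth rate can be sanity-checked with simple compound-interest arithmetic. The sketch below is only a back-of-envelope Python calculation under the assumption of a constant 3% annual growth in emissions from a notional base of 1.0 in 1974; it is not the NASA model, and stratospheric chlorine, which tracks the accumulated total of long-lived CFCs, would rise further still:

```python
# Back-of-envelope compound growth, NOT the NASA model: emissions assumed
# to grow 3% per year from a notional base of 1.0 in 1974.
GROWTH = 1.03

def emissions(year, base_year=1974, base_emission=1.0):
    """Hypothetical emissions index under constant 3% annual growth."""
    return base_emission * GROWTH ** (year - base_year)

# Emissions alone would have grown roughly 13-fold between 1974 and 2060;
# accumulated stratospheric chlorine grows faster than any single year's
# emissions, consistent with the 16-fold figure quoted above.
print(round(emissions(2060), 1))
```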
Luckily for humanity, the existence of the ozone hole was discovered at a relatively early stage. The British Antarctic Survey had been taking ozone measurements continuously since 1956, and in a seminal article published in the journal Nature in May 1985, three scientists working for the Survey announced that Antarctic ozone was in steep decline and named CFCs as the likely culprit. Jonathan Shanklin, one of the original authors of the Nature paper, explains the background to the discovery here.
The Nature paper came as a shock to the scientific community, because NASA had been monitoring ozone levels over the Antarctic with much more sophisticated measuring equipment since 1978, yet the satellite data had failed to flag any unusual changes. The reason exposes one limitation of technology: human error. In short, the computer program processing the data had been designed to discard any extreme readings from the analysis on the presumption that they were anomalies. When the data were subsequently re-examined, NASA's satellite was found to have accurately recorded the burgeoning ozone hole.
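The pitfall is easy to reproduce. The following is a deliberately simplified Python sketch, not NASA's actual processing code; the 180 DU threshold and the readings are invented for illustration. It shows how an automatic quality filter can erase the very signal it was meant to protect:

```python
# Illustrative only: an automatic quality filter that discards "impossible"
# readings can hide a real trend instead of cleaning the data.

def filter_readings(readings, lower=180):
    """Drop anything below `lower` DU on the assumption it is instrument error."""
    return [r for r in readings if r >= lower]

# Hypothetical springtime Antarctic ozone readings, in Dobson units:
raw = [310, 295, 280, 210, 170, 150, 140]

kept = filter_readings(raw)
print(kept)                  # [310, 295, 280, 210]: the deepest readings are gone
print(min(raw), min(kept))   # 140 210: the 'hole' vanishes from the record
```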
What would have happened if the rather quixotic data-gathering exercise of the British had been terminated once satellite measurements came on stream? Undoubtedly, the ozone hole would eventually have spread over South America, South Africa, Australia and New Zealand, and a mirror ozone hole would have developed over the Arctic. Of course, at some stage land-based Dobson spectrophotometers in more populous regions than Antarctica would have noted the change, the NASA satellite data-analysis error would have been discovered, and the Montreal Protocol would have been enacted (but a lot later than 1987). Worldwide levels of skin cancer would, however, have risen, especially in southern latitudes. Nonetheless, both the global economy and mankind would have survived with relatively limited harm.
There could, however, have been a far darker outcome. Paul Crutzen, who shared the 1995 Nobel Prize in Chemistry for his work on gases and the ozone layer, stressed in his acceptance speech (page 214) that things could have been a lot worse if refrigerants had been designed around bromine rather than chlorine:
This makes bromine on an atom to atom basis almost a hundred times more dangerous for ozone than chlorine. This brings up the nightmarish thought that if the chemical industry had developed organobromine compounds instead of the CFCs – or alternatively, if chlorine chemistry would have run more like that of bromine – then without any preparedness, we would have been faced with a catastrophic ozone hole everywhere and at all seasons during the 1970s, probably before the atmospheric chemists had developed the necessary knowledge to identify the problem and the appropriate techniques for the necessary critical measurements. Noting that nobody had given any thought to the atmospheric consequences of the release of Cl or Br before 1974, I can only conclude that mankind has been extremely lucky, that Cl activation can only occur under very special circumstances. This shows that we should always be on our guard for the potential consequences of the release of new products into the environment.
In short, if luck had not been in our favour, humanity could have come close to collapse due to the technological wonder of CFC equivalents.
Against this background, we should note that the analysis of Kurzweil-style techno-optimists fails to recognise that not only the benefits of technology but also its costs can on occasion grow exponentially. Further, the cost curve may not be smooth: costs could jump from one state to another that is far worse. Humanity did get lucky with the ozone hole, as Paul Crutzen states, but there is no logical reason to believe that this will always be the case. Technology can thus sometimes be a double-edged sword.