Category Archives: Climate Change

Links for the Week Ending 16 July 2014

I haven’t posted for quite a while. Basically, family commitments have eaten into my blogging time, and this state of affairs will likely continue for some time yet. Nonetheless, I will try to get some posts out as we grind through the last few innings of what I would term the ‘Great Hiatus’: a hiatus, or pause, amid the longer-term trend of rising global mean temperatures, higher oil prices, increasing resource constraints and greater global economic instability.

For example, with a 70-80% chance of an El Nino by year-end, temperature records have the potential to start falling again. Further, oil has built a solid base above $100 per barrel but appears poised to go higher in the next year or so as oil companies struggle to find new fields that can be developed at the right price.

At the same time, many of the financial fragilities in the system posed by ageing demographics, declining productivity and increasing resource constraints have to date been countered by the super-easy monetary policy pursued worldwide. The aggressive, unprecedented and unorthodox monetarism led by the Federal Reserve Board has been a policy triumph over the short term. Since the credit crunch of 2008/2009, the sky has not fallen down.

Yet the jury is still out as to whether the provision of free money can be maintained long enough to see a return to sustainable economic growth, or whether it will beget a new cycle of chronic instability through having fostered the extension of credit into intrinsically poor investments and a generalized asset price inflation that benefits few but the rich.

In the meantime, here are some links which I hope will flesh out some of the themes of this blog:

  • Occasionally, my left-leaning friends berate me for reading the right-of-centre Daily Telegraph. I offer two defences: first, you need to read opinion with which you may instinctively disagree, but find of some merit with a bit of reflection. Second, a good newspaper has intellectual mavericks—and The Telegraph has many (probably more than The Guardian). Here is an article by Ambrose Evans-Pritchard portraying the fossil fuel industry as poor capitalists; in short, the oil majors have been investing ever more, to reap ever less, while renewables are slowly sloughing off their subsidies. Joseph Schumpeter would be proud of this epic creative destruction.
  • And despite all the new technology we are bringing to bear on oil extraction, when fields go into decline it is damn tough fighting the tide. North Sea oil was a much-ignored saviour of the British economy in the 1980s, but its decline is inexorable and, according to the Office for Budget Responsibility (OBR), accelerating. The Financial Times has the story here (access to FT articles after free registration), but if you want to go to the primary OBR source you can find it here.
  • We are still seeing a lot of commentary over “Capital in the Twenty-First Century” by Thomas Piketty. Piketty argues that the relative reduction in inequality in advanced countries over the post-war period was something of an aberration. According to his analysis, without direct political intervention (or, in the most extreme case, revolution), capital will gradually accrue to a relative few. In short, when the return on capital is greater than the growth rate, it is the owners of capital who prosper most, not those in capital’s employ. For a fuller treatment, I recommend Cory Doctorow’s summary here, and an interview with Piketty by Matthew Yglesias of Vox a while back here.
  • You can also slice growing inequality in different ways. The Institute for Fiscal Studies (IFS) in the UK has just issued a report detailing how the real incomes of young people are falling much faster than those of any other age cohort (here). Meanwhile, I have often commented on how London has detached itself from the rest of the UK. In the US, Emily Badger of The Washington Post’s Wonk Blog charts a similar divergence between cities showing a virtuous cycle of education and growth and those showing a vicious cycle of poor education and decline (here).
  • Climate sceptics love to start any global mean temperature chart with a data point centred on 1997/98, which happens to coincide with the largest El Nino for a century. This monster El Nino ushered in the record-breaking hot year of 1998 (slightly eclipsed in later years depending on which data set you look at, but still one of the hottest years on record: see NASA’s data set here). Global mean temperature is a construct of short-term weather volatility, long-term greenhouse gas-induced temperature rise and the medium-term ENSO cycle. Eventually, CO2 will do its stuff and records will fall regardless of whether we have an El Nino. But for us to quickly retire all the talk of a hiatus in temperature rise will require a new and powerful El Nino. True, an El Nino appears on the cards by year-end, but quite how strong it will be is still clouded in uncertainty, as this post at Skeptical Science explains here.
  • If you visit London, take time to visit some of the quirky, smaller museums. One of the most intriguing (and downright disturbing) is the Old Operating Theatre that used to be part of St Thomas’ Hospital just south of the Thames. This is no Disneyland reconstruction, but a perfectly preserved part of pre-antiseptic medical history. Despite appearing to be a set from a particularly dark Harry Potter scene, the Old Operating Theatre shows how and where surgeons removed a damaged limb in around two minutes flat, with minimal anaesthetic. The museum demonstrates how far we have come health-wise in an historical blink of an eye (150 years or so). And for those who would welcome an economic collapse as a route toward a more authentic form of living, I direct you to a post at Club Orlov explaining a world of post-collapse, or village, medicine. Humanity is put right back on St Thomas’ operating table. Pray for four strong men to hold you down—and a surgeon who has not only washed his hands, but is also quick with blade and saw.

Data Watch: UAH Global Mean Temperature April 2014 Release

On May 6th, Dr Roy Spencer released the University of Alabama-Huntsville (UAH) global average lower tropospheric temperature anomaly as measured by satellite for April 2014.

The anomaly refers to the difference between the current temperature reading and the average reading for the period 1981 to 2010 as per satellite measurements.
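For the technically minded, the calculation behind that sentence is simple. Below is a minimal sketch in Python; the readings are invented placeholders, not actual UAH data or UAH’s real processing chain.

```python
# Minimal sketch of an anomaly calculation. All readings are invented
# placeholders, not actual UAH data.

# Hypothetical April lower-troposphere readings (degrees Celsius) over the
# 1981-2010 baseline; a real baseline would hold one value per year.
baseline_readings = [-16.42, -16.38, -16.51, -16.30, -16.45, -16.35]

baseline_mean = sum(baseline_readings) / len(baseline_readings)

april_2014_reading = -16.21  # hypothetical current reading
anomaly = april_2014_reading - baseline_mean
print(f"April 2014 anomaly: {anomaly:+.2f} degrees Celsius")
```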

April 2014: Anomaly +0.19 degrees Celsius

This is the 6th warmest April temperature recorded since the satellite record was started in December 1978 (35 April observations). The warmest April to date over this period was in 1998, with an anomaly of +0.66 degrees Celsius. Incidentally, April 1998 was also the warmest month ever recorded for this time series.

The El Nino Southern Oscillation (ENSO) cycle is the main determinant of when global mean temperature hits a new record over the medium term (up to 30 years). In this connection, the U.S. government’s Climate Prediction Center is now giving a 65% chance of an El Nino developing this summer or fall (here). Should this happen, I would expect the UAH anomalies to head back up into the 0.5s, 0.6s or higher.

As background, five major global temperature time series are collated: three land-based and two satellite-based. The terrestrial readings are from NASA GISS (Goddard Institute for Space Studies), HadCRU (Hadley Centre/Climatic Research Unit in the U.K.), and NCDC (National Climatic Data Center). The lower-troposphere temperature satellite readings are from RSS (Remote Sensing Systems, data not released to the general public) and UAH (Univ. of Alabama at Huntsville).

The most high profile satellite-based series is put together by UAH and covers the period from December 1978 to the present. Like all these time series, the data is presented as an anomaly (difference) from the average, with the average in this case taken over the 30-year period from 1981 to 2010. UAH data is the earliest to be released each month.

The official link to the data at UAH can be found here, but most months we get a sneak preview of the release via the climatologist Dr Roy Spencer at his blog.

Spencer and his colleague John Christy at UAH are noted climate skeptics. They are also highly qualified climate scientists, who believe that natural climate variability accounts for most of recent warming. If they are correct, then we should see some flattening, or even reversal, of the upward trend within the UAH temperature time series over a long time period. To date, we haven’t (click for larger image).

[Figure: UAH global lower-troposphere temperature anomaly series, April 2014]

That said, we also haven’t seen an exponential increase in temperature, which would be required for us to reach the more pessimistic temperature projections for the end of the century. However, the data series is currently too short to rule out such rises in the future. The Economist magazine published a very succinct summary of the main factors likely accounting for the recent hiatus in temperature rise (here).

One of the initial reasons for publicising this satellite-based data series was concern over the accuracy of terrestrial-based measurements (worries over the urban heat island effect and other factors). The satellite data series have now been going long enough to compare their output directly with the surface-based measurements. All the time series are now accepted as telling the same story (for a fuller mathematical treatment, see Tamino’s post at the Open Mind blog here).

Note that the anomalies produced by different organisations are not directly comparable since they have different base periods. Accordingly, to compare them directly, you need to normalise each one by adjusting them to a common base period.
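To see what that normalisation involves in practice, here is a minimal sketch. The numbers are invented, and the series names are purely illustrative stand-ins for, say, UAH and GISS, which use different native base periods.

```python
# Illustrative only: re-expressing two anomaly series on a common base period.

def rebaseline(series, base_years):
    """Return the series as anomalies from its mean over base_years."""
    base_mean = sum(series[year] for year in base_years) / len(base_years)
    return {year: value - base_mean for year, value in series.items()}

uah_like  = {2010: 0.33, 2011: 0.13, 2012: 0.17, 2013: 0.24}  # native base 1981-2010
giss_like = {2010: 0.72, 2011: 0.60, 2012: 0.64, 2013: 0.65}  # native base 1951-1980

common_base = range(2010, 2014)
a = rebaseline(uah_like, common_base)
b = rebaseline(giss_like, common_base)

# After rebaselining, the two series can be compared directly.
for year in common_base:
    print(year, f"{a[year]:+.2f}", f"{b[year]:+.2f}")
```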

Data Watch: UAH Global Mean Temperature March 2014 Release

On April 7th, Dr Roy Spencer released the University of Alabama-Huntsville (UAH) global average lower tropospheric temperature anomaly as measured by satellite for March 2014.

The anomaly refers to the difference between the current temperature reading and the average reading for the period 1981 to 2010 as per satellite measurements.

March 2014: Anomaly +0.17 degrees Celsius

This is the joint 7th warmest March temperature recorded since the satellite record was started in December 1978 (35 March observations). The warmest March to date over this period was in 2010, with an anomaly of +0.57 degrees Celsius.

The El Nino Southern Oscillation (ENSO) cycle is the main determinant of when global mean temperature hits a new record over the medium term (up to 30 years). In this connection, the U.S. government’s Climate Prediction Center is now giving a 50% chance of an El Nino developing this summer or fall (here). Should this happen, I would expect the UAH anomalies to head back up into the 0.5s, 0.6s or higher. The next update is on the 10th of April.

As background, five major global temperature time series are collated: three land-based and two satellite-based. The terrestrial readings are from NASA GISS (Goddard Institute for Space Studies), HadCRU (Hadley Centre/Climatic Research Unit in the U.K.), and NCDC (National Climatic Data Center). The lower-troposphere temperature satellite readings are from RSS (Remote Sensing Systems, data not released to the general public) and UAH (Univ. of Alabama at Huntsville).

The most high profile satellite-based series is put together by UAH and covers the period from December 1978 to the present. Like all these time series, the data is presented as an anomaly (difference) from the average, with the average in this case taken over the 30-year period from 1981 to 2010. UAH data is the earliest to be released each month.

The official link to the data at UAH can be found here, but most months we get a sneak preview of the release via the climatologist Dr Roy Spencer at his blog.

Spencer and his colleague John Christy at UAH are noted climate skeptics. They are also highly qualified climate scientists, who believe that natural climate variability accounts for most of recent warming. If they are correct, then we should see some flattening, or even reversal, of the upward trend within the UAH temperature time series over a long time period. To date, we haven’t (click for larger image).

[Figure: UAH global lower-troposphere temperature anomaly series, March 2014]

That said, we also haven’t seen an exponential increase in temperature, which would be required for us to reach the more pessimistic temperature projections for the end of the century. However, the data series is currently too short to rule out such rises in the future. Surprisingly, The Economist magazine has just published a very succinct summary of the main factors likely accounting for the recent hiatus in temperature rise (here).

One of the initial reasons for publicising this satellite-based data series was concern over the accuracy of terrestrial-based measurements (worries over the urban heat island effect and other factors). The satellite data series have now been going long enough to compare their output directly with the surface-based measurements. All the time series are now accepted as telling the same story (for a fuller mathematical treatment, see Tamino’s post at the Open Mind blog here).

Note that the anomalies produced by different organisations are not directly comparable since they have different base periods. Accordingly, to compare them directly, you need to normalise each one by adjusting them to a common base period.

Links for the Week Ending 6 April 2014

  • The second instalment of The Intergovernmental Panel on Climate Change’s (IPCC) Fifth Assessment Report (AR5), titled “Impacts, Adaptation and Vulnerability”, was released in Tokyo on the 31st March and can be found here. The “Summary for Policymakers” can be downloaded here. On page 19 of the Summary, the IPCC states that “the incomplete estimates of global annual economic losses for additional temperature increases of around 2 degrees Celsius are between 0.2 and 2.0% of income (± one standard deviation around the mean)”, with the risk skewed toward higher rather than lower losses. The report then goes on to say: “Losses accelerate with greater warming, but few quantitative estimates have been completed for additional warming around 3 degrees Celsius or above”. Given that it looks almost impossible that we will constrain warming to 2 degrees Celsius on the current CO2 emissions path and with the installed fossil fuel energy infrastructure base, the world really is entering unknown territory of climate risk.
  • A key area of economic loss from climate change relates to drought. To date, most models have focussed on precipitation as the principal driver of drought. A new paper by Cook et al. in the journal Climate Dynamics titled “Global Warming and Drought in the 21st Century” gives greater emphasis to the role of evaporation (more technically, potential evapotranspiration, or PET) in drought. Through better modelling of PET, the paper sees 43% of the global land area experiencing significant dryness by the end of the 21st century, up from 23% for models that principally looked at precipitation alone. A non-technical summary of the paper can be found here.
  • Meanwhile, the general public has lapsed back into apathy around the whole climate change question, partially due to the hiatus period in temperature rise we are currently experiencing. However, evidence is slowly mounting that we could be about to pop out of the hiatus on the back of a strong El Nino event (periods of high global temperature are linked to El Ninos). Weather Underground has been doing a good job of tracking this developing story, with another guest post from Dr. Michael Ventrice (here) explaining the major changes in the Pacific Ocean that have taken place over the last two months and which are setting us up for an El Nino event later in the spring or summer.
  • Changing subject, The Economist magazine ran a special report last week on robotics titled “Immigrants from the Future”. In some ways, I came away less, rather than more, impressed by the capabilities of the existing generation of robots.
  • I often blog on happiness issues (most recently here). This may seem strange for a blog whose stated focus is on such global risks as resource depletion and climate change, but I don’t see the contradiction. For me, much of our striving to extract and burn as much fossil fuel as possible comes through the pursuit of goals that don’t necessarily make us happier. A new book by Zachary Karabell titled “The Leading Indicators” adds a new dimension to this argument. Karabell argues that over the last century or so we have created a series of statistics that are more than pure measurements of economic success; in short, they are ideology-laden rather than ideology-free. Political parties set out their manifestos based on a mishmash of economic achievements and goals based on GDP, unemployment, inflation, the trade balance, interest rates, the strength of their national currency and so on and so forth. But these numbers encapsulate only part of well-being. Yet such statistics totally dominate political discourse because that is how we have been taught to keep score in a modern capitalist economy. As we career towards extremely dangerous climate change, I think it is time that we recognise these economic indicators for what they frequently have become: false gods. Karabell has an article in The Atlantic setting out the book’s main ideas here and there is a good review in The Week here.
  • Rising inequality has been one of the major economic developments of the past 40 years. I am a great fan of the World Bank economist Branko Milanovic, who wrote a wonderful book called “The Haves and Have-Nots: A Brief and Idiosyncratic History of Global Inequality”, in which he pulls together many strands of the inequality literature within a global context. I blogged on this once here. A nice complement to this book is the new web site titled Chartbook of Economic Inequality, which has been put together by two academic economists, Anthony Atkinson and Salvatore Morelli. If you like infographics, you will love this site.

Links for the Week Ending 9 March 2014

Apologies for the late posting of this week’s links. It has been a crazy week.

  • For those of a non-business background, any reference to The Economist magazine with respect to climate change may appear strange. Who cares what The Economist writes on the subject? I would beg to disagree. Few, if any, senior business executives will read posts on Real Climate or Skeptical Science, let alone academic articles on the subject. For English speakers, most climate change commentary will come out of the pages (much of which will, of course, be online these days) of The Wall Street Journal, The Financial Times, other serious non-financial dailies like The New York Times in the U.S. and The Telegraph in the U.K., a motley collection of weeklies like Forbes, and, of course, The Economist. And The Economist is rather special in terms of its reach into boardrooms across the globe (and, for that matter, cabinet offices). For example, Playboy Magazine once asked Bill Gates what he reads. The answer: “The Economist, every page”. A year ago, The Economist wrote an extended article on the global warming ‘hiatus’ that, I thought, gave too much weight to a few studies suggesting that climate sensitivity was far lower than previously thought (here, free registration). This week, however, the magazine made amends by publishing an excellent piece titled “Who pressed the pause button?” on the so-called ‘hiatus’ in temperature rise. It ended with this statement: “Most of the circumstances that have put the planet’s temperature rise on “pause” look temporary. Like the Terminator, global warming will be back.”
  • Talking of ‘The Terminator’, The Guardian carries an interview with the Crown Prince of techno-optimists and Google geek in chief Ray Kurzweil. God help us if anyone actually believes this stuff.
  • Up the road from me in Oxford is the NGO Climate Outreach and Information Network (COIN). Its founder George Marshall has an interesting blog that looks at the narratives surrounding climate change. In a post called “How the Climate Change Messengers Became Blamed for the Floods” he deconstructs the media’s reaction to the recent U.K. floods. It’s somewhat depressing stuff.
  • One of the sharpest observers of the shale hype has been the petroleum geologist Art Berman. He has a site called The Petroleum Truth Report, but, frustratingly, doesn’t keep it current. Fortunately, he has just given a new interview with Oilprice.com updating us on his recent thinking. The interview is full of gems such as this: “Oil companies have to make a big deal about shale plays because that is all that is left in the world. Let’s face it: these are truly awful reservoir rocks and that is why we waited until all more attractive opportunities were exhausted before developing them. It is completely unreasonable to expect better performance from bad reservoirs than from better reservoirs.” I highly recommend you read the whole thing.
  • The economist Noah Smith writes a lively blog called Noahpinion. In this post he makes some keen observations on the ‘jobs and robots’ debate, while in this article in The Week he compares America’s decline with the collapse of the Ming Dynasty.

Data Watch: UAH Global Mean Temperature February 2014 Release

On March 5th, Dr Roy Spencer released the University of Alabama-Huntsville (UAH) global average lower tropospheric temperature anomaly as measured by satellite for February 2014.

The anomaly refers to the difference between the current temperature reading and the average reading for the period 1981 to 2010 as per satellite measurements.

February 2014: Anomaly +0.17 degrees Celsius

This is the 10th warmest February temperature recorded since the satellite record was started in December 1978 (35 February observations). The warmest February to date over this period was in 1998, with an anomaly of +0.65 degrees Celsius, due to the super El Nino that year.

The El Nino Southern Oscillation (ENSO) cycle is the main determinant of when global mean temperature hits new records over the medium term (up to 30 years). In this connection, the U.S. government’s Climate Prediction Center is now giving a 50% chance of an El Nino developing this summer or fall (here). Should this happen, I would expect the UAH anomalies to head back up into the 0.5s, 0.6s or higher.

As background, five major global temperature time series are collated: three land-based and two satellite-based. The terrestrial readings are from NASA GISS (Goddard Institute for Space Studies), HadCRU (Hadley Centre/Climatic Research Unit in the U.K.), and NCDC (National Climatic Data Center). The lower-troposphere temperature satellite readings are from RSS (Remote Sensing Systems, data not released to the general public) and UAH (Univ. of Alabama at Huntsville).

The most high profile satellite-based series is put together by UAH and covers the period from December 1978 to the present. Like all these time series, the data is presented as an anomaly (difference) from the average, with the average in this case taken over the 30-year period from 1981 to 2010. UAH data is the earliest to be released each month.

The official link to the data at UAH can be found here, but most months we get a sneak preview of the release via the climatologist Dr Roy Spencer at his blog.

Spencer and his colleague John Christy at UAH are noted climate skeptics. They are also highly qualified climate scientists, who believe that natural climate variability accounts for most of recent warming. If they are correct, then we should see some flattening, or even reversal, of the upward trend within the UAH temperature time series over a long time period. To date, we haven’t (click for larger image).

[Figure: UAH global lower-troposphere temperature anomaly series, February 2014]

That said, we also haven’t seen an exponential increase in temperature, which would be required for us to reach the more pessimistic temperature projections for the end of the century. However, the data series is currently too short to rule out such rises in the future. Surprisingly, The Economist magazine has just published a very succinct summary of the main factors likely accounting for the recent hiatus in temperature rise.

One of the initial reasons for publicising this satellite-based data series was concern over the accuracy of terrestrial-based measurements (worries over the urban heat island effect and other factors). The satellite data series have now been going long enough to compare their output directly with the surface-based measurements. All the time series are now accepted as telling the same story (for a fuller mathematical treatment, see Tamino’s post at the Open Mind blog here).

Note that the anomalies produced by different organisations are not directly comparable since they have different base periods. Accordingly, to compare them directly, you need to normalise each one by adjusting them to a common base period.

Links for the Week Ending 2 March 2014

  • Martin Wolf has been revisiting the robots and jobs topic over the past few weeks in a couple of articles in The Financial Times here and here (free access after registration). This is a theme I have been addressing a lot recently in a series of posts starting here. Wolf finishes his last article with the observation that technology does not always have to shape institutions; it should be the other way around: “A form of techno-feudalism is unnecessary. Above all, technology itself does not dictate the outcomes. Economic and political institutions do. If the ones we have do not give the results we want, we must change them.” I agree, but this will not be easy.
  • I have also just discovered a fascinating blog that pulls together articles on the new robot economy called RobotEnomics (sic). For example, check out this post on the economic implications of driverless cars.
  • California has experienced significant rainfall over the last few days. The latest Drought Monitor (released weekly) doesn’t capture this rainfall, so we should see some slight improvement when the next update comes out. Critically, though, California’s water bank—its high mountain snowpack—is still running at around 20% of average. You can see the end-month figures as measured by the Department of Water Resources here and an article giving background to the snowpack here. Mother Jones has some nice graphics on the crops being hurt by the drought here, while The Atlantic has a very interesting (and very long) article on the history and future of California’s massive water engineering projects here.
  • Here I go again: linking to the March 1998 Campbell and Laherrere article titled “The End of Cheap Oil” in Scientific American. The authors ended the article with this sentence: “The world is not running out of oil—at least not yet. What our society does face, and soon, is the end of the abundant and cheap oil on which all industrial nations depend.” Average price of Brent crude in 1998: $13.20 per barrel, equivalent nowadays to around $19 after adjusting for inflation. Brent now: $109 per barrel. But isn’t fracking going to give us an endless supply of cheap oil? Here is an article in Bloomberg titled “Dream of Oil Independence Slams Against Shale Costs”. In other words, Campbell and Laherrere continue to be proved right and the energy cornucopians continue to be proved very wrong.
  • For technological optimists the dream is for a transformational technology that can permanently alter the energy supply equation. Fusion has always been one such hope, but forever decades away from commercial development. The New Yorker has just published a superb article called “A Star in a Bottle” on the International Experimental Thermonuclear Reactor (ITER) being built in France. The audacity and scope of the project is extraordinary. Yet my takeaway from the article is that fusion provides little hope of providing a timely saviour with respect to either climate change or fossil fuel depletion.

U.K. Floods: If Only They Had Dredged?

Over the last couple of months, we have seen a pack of welly-booted U.K. journalists from News at Ten, Newsnight and Channel 4 News interviewing a series of rather depressed and bedraggled flood victims in Somerset and the Thames Valley. A good proportion of these have been parroting The Daily Mail’s claim that it is all the Environment Agency’s fault through a lack of dredging.

Is it true?

Let’s look at an Environment Agency presentation called “River Dredging and Flood Defence: To Dredge or Not to Dredge“. To start with, we have a river channel and an adjacent floodplain.

[Figure: Environment Agency schematic of a river channel and its adjacent floodplain]

When we have extreme precipitation, the channel overflows into the floodplain.

[Figure: Environment Agency schematic of a channel overflowing onto the floodplain]

The critical question for the Environment Agency relates to the relationship between the flow of water that can be accommodated in the channel to the flow of water that can be contained within the adjacent floodplain during an extreme weather event.

[Figure: Environment Agency schematic of changes to the river channel]

And in the case of the January floods, we do have data suggesting that river flow has been setting records and so could not have been accommodated in most river channels unless extraordinarily aggressive and expensive dredging had taken place. The chart below is from the Centre for Ecology and Hydrology’s January 2014 Hydrological Summary (click for larger image):

[Figure: January 2014 river flows, from the Centre for Ecology and Hydrology Hydrological Summary]

The black circles indicate exceptionally high flow, and the arrows a new record. So the Thames, for example, recorded a flow 263% above the long-term average in January (you may need to follow the link to the monthly report to see this clearly).

The Environment Agency then goes on to point out that if you want to keep water off the floodplain then you would need to both deepen and widen river channels and repeat dredging at regular intervals to prevent silting. This all costs money, but it can be done.

In my former home of Japan, flood defence has to account for deluges following typhoons. Accordingly, the system must cope with irregular floods that are many multiples of the long-term average river flow. As a result, the country is covered in massive concreted river channels through which just a trickle of water flows for many years—until a big typhoon hits. It is not pretty flood defence. You could do this to the Thames, but it certainly would no longer look like the Thames of Jerome K. Jerome or Kenneth Grahame by the time such work had finished.

Moreover, you can’t modify the river channel at one point and not at others. If you did, water flow would hit pinch points, which are frequently structures like bridges.

[Figure: Environment Agency schematic of pinch points such as bridges]

The agency also directly contradicts an assertion made by Christopher Booker of The Daily Telegraph. Booker claimed that if only dredgers were allowed to dump silt on the river banks, costs could be kept down. The dreaded EU is supposed to prevent such a common-sense action for environmental reasons. As with all Booker’s writings, anything that doesn’t fit the story is, of course, left out; in particular, the common-sense result of dumping silt on river banks is to raise the floodplain and reduce the volume of water that can be held therein.

[Figure: Environment Agency schematic of dredged silt raising the floodplain]

Against this background, let’s return to The Daily Mail’s accusation that non-dredging “was what did it”. Below is an example of the reportage. As an aside, almost all the U.K. climate skeptic web sites have been carrying these photos as well. The Daily Telegraph also ran with them in an article here. They specifically relate to the Somerset Levels.

[Figure: The Daily Mail’s ‘before and after’ dredging photos of the Somerset Levels]

What I don’t like about these ‘before and after’ photos is that they are taken from different angles. In the second photo, we can’t see the span of the river channel since the left bank blocks our view. However, there does appear to be some build-up on the right bank, as evidenced by the circular hole being partially obscured. Nonetheless, we can’t calculate the change in flow capacity from these ‘before and after’ pictures alone.

We also don’t know exactly when the original photo was taken: only in the ‘1960s’. We do know, however, that the Somerset Levels had a massive flood in 1960. You can see this in a County Gazette 50th anniversary article here, with a series of photos here. There were also, incidentally, major floods in the area in 1951, and these led to comprehensive flood defence works—which obviously weren’t sufficient to cope with the rainfall in 1960.

The Daily Telegraph story with the ‘before and after’ photos also includes this assertion from a local pro-dredging group, Flooding on the Somerset Levels Action Group (FLAG):

A spokesman for the FLAG group has got hold of meticulous rainfall records for the area around the Parrett and Tone for the last 20 years.

They reveal between December 1993 and February 1994 around 20 inches of rain fell – five inches less than during the same time this year.

A spokesman for the group said: “So roughly the same rainfall but far more flooding now.

“What has changed? Dredging seems to be the biggest obvious difference between then and now.”

Based on FLAG’s figures, 20 inches fell in 1993/94 against 25 inches on this occasion. This is not really “roughly the same rainfall”; 2013/14 is 25% higher than 1993/94 on their numbers. Given that we experience flooding when a river channel reaches capacity, this 25% by itself looks significant.

Moreover, we don’t have access to FLAG’s data, but we do have access to rainfall records via the Met Office for the nearby Yeovilton weather station. Yeovilton is approximately 18 miles from Burrowbridge, which is in turn in the heart of the Parrett/Tone area and the site of the bridge photographed above. Yeovilton is also 5 miles north of Yeovil in the Environment Agency map below (although not marked) and firmly inside the Parrett Catchment Flood Management Plan (CFMP).

[Figure: Environment Agency map of the Parrett Catchment Flood Management Plan area]

In December 1993, rainfall was 117.9 mm at Yeovilton, followed by 99.8 mm in January 1994 and 81.1 mm in February 1994. That comes out at 298.8 mm in total (29.9 cm, or about 11.8 inches), although I admit it is very difficult to tell whether we are comparing apples and pears here. We don’t yet have Yeovilton data for the current flooding episode (it is released with a lag), but when it comes out we will have a direct comparison of 1993/94 and 2013/14 rainfall.
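The arithmetic is easy to check in a few lines; the monthly figures below are just the Met Office Yeovilton readings quoted above.

```python
# Sum the December 1993 to February 1994 rainfall at Yeovilton and convert units.
monthly_mm = [117.9, 99.8, 81.1]  # Dec 1993, Jan 1994, Feb 1994

total_mm = sum(monthly_mm)      # 298.8 mm
total_cm = total_mm / 10.0      # 29.9 cm
total_in = total_mm / 25.4      # about 11.8 inches
print(f"{total_mm:.1f} mm = {total_cm:.1f} cm = {total_in:.1f} inches")
```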

Interestingly, while the Somerset Levels have been badly affected by the current floods, centres of population that are deemed at risk have mostly come through unscathed. A June 2013 report on the Parrett CFMP pinpointed the larger towns susceptible to flooding, none of which have been hit hard this time around:

[Figure: towns in the Parrett CFMP with a one per cent annual chance of flooding]

So Environment Agency flood defence work to date can be seen as quite successful in terms of protecting built-up areas in the Parrett CFMP. It has been less successful in protecting isolated villages and rural areas, but to a large degree this was due to choice rather than neglect.

At this point, I will stress that the Environment Agency’s own work does show that dredging would have a substantial impact on reducing the number of days villages and agricultural land remain under water. The charts below have been going around all the usual climate skeptic blogs and are taken from an Environment Agency hydraulic modelling study that can be found here:

[Figure: Environment Agency hydraulic modelling of the Parrett drainage, before and after dredging]

The charts show a far quicker recovery from flooding events after theoretical dredging, although in the case of Curry Moor the peak flood level is very similar before and after the dredging.

So is that game, set and match to the pro-dredgers? Not quite. For a start, we should always keep in mind that the Somerset Levels are unique in terms of their hydrological challenges and don’t provide any real lessons for major rivers such as the Thames. But far more important is the fact that the same climate skeptic blogs that printed the above chart rarely print this one (source here):

[Figure: benefit and cost estimates for Parrett and Tone dredging under different assumptions]

The above charts show benefits and costs under different assumptions. So the choice to dredge could have been made, but the benefit-to-cost ratio (BCR) would be somewhere around one to one.

Is such a project in line with government guidelines? In May 2011, a new approach to the funding of flood and coastal erosion risk projects was unveiled by the Department for Environment, Food and Rural Affairs (DEFRA). You can find details here. The methodology underpinning all government benefit-to-cost ratio calculations can be found in the Green Book, the Treasury’s bible for expenditure decisions.

An evaluation of DEFRA’s flood defence project decision-making was undertaken by the hydrology specialist JBA Consulting in March 2012 (here); it looked at a series of case studies to see which ones would have received government grants. You can see these here (click for larger image):

[Figure: JBA Consulting case studies, benefit-to-cost ratios and partnership funding scores]

The media frequently reports a benefit-to-cost ratio of eight as the hurdle rate for flood projects to go ahead, but, as the table above shows, the methodology is more complex than that. Nonetheless, only three of the six projects would have been eligible for flood defence grant in aid—the ones with partnership funding scores above 100%. These three had benefit-to-cost ratios of 7.23, 8.14 and 4.68, respectively. The Parrett and Tone dredging scheme can’t get anywhere close to these numbers.
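As a toy sketch of that screening logic: the three benefit-to-cost ratios below are the eligible case studies just mentioned, but the partnership funding scores and the simple pass/fail filter are invented for illustration (DEFRA’s actual scoring formula, in the guidance linked earlier, is more involved).

```python
# Toy screening filter: a scheme is treated as eligible for flood defence
# grant in aid only if its partnership funding score exceeds 100%.
# The pf_score values are invented, not JBA Consulting's case study numbers.

schemes = [
    {"name": "Case study A", "bcr": 7.23, "pf_score": 1.12},
    {"name": "Case study B", "bcr": 8.14, "pf_score": 1.05},
    {"name": "Case study C", "bcr": 4.68, "pf_score": 1.20},
    {"name": "Parrett/Tone dredging (approx.)", "bcr": 1.0, "pf_score": 0.40},
]

for s in schemes:
    verdict = "eligible" if s["pf_score"] > 1.0 else "not eligible"
    print(f'{s["name"]}: BCR {s["bcr"]:.2f}, {verdict} for grant in aid')
```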

In short, David Cameron, who has just announced the recommencement of dredging on the Somerset Levels, has thrown Treasury benefit-to-cost ratio guidance out of the window due to the political fallout from the floods. I doubt, however, that this ad hoc response will translate into the longer-term promotion of flood defence projects with low BCRs. Why should flood defences be treated differently from cancer wards, kindergarten places, pedestrian crossings or pollution control? And given that the Treasury has a hard budget constraint, my belief is that they won’t be once the political hullabaloo has subsided.

Meanwhile, climate change has barely begun to change the risk equation. The Environment Agency has this to say about the Parrett catchment in a time of climate change:

[Figure: Environment Agency climate change assumptions for the Parrett catchment]

The 500 mm sea level rise figure for 2100 is possible, but the scientific consensus is now for an upper estimate of possible sea level rise of one metre or more by century end. Nonetheless, even using the agency’s current, rather conservative assumptions, flood risk for towns in the Parrett catchment jumps significantly:

[Figure: current and future flood risk for towns in the Parrett catchment]

My own personal opinion is this: since climate change is radically altering flood risk, individuals should proactively protect themselves. Expecting the government always to bail you out through dredging, insurance or whatever is, in itself, a high-risk strategy. There is a limit to what the government can do—or afford to do. And when the government reaches such a limit, you are on your own.

Links for the Week Ending 23 February 2014

  • The so-called hiatus period of flat-lining global mean temperatures has certainly been a godsend for the climate skeptic lobby. A lot of this recent change in temperature trend is due to the ENSO cycle: El Nino years are generally hot years, and we haven’t recently had many strong El Ninos. You can see this effect in the NASA chart here. So the next time we get a strong El Nino year, expect to see a new global mean temperature record. When will we see the next one? This intriguing guest post on Jeff Masters’ Wunderblog suggests we may be due for a big El Nino in 2014. If true, expect to see the ‘hiatus period’ disappear from the climate skeptic lexicon.
  • By coincidence, I saw the NASA chart above reproduced in a blog post on Econbrowser by Menzie Chinn. Hamilton and Chinn, who co-author the blog, are two of the most respected economists in the world. Hamilton wrote one of the standard time series texts that a generation of econometricians grew up on. Chinn’s post is titled “Economic Implications of Anthropogenic Climate Change and Extreme Weather“. He takes aim at those who think we can easily adapt to climate change, pointing out that not only will the trend change but also volatility. All of this will cost a lot of money.
  • The global media has picked up on the Californian drought to a certain extent. If you want to track it yourself, click the U.S. Drought Monitor page here. There has been far less coverage, however, of the Brazilian drought; here is a rare piece of coverage by National Public Radio. And what is happening in Brazil is already having an effect on food prices as witnessed by the skyrocketing price of coffee; see here.
  • I have frequently commented that despite rising resource constraints and a productivity slowdown, global GDP growth has motored on at around 3% per annum regardless. This is mostly because China has acted as a growth locomotive for everyone else, offsetting anaemic growth in the U.S., Europe and Japan. So if China’s growth collapses, this will likely mean that global growth takes a step-change downward (the other BRICs and MINTs have their own problems). Having seen Japan’s experience first hand (one day growth, the next day no growth), I have been a huge skeptic of China’s economic model. But to date, the sky has not fallen down. The BBC’s economics correspondent Robert Peston has just produced a short documentary called “How China Fooled the World” that sets out the pessimist’s case and can be found on iPlayer. If you have a problem accessing BBC content, try this link at YouTube here.
  • Most web-based technology favours scale: it facilitates ‘winner takes all’ economics. Think Google and Facebook. Yet it also reduces the cost of information and, potentially, small production runs. This, in turn, favours the so called ‘long tail’. This strange dance between the centrifugal and centripetal forces of information technology is a source of both fragility and resilience as we face resource and climate change challenges. For a slightly different riff on the same theme see this article by the economist Robert Frank in The New York Times.

UK Floods: Don’t Say We Weren’t Warned (Part 2)

In my last post, I looked at the recent history of climate change policy with respect to flood risk in the U.K.; in particular, a series of benchmark reports that starkly set out the flood risk choices the government and the British people would have to make.

The realisation that climate change had transformed the flood risk game, together with the fallout from the 2007 floods as documented in The Pitt Review, led the Environment Agency to establish a long-term strategy (2010 to 2035), which was set out in a 2009 report entitled “Investing for the Future: Flood and Coastal Risk Management“.

The report identified 5.2 million households at risk of flooding: fluvial (river and coastal), pluvial (surface water and groundwater), or both.

[Figure: properties at risk of flooding, Environment Agency national assessment]

And of the 2.4 million fluvial category, a more detailed breakdown in terms of expected frequency of flood was given.

[Figure: breakdown of fluvial flood risk by expected flood frequency]

Moreover, here are the critical regions for such fluvial flooding:

[Figure: properties at flood risk by region]

The media still reports the 5.2 million ‘at risk’ number of properties, despite climate science having moved on since 2009. Indeed, the above figures are all from the Environment Agency’s “Flooding in England: A National Assessment of Flood Risk” which came out in 2009 but is based on data for 2008. In my previous series of posts on flood risk, I pointed out that the current risk categories do not adequately take into account climate change or non-fluvial flooding (i.e., non-river and coastal flooding). Accordingly, all these numbers now look to be significant underestimates.

A few leaks have emerged with respect to the 2013 National Assessment (for example, in The Guardian here), but not enough to get a good impression of how the Environment Agency has moved the numbers around.

More controversially, the 2009 “Investing for the Future” report also gives us some cost-benefit estimates out to 2035. We start with 1.2 million houses at significant or moderate risk and then see how many still fall into these categories 25 years later. The base is £800 million per year divided between £570 million in tangible investment (building new flood defences and maintaining existing ones) and £230 million in intangible investment (mapping, analysis, warning systems and so on). Note that this was a forecast expenditure for 2010/11, since the report was written in 2009.

[Figure: flood and coastal risk management expenditure, 2010-11]

From this future base period, five different expenditure scenarios were plotted, ranging from conservative to aggressive.

[Figure: the five flood defence investment scenarios, 2010 to 2035]

Under the most conservative investment scenario, number 1, expenditures would flat-line at £800 million per annum. Due to inflation, however, this would mean a reduction in spending in real terms, with the result that the U.K.’s flood defences would not keep up with rising climate change-related flood risk. As a result, an additional 330,000 homes would be added to the ‘significant’ and ‘moderate’ risk categories by the year 2035. By contrast, under the more free-spending scenarios, the number of properties ‘at risk’ would fall.

The preferred scenario was Scenario 4, which would give a benefit-to-cost ratio of seven. In the words of the report:

It provides the greatest overall benefit to society, because it generates the greatest net return on investment. However, with this scenario, spending needs to increase from the £570 million asset maintenance and construction budget in 2010-2011 to around £1,040 million by 2035, plus inflation. This equates to an increase in investment in asset construction and maintenance of around £20 million plus inflation each and every year.

[Figure: costs and benefits of the flood defence investment scenarios]

So what actually happened? Well, for a start, the Lehman shock. Then, the global economy fell into its worst recession since the Great Depression. Subsequently, David Cameron replaced Gordon Brown in May 2010, and the new Chancellor of the Exchequer, George Osborne, pledged the party to a policy of austerity. And the flood defence agenda was not immune from the cuts.

The House of Commons Library has all the up-to-date annual expenditure numbers in a research briefing called Flood Defence Spending in England. This shows that spending fell in real terms under the coalition government:

[Figure: flood defence spending in England, House of Commons Library]

Accordingly, the expenditure path since the 2009 report has been lower even than Scenario 1. In other words, spending fell in both real and nominal terms, against a recommended incremental increase in real terms of £20 million per year.
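To make the real-versus-nominal point concrete, here is a minimal sketch. The 2.5% inflation rate is my own assumption for illustration; the Environment Agency’s scenarios are more detailed (the report quotes a construction budget of around £1,040 million by 2035, slightly below the simple straight-line path below).

```python
# Toy comparison, in constant 2010 prices, of a frozen nominal budget
# (Scenario 1) against a budget growing by £20m in real terms each year
# (Scenario 4). Inflation rate assumed for illustration only.

INFLATION = 0.025
FLAT_NOMINAL = 800.0       # total annual spend frozen at £800m nominal
CONSTRUCTION_2010 = 570.0  # 2010-11 asset construction/maintenance budget, £m

for years_on in (0, 5, 10, 15, 20, 25):
    deflator = (1 + INFLATION) ** years_on
    flat_real = FLAT_NOMINAL / deflator              # what the frozen budget buys
    scenario4 = CONSTRUCTION_2010 + 20.0 * years_on  # +£20m real every year
    print(f"{2010 + years_on}: frozen £800m buys £{flat_real:.0f}m of 2010 defences; "
          f"Scenario 4 construction budget £{scenario4:.0f}m")
```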

Would a different expenditure path have made much difference to the flood-related losses currently being experienced? Probably not. The kind of precipitation seen over 2013/14 would have required vast expenditures to defend against. But the path taken has certainly increased future risk: each yearly incremental shortfall leads to a growing overall flood defence deficit.

The key question, then, is how the current flood defence expenditure path and flood trend will translate into future losses. The political battering the coalition government is currently taking has already led to something of a U-turn, with a number of pledges being made to increase flood-defence spending. These include:

[Figure: government spending pledges in response to the 2014 winter floods]

However, we have yet to see how such spending fits into the original flood defence investment recommendations made by the Environment Agency.

Ironically, as parts of Britain were drowning, the new public-private flood insurance agreement was winding its way through Parliament as part of the Water Bill. A summary can be found here and a useful Q&A is available at the Association of British Insurers’ site here.

As with the previous Statement of Principles, the new mechanism relies on a cross-subsidy from low-risk properties to high-risk properties, although it is all organised through a new legal entity: Flood Re. In other words, those properties not at risk of flood pay slightly more in insurance premiums than their actual risk profile would warrant, and these premiums build up to become the capital of Flood Re.
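A stylised sketch of that cross-subsidy is below. Every figure (the levy, the premium cap, the policy counts and the expected claims) is invented for illustration; the real scheme’s parameters are set out in the Water Bill documents linked above.

```python
# Stylised Flood Re-type pool: every policy pays a small levy; high-risk homes
# pay a capped premium while the pool bears their expected flood claims.

LEVY = 10.50             # flat levy per home policy, £ (invented)
PREMIUM_CAP = 210.0      # capped flood premium per high-risk home, £ (invented)
EXPECTED_CLAIMS = 950.0  # expected annual flood claims per high-risk home, £ (invented)

n_low_risk = 1_000_000   # policies that only pay the levy (invented count)
n_high_risk = 5_000      # homes ceded into the pool (invented count)

levy_income = LEVY * (n_low_risk + n_high_risk)
premium_income = PREMIUM_CAP * n_high_risk
expected_claims = EXPECTED_CLAIMS * n_high_risk

surplus = levy_income + premium_income - expected_claims
print(f"Expected annual surplus building the pool's capital: £{surplus:,.0f}")
```

In good years the surplus accumulates as capital; a bad year, or two 2007-scale floods in quick succession, can wipe it out, which is where the government’s comfort letter discussed below comes in.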

Nonetheless, every household subject to any kind of flood risk must be aware of the many caveats to the scheme. Principally:

  • Flood Re’s risk numbers are based on the Environment Agency’s “Flooding in England: A National Assessment of Flood Risk” which came out in 2009, but is based on data for 2008. Accordingly, all the property at risk numbers are out of date and do not adequately take account of climate change.
  • This is a transitional arrangement slated to run for 20 to 25 years. At the end of this period, market-based pricing is supposed to rule. Given the climate change trend, this means that ‘at risk’ households will be subject to a) major insurance premium increases in future or b) no availability of flood insurance at all. Moreover, the property market will not wait until the end of the scheme, but will mark valuations down well in advance; remember, no insurance means no access to a mortgage.
  • High-end houses in Council Tax Band H are not included. So all those Thames-side mansions currently under water are on their own.
  • Properties built after 2009 do not fall under the scheme.
  • Businesses and buy-to-let properties are not covered.
  • ‘Uninsurable’ properties will not be included. This is suitably vague so as to provide a get-out clause for the industry. It will be interesting to see if all those Thames-side properties flooded in both the 2012/13 and 2013/14 winters will be classed as ‘uninsurable’.
  • The government has provided a comfort letter, but has not subjected itself to a legal requirement to support the scheme should it fall into financial distress. It promises to backstop the scheme if a one-in-200-year event takes place, giving rise to a loss of £2.5 billion or more (note that 2007 would fall into this category, with a loss of £3 billion). With climate change progressing, however, what is a one-in-200-year event now will not be a one-in-200-year event in future.

To support my assertion that Flood Re is currently being designed around dodgy ‘at risk’ numbers, I refer you to a paper by Swenja Surminski and colleagues from the Grantham Research Institute on Climate Change and the Environment critiquing Flood Re, which can be found here. Two bullet points from the report summary stand out:

[Figure: summary bullet points from the Grantham Research Institute paper on Flood Re]

Note that Dr Surminski was the Association of British Insurers’ Climate Change Advisor from 2007 to 2010.

Against this background, if we get a couple of 2007-type floods in quick succession in the relatively near future (i.e., before Flood Re can build up capital), then the government would likely be expected to act on its comfort letter and recapitalise Flood Re.

At some point, however, the government’s interest in excluding ‘uninsurable’ properties will fall into line with that of the insurance industry—otherwise the government could get into the untenable position of bailing out property owners whose houses are flooded a number of times per decade. Alternatively, the government may have to invest massive amounts of money in flood defence to protect these properties. But for many properties, the flood defence expense would dwarf the value of the properties themselves, making abandonment the most sensible policy choice.
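One way to see why abandonment can win out is a simple present-value comparison. All the figures below are invented; only the 3.5% discount rate echoes the Green Book convention mentioned earlier.

```python
# Toy comparison: defend an isolated, frequently flooded property or not?
property_value = 180_000.0    # £ (invented)
defence_cost = 450_000.0      # up-front engineering cost, £ (invented)
annual_flood_loss = 15_000.0  # expected yearly loss if undefended, £ (invented)
discount_rate = 0.035         # Green Book standard rate
horizon = 50                  # years

# Present value of the losses the defence would avoid.
pv_avoided = sum(annual_flood_loss / (1 + discount_rate) ** t
                 for t in range(1, horizon + 1))

print(f"PV of avoided losses: £{pv_avoided:,.0f}")
print(f"Defence cost: £{defence_cost:,.0f}; property value: £{property_value:,.0f}")
# Here the defence costs more than both the avoided losses and the property
# itself, so buying out or abandoning the property is the cheaper option.
```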

The bottom line for all this is that Flood Re will likely provide only a temporary respite for most holders of ‘at risk’ properties. As climate change progresses, more and more ‘at risk’ properties will be shunted out of Flood Re and into the ‘uninsurable’ category. And for many properties, flood defence may just prove financially unfeasible. Don’t say you weren’t warned.