Category Archives: Climate Change

Data Watch: UAH Global Mean Temperature March 2014 Release

On April 7th, Dr Roy Spencer released the University of Alabama-Huntsville (UAH) global average lower tropospheric temperature anomaly as measured by satellite for March 2014.

The anomaly refers to the difference between the current temperature reading and the average reading for the period 1981 to 2010 as per satellite measurements.

March 2014: Anomaly +0.17 degrees Celsius

This is the joint 7th warmest March temperature recorded since the satellite record was started in December 1978 (35 March observations). The warmest March to date over this period was in 2010, with an anomaly of +0.57 degrees Celsius.

The El Nino Southern Oscillation (ENSO) cycle is the main determinant of when global mean temperature hits a new record over the medium term (up to 30 years). In this connection, the U.S. government’s Climate Prediction Center is now giving a 50% chance of an El Nino developing this summer or fall (here). Should this happen, I would expect the UAH anomalies to head back up into the 0.5s, 0.6s or higher. The next update is on the 10th of April.

As background, five major global temperature time series are collated: three land-based and two satellite-based. The terrestrial readings are from NASA GISS (Goddard Institute for Space Studies), HadCRU (Hadley Centre/Climatic Research Unit in the U.K.), and NCDC (National Climatic Data Center). The lower-troposphere temperature satellite readings are from RSS (Remote Sensing Systems, data not released to the general public) and UAH (Univ. of Alabama at Huntsville).

The most high profile satellite-based series is put together by UAH and covers the period from December 1978 to the present. Like all these time series, the data is presented as an anomaly (difference) from the average, with the average in this case being the 30-year period from 1981 to 2010. UAH data is the earliest to be released each month.

The official link to the data at UAH can be found here, but most months we get a sneak preview of the release via the climatologist Dr Roy Spencer at his blog.

Spencer, and his colleague John Christy at UAH, are noted climate skeptics. They are also highly qualified climate scientists, who believe that natural climate variability accounts for most of recent warming. If they are correct, then we should see some flattening or even reversal of the upward trend within the UAH temperature time series over a long time period. To date, we haven’t (click for larger image).

UAH March 14 jpeg

That said, we also haven’t seen an exponential increase in temperature either, which would be required for us to reach the more pessimistic temperature projections for end of century. However, the data series is currently too short to rule out such rises in the future. Surprisingly, The Economist magazine has just published a very succinct summary of the main factors likely accounting for the recent hiatus in temperature rise (here).

One of the initial reasons for publicising this satellite-based data series was due to concerns over the accuracy of terrestrial-based measurements (worries over the urban heat island effect and other factors). The satellite data series have now been going long enough to compare the output directly with the surface-based measurements. All the time series are now accepted as telling the same story (for a fuller mathematical treatment of this, see Tamino’s post at the Open Mind blog here).

Note that the anomalies produced by different organisations are not directly comparable since they have different base periods. Accordingly, to compare them directly, you need to normalise each one by adjusting them to a common base period.
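As a rough illustration of how this normalisation works, here is a minimal Python sketch. The anomaly values, years and base period below are invented for illustration only; they are not actual UAH or GISS figures.

```python
# Minimal sketch: re-baselining two anomaly series to a common base period.
# The numbers below are invented; real series (UAH, GISS, HadCRUT, etc.) are
# published against different base periods, so each must be shifted by its
# own mean over the chosen common period before direct comparison.

# Hypothetical annual anomalies (degrees C), keyed by year.
uah_like  = {2008: 0.05, 2009: 0.26, 2010: 0.41, 2011: 0.13, 2012: 0.17}   # base 1981-2010
giss_like = {2008: 0.54, 2009: 0.66, 2010: 0.72, 2011: 0.61, 2012: 0.65}   # base 1951-1980

def rebaseline(series, base_years):
    """Shift a series so that its mean over base_years is zero."""
    base_mean = sum(series[y] for y in base_years) / len(base_years)
    return {y: round(v - base_mean, 3) for y, v in series.items()}

common_base = range(2008, 2013)  # a common base period both series cover
print(rebaseline(uah_like, common_base))
print(rebaseline(giss_like, common_base))
# After re-baselining, the two series can be compared directly: the offset
# caused by their different original base periods has been removed.
```

Once both series are expressed against the same base period, any remaining differences reflect the measurements themselves rather than an accounting convention.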

Links for the Week Ending 6 April 2014

  • The second instalment of The Intergovernmental Panel on Climate Change’s (IPCC) Fifth Assessment Report (AR5), titled “Impacts, Adaptation and Vulnerability”, was released in Tokyo on the 31st March and can be found here. The “Summary for Policymakers” can be downloaded here. On page 19 of the Summary, the IPCC states that “the incomplete estimates of global annual economic losses for additional temperature increases of around 2 degrees Celsius are between 0.2 and 2.0% of income (± one standard deviation around the mean)”, with the risk of higher rather than lower losses. The report then goes on to say “Losses accelerate with greater warming, but few quantitative estimates have been completed for additional warming around 3 degrees Celsius or above”. Given that it looks almost impossible to constrain warming to 2 degrees Celsius on the current CO2 emission path and with the installed fossil fuel energy infrastructure base, the world really is heading into unknown territory of climate risk.
  • A key area of economic loss from climate change relates to drought. To date, most models have focussed on precipitation as the principal driver of drought. A new paper by Cook et al in the journal Climate Dynamics titled “Global Warming and Drought in the 21st Century” gives greater emphasis to the role of evaporation (more technically, potential evapotranspiration or PET) in drought. Through better modelling of PET, the paper sees 43% of the global land area experiencing significant dryness by end of 21st century, up from 23% for models that principally looked at precipitation alone. A non-technical summary of the paper can be found here.
  • Meanwhile, the general public has lapsed back into apathy around the whole climate change question, partially due to the hiatus period in temperature rise we are currently experiencing. However, evidence is slowly mounting that we could be about to pop out of the hiatus on the back of a strong El Nino event (periods of high global temperature are linked to El Ninos). Weather Underground has been doing a good job of tracking this developing story, with another guest post from Dr. Michael Ventrice (here) explaining the major changes in the Pacific Ocean that have taken place over the last two months and which are setting us up for an El Nino event later in the spring or summer.
  • Changing subject, The Economist magazine ran a special report last week on robotics titled “Immigrants from the Future”. In some ways, I came away less, rather than more, impressed by the capabilities of the existing generation of robots.
  • I often blog on happiness issues (most recently here). This may seem strange for a blog whose stated focus is on such global risks as resource depletion and climate change, but I don’t see the contradiction. For me, much of our striving to extract and burn as much fossil fuel as possible comes through the pursuit of goals that don’t necessarily make us happier. A new book by Zachary Karabell titled “The Leading Indicators” adds a new dimension to this argument. Karabell argues that over the last century or so we have created a series of statistics that are more than pure measurements of economic success. In short, they are more ideology-laden than ideology-free. Political parties set out their manifestos around a mishmash of economic achievements and goals: GDP, unemployment, inflation, the trade balance, interest rates, the strength of the national currency, and so on and so forth. But these numbers encapsulate only part of well-being, yet such statistics totally dominate political discourse because that is how we have been taught to keep score in a modern capitalist economy. As we career towards extremely dangerous climate change, I think it is time that we recognise these economic indicators for what they frequently have become: false gods. Karabell has an article in The Atlantic setting out the book’s main ideas here and there is a good review in The Week here.
  • Rising inequality has been one of the major economic developments of the past 40 years. I am a great fan of the World Bank economist Branko Milanovic, who wrote a wonderful book called “The Haves and Have-Nots: A Brief and Idiosyncratic History of Global Inequality”, in which he pulls together many strands of the inequality literature within a global context. I blogged on this once here. A nice complement to this book is the new web site titled Chartbook of Economic Inequality, which has been put together by two academic economists, Anthony Atkinson and Salvatore Morelli. If you like infographics, you will love this site.

Links for the Week Ending 9 March 2014

Apologies for the late posting of this week’s links. Has been a crazy week.

  • For those of a non-business background, any reference to The Economist magazine with respect to climate change may appear strange. Who cares what The Economist writes on the subject? I would beg to disagree. Few, if any, senior business executives will read posts on Real Climate or Skeptical Science, let alone academic articles on the subject. For English speakers, most climate change commentary will come out of the pages (much of which will, of course, be online these days) of The Wall Street Journal, The Financial Times, other serious non-financial dailies like The New York Times in the U.S. and The Telegraph in the U.K., a motley collection of weeklies like Forbes, and, of course, The Economist. And The Economist is rather special in terms of its reach into board rooms across the globe (and for that matter cabinet offices). For example, Playboy Magazine once asked Bill Gates what he reads. The answer: “The Economist, every page”. A year ago, The Economist wrote an extended article on the global warming ‘hiatus’ that, I thought, gave too much weight to a few studies suggesting that climate sensitivity was far lower than previously thought (here, free registration). This week, however, the magazine made amends by publishing an excellent piece titled “Who pressed the pause button?” on the so called ‘hiatus’ in temperature rise. It ended with this statement:  “Most of the circumstances that have put the planet’s temperature rise on “pause” look temporary. Like the Terminator, global warming will be back.”
  • Talking of ‘The Terminator’, The Guardian carries an interview with the Crown Prince of techno-optimists and Google geek in chief Ray Kurzweil. God help us if anyone actually believes this stuff.
  • Up the road from me in Oxford is the NGO Climate Outreach and Information Network (COIN). Its founder George Marshall has an interesting blog that looks at the narratives surrounding climate change. In a post called “How the Climate Change Messengers Became Blamed for the Floods” he deconstructs the media’s reaction to the recent U.K. floods. It’s somewhat depressing stuff.
  • One of the sharpest observers of the shale hype has been the petroleum geologist Art Berman. He has a site called The Petroleum Truth Report, but, frustratingly, doesn’t keep it current. Fortunately, he has just given a new interview with Oilprice.com updating us on his recent thinking. The interview is full of gems such as this: “Oil companies have to make a big deal about shale plays because that is all that is left in the world. Let’s face it: these are truly awful reservoir rocks and that is why we waited until all more attractive opportunities were exhausted before developing them. It is completely unreasonable to expect better performance from bad reservoirs than from better reservoirs.” I highly recommend you read the whole thing.
  • The economist Noah Smith writes a lively blog called Noahpinion. In this post he makes some keen observations on the ‘jobs and robots’ debate, while in this article in The Week he compares America’s decline with the collapse of the Ming Dynasty.

Data Watch: UAH Global Mean Temperature February 2014 Release

On March 5th, Dr Roy Spencer released the University of Alabama-Huntsville (UAH) global average lower tropospheric temperature anomaly as measured by satellite for February 2014.

The anomaly refers to the difference between the current temperature reading and the average reading for the period 1981 to 2010 as per satellite measurements.

February 2014: Anomaly +0.17 degrees Celsius

This is the 10th warmest February temperature recorded since the satellite record was started in December 1978 (35 February observations). The warmest February to date over this period was in 1998, with an anomaly of +0.65 degrees Celsius due to the super El Nino that year.

The El Nino Southern Oscillation (ENSO) cycle is the main determinant of when global mean temperature hits new records over the medium term (up to 30 years). In this connection, the U.S. government’s Climate Prediction Center is now giving a 50% chance of an El Nino developing this summer or fall (here). Should this happen, I would expect the UAH anomalies to head back up into the 0.5s, 0.6s or higher.

As background, five major global temperature time series are collated: three land-based and two satellite-based. The terrestrial readings are from NASA GISS (Goddard Institute for Space Studies), HadCRU (Hadley Centre/Climatic Research Unit in the U.K.), and NCDC (National Climatic Data Center). The lower-troposphere temperature satellite readings are from RSS (Remote Sensing Systems, data not released to the general public) and UAH (Univ. of Alabama at Huntsville).

The most high profile satellite-based series is put together by UAH and covers the period from December 1978 to the present. Like all these time series, the data is presented as an anomaly (difference) from the average, with the average in this case being the 30-year period from 1981 to 2010. UAH data is the earliest to be released each month.

The official link to the data at UAH can be found here, but most months we get a sneak preview of the release via the climatologist Dr Roy Spencer at his blog.

Spencer, and his colleague John Christy at UAH, are noted climate skeptics. They are also highly qualified climate scientists, who believe that natural climate variability accounts for most of recent warming. If they are correct, then we should see some flattening or even reversal of the upward trend within the UAH temperature time series over a long time period. To date, we haven’t (click for larger image).

UAH Satellite Temperatures Feb 14 jpeg

That said, we also haven’t seen an exponential increase in temperature either, which would be required for us to reach the more pessimistic temperature projections for end of century. However, the data series is currently too short to rule out such rises in the future. Surprisingly, The Economist magazine has just published a very succinct summary of the main factors likely accounting for the recent hiatus in temperature rise.

One of the initial reasons for publicising this satellite-based data series was due to concerns over the accuracy of terrestrial-based measurements (worries over the urban heat island effect and other factors). The satellite data series have now been going long enough to compare the output directly with the surface-based measurements. All the time series are now accepted as telling the same story (for a fuller mathematical treatment of this, see Tamino’s post at the Open Mind blog here).

Note that the anomalies produced by different organisations are not directly comparable since they have different base periods. Accordingly, to compare them directly, you need to normalise each one by adjusting them to a common base period.

Links for the Week Ending 2 March 2014

  • Martin Wolf has been revisiting the robots and jobs topic over the past few weeks in a couple of articles in The Financial Times here and here (free access after registration). This is a theme I have been addressing a lot recently in a series of posts starting here. Wolf finishes his last article with the observation that technology does not always have to shape institutions; it should be the other way around: “A form of techno-feudalism is unnecessary. Above all, technology itself does not dictate the outcomes. Economic and political institutions do. If the ones we have do not give the results we want, we must change them.” I agree, but this will not be easy.
  • I have also just discovered a fascinating blog that pulls together articles on the new robot economy called RobotEnomics (sic). For example, check out this post on the economic implications of driverless cars.
  • California has experienced significant rainfall over the last few days. The latest Drought Monitor (released weekly) doesn’t capture this rainfall, so we should see some slight improvement when the next update comes out. Critically though, California’s water bank—its high mountain snow pack—is still running at around 20% of average. You can see the end-month figures as measured by the Department of Water Resources here and an article giving background to the snowpack here. Mother Jones has some nice graphics on the crops being hurt by the drought here, while The Atlantic has a very interesting (and very long) article on the history and future of California’s massive water engineering projects here.
  • Here I go again: linking to the March 1998 Campbell and Laherrere article titled “The End of Cheap Oil” in Scientific American. The authors ended the article with this sentence “The world is not running out of oil—at least not yet. What our society does face, and soon, is the end of the abundant and cheap oil on which all industrial nations depend.” Average price of Brent crude in 1998: $13.2 per barrel, equivalent nowadays to around $19 after adjusting for inflation. Brent now: $109 per barrel. But isn’t fracking going to give us an endless supply of cheap oil?  Here is an article in Bloomberg titled “Dream of Oil Independence Slams Against Shale Costs”. In other words, Campbell and Laherrere continue to be proved right and the energy cornucopians continue to be proved very wrong.
  • For technological optimists the dream is for a transformational technology that can permanently alter the energy supply equation. Fusion has always been one such hope, but forever decades away from commercial development. The New Yorker has just published a superb article called “A Star in a Bottle” on the International Experimental Thermonuclear Reactor (ITER) being built in France. The audacity and scope of the project is extraordinary. Yet my takeaway from the article is that fusion provides little hope of providing a timely saviour with respect to either climate change or fossil fuel depletion.

U.K. Floods: If Only They Had Dredged?

Over the last couple of months, we have seen a pack of welly-booted U.K. journalists from News at 10, Newsnight and Channel 4 News interviewing a series of rather depressed and bedraggled flood-victims of Somerset and the Thames Valley. A good proportion of these have been parroting The Daily Mail‘s claim that it is all the Environment Agency’s fault through a lack of dredging.

Is it true?

Let’s look at an Environment Agency presentation called “River Dredging and Flood Defence: To Dredge or Not to Dredge“. To start with, we have a river channel and an adjacent floodplain.

EA Floodplain jpeg

When we have extreme precipitation, the channel overflows into the floodplain.

EA What Happens jpeg

The critical question for the Environment Agency relates to the relationship between the flow of water that can be accommodated in the channel and the flow of water that can be contained within the adjacent floodplain during an extreme weather event.

EA Change to River jpeg

And in the case of the January floods, we do have data to suggest that river flow has been setting records and so could not have been accommodated in most river channels unless extraordinarily aggressive and expensive dredging had taken place. The chart below is from the Centre for Ecology and Hydrology’s January 2014 Hydrological Summary (click for larger image):

River Flow jpeg

The black circles indicate exceptionally high flow, and the arrows a new record. So the Thames, for example, recorded a flow 263% above the long-term average in January (you may need to follow the link to the monthly report to see this clearly).

The Environment Agency then goes on to point out that if you want to keep water off the floodplain, you would need both to deepen and widen river channels and to repeat the dredging at regular intervals to prevent silting. This all costs money, but it can be done.
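To see why both deepening and widening matter, here is a rough Python sketch applying Manning's formula for open-channel flow to a simplified rectangular channel. The channel dimensions, slope and roughness coefficient are invented for illustration and are not taken from any Environment Agency modelling.

```python
# Rough sketch: how dredging (deepening/widening) raises channel capacity.
# Manning's formula for a rectangular channel: Q = (1/n) * A * R^(2/3) * S^(1/2)
# where A = flow area, R = hydraulic radius (A / wetted perimeter),
# S = channel slope, n = roughness coefficient. All numbers are hypothetical.

def channel_capacity(width_m, depth_m, slope=0.0002, n=0.035):
    area = width_m * depth_m                    # cross-sectional flow area (m^2)
    wetted_perimeter = width_m + 2 * depth_m    # bed plus both banks (m)
    hydraulic_radius = area / wetted_perimeter
    return (1 / n) * area * hydraulic_radius ** (2 / 3) * slope ** 0.5  # m^3/s

before = channel_capacity(width_m=20, depth_m=2.0)   # silted channel
after = channel_capacity(width_m=25, depth_m=3.0)    # dredged deeper and wider
print(f"capacity before dredging: {before:.0f} m3/s")
print(f"capacity after dredging:  {after:.0f} m3/s ({after / before:.1f}x)")
# The gain is real but bounded: if the incoming flow is several times the
# enlarged capacity, the excess still spills onto the floodplain.
```

With these made-up dimensions the dredged channel carries roughly two and a half times the flow, which is exactly the agency's point: capacity can be bought, but only by making the channel substantially bigger and keeping it that way.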

In my former home of Japan, flood defence has to account for deluges following typhoons. Accordingly, the system must cope with irregular floods that are many multiples of the long-term average river flow. As a result, the country is covered in massive concreted river channels through which just a trickle of water flows for many years—until a big typhoon hits. It is not pretty flood defence. You could do this to the Thames, but it certainly would no longer look like the Thames of Jerome K. Jerome or Kenneth Grahame by the time such work had finished.

Moreover, you can’t modify the river channel at one point and not at others. If you did, water flow would hit pinch points, which are frequently structures like bridges.

Pinch Points jpeg

The agency also directly contradicts an assertion made by Christopher Booker of the Daily Telegraph. Booker claimed that if only dredgers were allowed to dump silt on the river banks costs could be kept down. The dreaded EU is supposed to prevent such a common sense action for environmental reasons. As with all Booker’s writings, anything that doesn’t fit the story is, of course, left out; in particular, the common sense result of dumping silt on river banks is to raise the floodplain and reduce the volume of water that can be held therein.

Raising the Floodplain jpeg

Against this background, let’s return to The Daily Mail’s accusation that non-dredging “was what did it”. Below is an example of the reportage. As an aside, almost all the U.K. climate skeptic web sites have been carrying these photos as well. The Daily Telegraph also ran with them in an article here. They specifically relate to the Somerset Levels.

Daily Mail Dredging jpeg

What I don’t like about these ‘before and after’ photos is that they are taken from different angles. In the second photo, we can’t see the span of the river channel since the left bank blocks our view. However, there does appear to be some build-up on the right bank, as evidenced by the partially obscured circular hole. Nonetheless, we can’t calculate the change in flow capacity from these ‘before and after’ pictures alone.

We also don’t know exactly when the original photo was taken: only in the ’1960s’. We do know, however, that the Somerset Levels had a massive flood in 1960. You can see this in a County Gazette 50th anniversary article here, with a series of photos here. There were also, incidentally, major floods in the area in 1951, and these led to comprehensive flood defence works—which obviously weren’t sufficient to cope with the rainfall in 1960.

The Daily Telegraph story with the ‘before and after’ photos also includes this assertion from a local pro-dredging group Flooding on the Somerset Levels Action Group (FLAG).

A spokesman for the FLAG group has got hold of meticulous rainfall records for the area around the Parrett and Tone for the last 20 years.

They reveal between December 1993 and February 1994 around 20 inches of rain fell – five inches less than during the same time this year.

A spokesman for the group said: “So roughly the same rainfall but far more flooding now.

“What has changed? Dredging seems to be the biggest obvious difference between then and now.”

Based on FLAG’s figures, 20 inches fell in 1993/94 against 25 inches on this occasion. This is not really “roughly the same rainfall”; 2013/14 is 25% higher than 1993/94 on their numbers. Given we experience flooding when a river channel reaches capacity, this 25% by itself looks significant.

Moreover, we don’t have access to FLAG’s data, but we do have access to rainfall records via the Met Office for the nearby Yeovilton weather station. Yeovilton is approximately 18 miles from Burrowbridge, which is in turn in the heart of the Parrett/Tone area and the site of the bridge photographed above. Yeovilton is also 5 miles north of Yeovil in the Environment Agency map below (although not marked) and firmly inside the Parrett Catchment Flood Management Plan (CFMP).

Parrett CFMP jpeg

In December 1993 rainfall at Yeovilton was 117.9 mm, followed by 99.8 mm in January 1994 and 81.1 mm in February 1994. That comes out at 298.8 mm, or roughly 11.8 inches, although I admit it is very difficult to tell whether we are comparing apples and pears here. We don’t yet have Yeovilton data for the current flooding episode (it is released with a lag), but when it comes out we will have a direct comparison of 1993/94 and 2013/14 rainfall.
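As a cross-check on those numbers, here is a short Python snippet that totals the Yeovilton monthly figures quoted above and restates FLAG's comparison. Only the monthly values and FLAG's two totals come from the text; everything else is simple unit conversion.

```python
# Quick check of the Yeovilton winter 1993/94 total and the FLAG comparison.
# Monthly values are those quoted in the text; 1 inch = 25.4 mm exactly.

monthly_mm = {"Dec 1993": 117.9, "Jan 1994": 99.8, "Feb 1994": 81.1}

total_mm = sum(monthly_mm.values())
total_inches = total_mm / 25.4

print(f"Winter 1993/94 at Yeovilton: {total_mm:.1f} mm = {total_inches:.1f} inches")

# FLAG's figures for the Parrett/Tone area (not the same station):
flag_1993_94 = 20.0   # inches
flag_2013_14 = 25.0   # inches
print(f"2013/14 vs 1993/94 on FLAG's numbers: "
      f"{(flag_2013_14 / flag_1993_94 - 1) * 100:.0f}% more rain")
```

Note that the Yeovilton station total for 1993/94 is well below FLAG's 20 inches for the Parrett and Tone, which underlines the difficulty of comparing rainfall figures drawn from different locations.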

Interestingly, while the Somerset Levels have been badly affected by the current floods, centres of population that are deemed at risk have mostly come through unscathed. A June 2013 report on the Parrett CFMP pinpointed the larger towns susceptible to flooding, none of which have been hit hard this time around:

One Percent Chance of Flooding jpeg

So Environment Agency flood defence work to date can be seen as quite successful in terms of protecting built-up areas in the Parrett CFMP. It has been less successful in protecting isolated villages and rural areas, but to a large degree this was due to choice rather than neglect.

At this point, I will stress that the Environment Agency’s own work does show that dredging would have a substantial impact on reducing the number of days villages and agricultural land remain under water. The charts below have been going around all the usual climate skeptic blogs and are taken from an Environment Agency hydraulic modelling study that can be found here:

Drainage Model Parrett jpeg

The charts show a far quicker recovery from flooding events after theoretical dredging, although in the case of Curry Moor the peak flood level is very similar before and after the dredging.

So is that game, set and match to the pro-dredgers? Not quite. For a start, we should always keep in mind that the Somerset Levels are unique in terms of their hydrological challenges and don’t provide any real lessons for major rivers such as the Thames. But far more important is the fact that the same climate skeptic blogs that printed the above chart rarely print this one (source here):

Parrett and Tone Cost Benefit jpeg

The above charts show benefits and costs under different assumptions. So the choice to dredge could have been made, but the benefit-to-cost ratio (BCR) would be somewhere around one to one.
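For readers who want to see what sits behind a figure like that, here is a minimal sketch of a benefit-to-cost ratio calculation. The cost and benefit figures are invented; only the 3.5% discount rate, which the Treasury's Green Book prescribes for the first 30 years of an appraisal, reflects actual guidance.

```python
# Illustrative benefit-to-cost ratio for a hypothetical dredging scheme.
# All monetary figures are invented; real appraisals discount benefits and
# costs over the scheme's life at the Green Book rate of 3.5% per year.

DISCOUNT_RATE = 0.035
YEARS = 20                       # assumed appraisal period

capital_cost = 4_000_000         # up-front dredging (pounds, hypothetical)
annual_maintenance = 300_000     # repeat dredging to stop re-silting (hypothetical)
annual_damage_avoided = 550_000  # expected flood damage avoided per year (hypothetical)

def present_value(annual_amount, rate, years):
    """Discount a constant annual amount back to today."""
    return sum(annual_amount / (1 + rate) ** t for t in range(1, years + 1))

pv_benefits = present_value(annual_damage_avoided, DISCOUNT_RATE, YEARS)
pv_costs = capital_cost + present_value(annual_maintenance, DISCOUNT_RATE, YEARS)

print(f"PV benefits: {pv_benefits:,.0f}")
print(f"PV costs:    {pv_costs:,.0f}")
print(f"BCR:         {pv_benefits / pv_costs:.2f}")
```

With these made-up numbers the ratio comes out just under one, in line with the "around one to one" figure above: the scheme roughly pays for itself, but no more.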

Is such a project in line with government guidelines? In May 2011, a new approach to the funding of flood and coastal erosion risk projects was unveiled by the Department for Environment, Food and Rural Affairs (DEFRA). You can find details here. The methodology underpinning all government benefit-to-cost ratio calculations can be found in the Green Book, the Treasury’s bible for expenditure decisions.

An evaluation of Defra’s flood defence project decision-making was undertaken by the hydrology specialist JBA Consulting in March 2012 (here); they looked at a series of case studies to see which ones would have received government grants. You can see these here (click for larger image):

JBA Consulting BCR jpeg

The media frequently reports a benefit-to-cost ratio of eight as the hurdle rate for flood projects to go ahead, but, as the table above shows, the methodology is more complex than that. Nonetheless, only three of the six projects would have been eligible for flood defence grant in aid—the ones with partnership funding scores above 100%. These three had benefit-to-cost ratios of 7.23, 8.14 and 4.68, respectively. The Parrett and Tone dredging scheme can’t get anywhere close to these numbers.

In short, David Cameron, who has just announced the recommencement of dredging on the Somerset Levels, has thrown Treasury benefit-to-cost ratio guidance out the window due to the political fallout from the floods. I doubt, however, that this ad hoc response will translate into the longer-term promotion of flood defence projects with low BCRs. Why should flood defences be treated differently from cancer wards, kindergarten places, pedestrian crossings or pollution control? And given that the Treasury has a hard budget constraint, my belief is that they won’t be once the political hullabaloo has subsided.

Meanwhile, climate change has barely begun to change the risk equation. The Environment Agency has this to say about the Parrett catchment in a time of climate change:

Climate Change Parrett jpeg

The 500 mm sea level rise figure for 2100 is possible, but the scientific consensus is now for an upper level of possible sea level rise of one metre or more by century end. Nonetheless, even using the agency’s current, rather conservative assumptions, flood risk for towns in the Parrett catchment jumps significantly:

Current and Future Flood jpeg

My own opinion is this: since climate change is radically altering flood risk, individuals should proactively protect themselves. Expecting the government always to bail you out through dredging, insurance or whatever is, in itself, a high-risk strategy. There is a limit to what the government can do—or afford to do. And when the government reaches such a limit, you are on your own.

Links for the Week Ending 23 February 2014

  • The so-called hiatus period of flat-lining global mean temperatures has certainly been a godsend for the climate skeptic lobby. A lot of this recent change in temperature trend is due to the ENSO cycle: El Nino years are generally hot years, and we haven’t recently had many strong El Ninos. You can see this effect in the NASA chart here. So the next time we get a strong El Nino year, expect to see a new global mean temperature record. When will we see the next one? This intriguing guest post on Jeff Masters’ Wunderblog suggests we may be due for a big El Nino in 2014. If true, expect to see the ‘hiatus period’ disappear from the climate skeptic lexicon.
  • By coincidence, I saw the NASA chart above reproduced in a blog post on Econbrowser by Menzie Chinn. Hamilton and Chinn, who co-author the blog, are two of the most respected economists in the world. Hamilton wrote one of the standard time series texts that a generation of econometricians grew up on. Chinn’s post is titled “Economic Implications of Anthropogenic Climate Change and Extreme Weather“. He takes aim at those who think we can easily adapt to climate change, pointing out that not only will the trend change but also volatility. All of this will cost a lot of money.
  • The global media has picked up on the Californian drought to a certain extent. If you want to track it yourself, click the U.S. Drought Monitor page here. There has been far less coverage, however, of the Brazilian drought; here is a rare piece of coverage by National Public Radio. And what is happening in Brazil is already having an effect on food prices as witnessed by the skyrocketing price of coffee; see here.
  • I have frequently commented that despite rising resource constraints and a productivity slowdown, global GDP growth has motored on at around 3% per annum regardless. This is mostly because China has acted as a growth locomotive for everyone else, offsetting anaemic growth in the U.S., Europe and Japan. So if China’s growth collapses, this will likely mean that global growth takes a step-change downward (the other BRICs and MINTs have their own problems). Having seen Japan’s experience first hand (one day growth, the next day no growth), I have been a huge skeptic of China’s economic model. But to date, the sky has not fallen down. The BBC’s economics correspondent Robert Peston has just produced a short documentary called “How China Fooled the World” that sets out the pessimist’s case and can be found on iPlayer. If you have a problem accessing BBC content, try this link at YouTube here.
  • Most web-based technology favours scale: it facilitates ‘winner takes all’ economics. Think Google and Facebook. Yet it also reduces the cost of information and, potentially, small production runs. This, in turn, favours the so called ‘long tail’. This strange dance between the centrifugal and centripetal forces of information technology is a source of both fragility and resilience as we face resource and climate change challenges. For a slightly different riff on the same theme see this article by the economist Robert Frank in The New York Times.

UK Floods: Don’t Say We Weren’t Warned (Part 2)

In my last post, I looked at the recent history of climate change policy with respect to flood risk in the U.K.; in particular, a series of benchmark reports that starkly set out the flood risk choices the government and British people would have to take.

The realisation that climate change had transformed the flood risk game, together with the fallout from the 2007 floods as documented in The Pitt Review, led the Environment Agency to establish a long-term strategy (2010 to 2035), which was set out in a 2009 report entitled “Investing for the Future: Flood and Coastal Risk Management“.

The report identified 5.2 million households at risk of fluvial flooding (river and coastal), pluvial flooding (surface water and groundwater), or both.

Properties at Risk jpeg

And of the 2.4 million fluvial category, a more detailed breakdown in terms of expected frequency of flood was given.

Fluvial Flood Risk jpeg

Moreover, here are the critical regions for such fluvial flooding:

Properties at Flood Risk by Area jpeg

The media still reports the 5.2 million ‘at risk’ number of properties, despite climate science having moved on since 2009. Indeed, the above figures are all from the Environment Agency’s “Flooding in England: A National Assessment of Flood Risk” which came out in 2009 but is based on data for 2008. In my previous series of posts on flood risk, I pointed out that the current risk categories do not adequately take into account climate change or non-fluvial flooding (i.e., non-river and coastal flooding). Accordingly, all these numbers now look to be significant underestimates.

A few leaks have emerged with respect to the 2013 National Assessment (for example, in The Guardian here), but not enough to get a good impression of how the Environment Agency has moved the numbers around.

More controversially, the 2009 “Investing for the Future” report also gives us some cost-benefit estimates out to 2035. We start with 1.2 million houses at significant or moderate risk and then see how many still fall into these categories 25 years later. The base is £800 million per year divided between £570 million in tangible investment (building new flood defences and maintaining existing ones) and £230 million in intangible investment (mapping, analysis, warning systems and so on). Note that this was a forecast expenditure for 2010/11, since the report was written in 2009.

Flood and Coastal Risk Expenditure jpeg

From this future base period, five different expenditure scenarios were plotted, ranging from conservative to aggressive.

Investment Scenarios jpeg

Under the most conservative investment scenario, number 1, expenditures would flatline at £800 million per annum. Due to inflation, however, this would mean a reduction in spending in real terms, with the result that the U.K.’s flood defences would not keep up with rising climate change-related flood risk. As a result, an additional 330,000 homes would be added to the ‘significant’ and ‘moderate’ risk categories by the year 2035. By contrast, under the more free-spending scenarios, the number of properties ‘at risk’ would fall.
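To see how quickly a cash-flat budget erodes, here is a small sketch. The £800 million figure is from the report; the 2.5% annual inflation rate is an assumption purely for illustration.

```python
# How a cash-flat flood defence budget erodes in real terms.
# The 800 million pound figure comes from the report; the 2.5% annual
# inflation rate is an assumption for illustration only.

budget_nominal = 800   # million pounds per year, held flat in cash terms
inflation = 0.025      # assumed annual inflation

for years_out in (5, 10, 25):
    real_value = budget_nominal / (1 + inflation) ** years_out
    print(f"after {years_out:>2} years: {real_value:.0f}m in today's money "
          f"({(1 - real_value / budget_nominal) * 100:.0f}% less in real terms)")
```

Even at a modest assumed inflation rate, a frozen cash budget buys roughly a fifth less flood defence after a decade and well under half as much by 2035, which is why Scenario 1 implies a growing backlog.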

The preferred scenario was Scenario 4, which would give a benefit-to-cost ratio of seven. In the words of the report:

It provides the greatest overall benefit to society, because it generates the greatest net return on investment. However, with this scenario, spending needs to increase from the £570 million asset maintenance and construction budget in 2010-2011 to around £1,040 million by 2035, plus inflation. This equates to an increase in investment in asset construction and maintenance of around £20 million plus inflation each and every year.

Cost Benefit Flood Investment jpeg

So what actually happened? Well, for a start, the Lehman shock. Then, the global economy fell into its worst recession since the Great Depression. Subsequently, David Cameron replaced Gordon Brown in May 2010, and the new Chancellor of the Exchequer, George Osborne, pledged the party to a policy of austerity. And the flood defence agenda was not immune from the cuts.

The House of Commons Library has all the up-to-date annual expenditure numbers in a research briefing called Flood Defence Spending in England. This shows that spending fell in real terms under the coalition government:

Flood Defence Spending jpeg

Accordingly, the expenditure path since the 2009 report has been lower than Scenario 1. In other words, spending fell in both real and nominal terms, against a recommended incremental increase in real terms of £20 million per year.

Would a different expenditure path have made much difference to the flood-related losses currently being experienced? Probably not. The kind of precipitation seen over 2013/14 would have required vast expenditures to defend against. But the path taken has certainly increased future risk: each yearly incremental shortfall leads to a growing overall flood defence deficit.

The key question, then, is how the current flood defence expenditure path and flood trend will translate into future losses. The political battering the coalition government is currently taking has already led to something of a U-turn, with a number of pledges being made to increase flood-defence spending. These include:

Response to 2014 Winter Floods jpeg

However, we are yet to see how such spending fits into the original flood defence investment recommendations made by the Environment Agency.

Ironically, as parts of Britain were drowning, the new public-private flood insurance agreement was winding its way through parliament as part of the Water Bill. A summary can be found here and a useful Q&A is available at the Association of British Insurers’ site here.

As with the previous Statement of Principles, the new mechanism relies on a cross subsidy from low risk properties to high risk properties, although it is all organised through a new legal entity: Flood Re. In other words, those properties not at risk of flood pay slightly more in insurance premiums than their actual risk profile would warrant, and these premiums build up to become the capital of Flood Re.
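A toy model may help make the cross subsidy concrete. All of the premiums, the levy and the household counts below are invented; the real scheme's parameters differ and were still being finalised at the time of writing, with high-risk premiums capped by council tax band.

```python
# Toy model of the Flood Re cross subsidy. Premiums, levy and household
# counts are invented for illustration; they are not the scheme's actual numbers.

low_risk_households = 20_000_000    # hypothetical count paying the levy
high_risk_households = 350_000      # hypothetical count receiving the subsidy

levy_per_policy = 10       # pounds/year added to every policy (assumed)
true_risk_premium = 1_200  # actuarially fair premium for a high-risk home (assumed)
capped_premium = 800       # what the high-risk household actually pays (assumed)

levy_pool = low_risk_households * levy_per_policy
subsidy_needed = high_risk_households * (true_risk_premium - capped_premium)

print(f"levy collected:    {levy_pool / 1e6:.0f}m per year")
print(f"subsidy required:  {subsidy_needed / 1e6:.0f}m per year")
print(f"surplus building Flood Re's capital: {(levy_pool - subsidy_needed) / 1e6:.0f}m per year")
```

The design only works while the levy pool comfortably exceeds the subsidy bill; as climate change pushes more properties into the high-risk pool, the surplus that is supposed to build Flood Re's capital shrinks.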

Nonetheless, every household subject to any kind of flood risk must be aware of the many caveats to the scheme. Principally:

  • Flood Re’s risk numbers are based on the Environment Agency’s “Flooding in England: A National Assessment of Flood Risk” which came out in 2009, but is based on data for 2008. Accordingly, all the property at risk numbers are out of date and do not adequately take account of climate change.
  • This is a transitional arrangement slated to run for 20 to 25 years. At the end of this period, market-based pricing is supposed to rule. Given the climate change trend, this means that ‘at risk’ households will be subject to a) major insurance premium increases in future or b) no availability of flood insurance at all. Moreover, the property market will not wait until the end of the scheme, but will mark valuation down well in advance; remember, no insurance means no access to a mortgage.
  • High-end houses in Council Tax Band H are not included. So all those Thames-side mansions currently under water are on their own.
  • Properties built after 2009 do not fall under the scheme.
  • Businesses and buy-to-let properties are not covered.
  • ‘Uninsurable’ properties will not be included. This is suitably vague so as to provide a get-out clause for the industry. It will be interesting to see if all those Thames-side properties flooded in both the 2012/13 and 2013/14 winters will be classed as ‘uninsurable’.
  • The government has provided a comfort letter, but has not subjected itself to a legal requirement to support the scheme should it fall into financial distress. It promises to backstop the scheme if a one-in-200-year event takes place, giving rise to a loss of £2.5 billion or more (note 2007 would fall into this category with a loss of £3 billion). With climate change progressing, however, what is a one-in-200-year event now will not be a one-in-200-year event in future (the sketch after this list shows how quickly such odds add up even without that trend).
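Here is the quick probability calculation referred to above: the chance of at least one nominally one-in-200-year loss over the scheme's planned life, assuming independent years and a constant annual probability, which is exactly the assumption that climate change undermines.

```python
# Chance of at least one "1-in-200-year" loss during Flood Re's planned life,
# assuming independent years and a constant 1/200 annual probability.

annual_prob = 1 / 200
for horizon in (20, 25):
    p_at_least_one = 1 - (1 - annual_prob) ** horizon
    print(f"over {horizon} years: {p_at_least_one:.1%} chance of at least one such event")

# If the true annual probability has already drifted to, say, 1 in 100:
print(f"over 25 years at 1-in-100: {1 - (1 - 1 / 100) ** 25:.1%}")
```

Even on the static assumption, the comfort letter has a better than one-in-ten chance of being tested over the scheme's lifetime, and any drift in the underlying flood frequency raises those odds quickly.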

To support my assertion that Flood Re is currently being designed around dodgy ‘at risk’ numbers, I refer you to a paper by Swenja Surminski and colleagues from the Grantham Research Institute on Climate Change and the Environment critiquing Flood Re, which can be found here. Two bullet points from the report summary stand out:

Grantham Flood Report jpeg

Note that Dr Surminski was the Association of British Insurers’ Climate Change Advisor from 2007 to 2010.

Against this background, if we get a couple of 2007 type floods in quick succession in the relatively near future (i.e., before Flood Re can build up capital), then the government would likely be expected to act on its comfort letter and recapitalize Flood Re.

At some point, however, the government’s interest in excluding ‘uninsurable properties’ will fall in line with that of the insurance industry—otherwise the government could get into the untenable position of bailing out property owners whose houses are getting flooded a number of times per decade. Or, alternatively, the government may have to invest massive amounts of money in flood defence to protect these properties. But for many properties, the flood defence expense would dwarf the value of the properties themselves, making abandonment the most sensible policy choice.

The bottom line for all this is that Flood Re will likely provide only a temporary respite for most holders of ‘at risk’ properties. As climate change progresses, more and more ‘at risk’ properties will be shunted out of Flood Re and into the ‘uninsurable’ category. And for many properties, flood defence may just prove financially unfeasible. Don’t say you weren’t warned.

UK Floods: Don’t Say We Weren’t Warned (Part 1)

The public’s reaction to the recent extensive flooding in the U.K. has been one of surprise and frustration: surprise at the government’s lack of prior planning for floods, and frustration at the speed and size of the response.

Yet, frankly, there is nothing really to be surprised about. The 2013/14 floods are part of an emerging pattern, which includes the 2003, 2007 and 2012/13 deluges. Accordingly, decisions have not been made in a vacuum; risks have been assessed by the government based on recent flood events and conscious cost-benefit choices made—and there is a paper trail of documents to prove this.

Tackling flood risk is also a classic case of decision-making under uncertainty. Floods have always followed their own probabilistic logic, but climate change has made the range of outcomes far more uncertain. Risk is probability times effect, and climate change is pushing up both flood frequency and severity. This is what both policy-makers and private individuals face.
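To make "probability times effect" concrete, here is a minimal expected-annual-damage sketch for a single hypothetical property. The probabilities and damage figures are invented; real assessments integrate over a full range of return periods and depth-damage curves.

```python
# Risk = probability x effect, in its simplest expected-annual-damage form.
# The flood probabilities and damage figures are invented for a single
# hypothetical property.

# (annual probability of a flood of this severity, damage if it occurs)
scenarios = [
    (1 / 75, 30_000),    # the 'significant' risk threshold used by the EA
    (1 / 200, 80_000),   # rarer but more destructive event
]

expected_annual_damage = sum(p * loss for p, loss in scenarios)
print(f"expected annual damage: {expected_annual_damage:,.0f} pounds")

# Climate change pushes up both terms. A frequency shift alone, say from
# 1-in-75 to 1-in-50, raises the first component by 50% with no change in severity.
shifted = (1 / 50) * 30_000 + (1 / 200) * 80_000
print(f"after frequency shift:  {shifted:,.0f} pounds")
```

The same expected-damage logic, scaled up to whole catchments, is what sits behind the estimated annual damage figures in the reports discussed below.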

The realisation that the U.K. has a problem with respect to flood and climate change moved from academia into government over a decade ago. This post looks at the major reports that have shaped government thinking, while the next looks at how this has translated into the speed of flood-defense roll-out and the extent of insurance provision that will be available going forward.

A decade ago, two benchmark reports were published on flood risk: one a private sector report under the auspices of the Association of British Insurers (ABI) called “The Changing Climate for Insurers”, the other a government-sponsored report called “Future Flooding” from the state’s futurology think tank The Foresight Programme. Four years later, “The Pitt Review” was released in response to the catastrophic floods of 2007. And following a recommendation within the Pitt Review, in 2009 The Environment Agency published its long-term strategy document called “Investing for the Future: Flood and Coastal Risk Management”, which looked out to 2035.

All four reports took climate change as a given. All four reports set out the hard choices that have to be made as climate change loads the dice for flood risk.

Let’s start with the ABI’s “The Changing Climate for Insurers”, the Executive Summary of which can be found here. The report contains a foreword by the then Head of General Insurance of the Association of British Insurers (ABI), John Parker, who was unequivocal about the role of climate change:

Climate change is no longer a marginal issue. We live with its effects every day. And we should prepare ourselves for its full impacts in the years ahead. It is time to bring planning for climate change into the mainstream of business life.

The report suggested that the fingerprints of climate change were already showing up in the data:

Extreme Months jpeg

And the author of the report, Dr Andrew Dlugolecki, went on to put the government on notice that things would have to change:

Climate change will increase the frequency and severity of extreme weather events, as well as longer-term trends in weather. The possibility of weather-related catastrophic losses will be much greater, raising issues for insurers of both insurability and capacity. Insurance can only provide a suitable risk transfer mechanism if risks are kept to a manageable level. The insurance industry has a role on behalf of its customers and shareholders, in highlighting the scale of the risks, and examining steps that are needed by all parties (including the Government) to manage risks and ensure that financial protection remains available for the majority of customers.

He also put private-sector customers on alert that things would change for them too. Specifically, owners of ‘at risk’ properties would need to be prepared for ‘at risk’ insurance pricing. Moreover, ‘at risk’ would be less based on what had gone before than on what is likely to come:

The increasing use of risk pricing could enable future claims to be met, provided risks are assessed accurately. Historic claims and climate data will not provide adequate models, so reliable alternatives need to be developed.

Meanwhile, the government had put its futurology team on the case.  The Foresight Programme (which looks 20 to 80 years into the future) was asked to answer the following two questions:

  • How might the risks of flooding and coastal erosion change in the UK over the next 100 years?
  • What are the best options for Government and the private sector for responding to the future challenges?

The Foresight Flooding and Coastal Defence Report came out in April 2004 and was built around four global growth scenarios: World Markets, National Enterprise, Local Stewardship and Global Sustainability. The Executive Summary can be found here.

World Markets is the global capitalism scenario with minimal mitigation of greenhouse gas emissions; National Enterprise is World Markets with protectionism; Local Stewardship is green utopia; and Global Sustainability is statist in nature.

Based on the four scenarios, Foresight produced estimated annual damage (EAD) figures out to 2080 for both fluvial (river in this case) and coastal flooding as well as what they called intra-urban flooding (pluvial, i.e. rainwater run-off, and groundwater).

Annual Average Flood Damage jpeg

For the World Markets scenario—the dash for growth and ignore carbon emissions one—flood damage is estimated to rise 20-fold by 2080. On a GDP basis the position at first glance looks a lot better with damage only rising by 50%.

Annual Average Flood Damage % of GDP jpeg

Against a current annual average cost of £1.4 billion from floods, the report estimated that £800 million per annum was spent on flood and coastal defences. Moreover, it also calculated that an annual increase in above-inflation spending of £10 to £30 million would be required just to maintain flood risk at its current level.

The report then looked at a suite of policy measures. Some would be at the macro level in terms of carbon emission mitigation, others at the local level such as increased flood defence and flood proofing. Rather optimistically, the report believed that if all their micro measures were enacted, there would be a dramatic reduction in flood-related damage. Note ‘high’ risk means a greater than one in 75 chance of flooding in any given year.

People at Flood Risk jpeg

Finally, Foresight also realised that the private sector would likely take a much more hard-nosed approach to flood insurance going forward, with the result that the government may have to step in to fill a potential insurance void.

The availability of insurance to cover the costs of flood damage will vary depending on changes in risk and society’s ability to pay. Cover could range from a continuation of the current situation to progressive withdrawal of cover for areas at greatest risk of flooding. Government might have to consider how to respond to pressure to act as insurer of last resort if the insurance market withdrew cover from large parts of the UK, or if there was a major flood which the insurance market could not cover.

The issues raised by the 2004 reports were brought into sharp relief when the British insurance industry took a £3 billion hit in 2007 after major fluvial (river and sea) and pluvial (rain) related flooding. The 2007 floods were unusual in a number of ways:

  • The floods took place during the summer
  • Surface water flooding (pluvial) caused as much damage as river and coastal (fluvial)
  • 55,000 houses were flooded and 13 people died
  • Half a million people lost access to mains water and electricity
  • Insurance industry claims reached £3 billion and total damage over £5 billion

In response to the 2007 floods, the government commissioned a report by Sir Michael Pitt, which was published in June 2008 as the Pitt Review. You can find the Executive Summary here.

In conjunction with the review, Sir Michael Pitt requested an update of the Foresight Future Flooding Report that I referred to above. The new Foresight report contained a more pessimistic assessment of climate change impacts on flooding. In particular, it estimated higher sea level rise and rising precipitation leading to more fluvial and pluvial flooding as compared with the 2004 analysis. It did not, however, provide updated figures for estimated annual damage from flood.

Based on the experience of 2007 and the new numbers from Foresight, Pitt recommended that the government commit to a long-term approach to investment in flood risk management, planning up to 25 years ahead. Following Pitt’s recommendation, “Investing for the Future: Flood and Coastal Risk Management” was published by the Environment Agency in 2009, and this has provided the foundation for all cost-benefit decisions regarding flood defence investment since then.

At the same time, Pitt supported the maintenance of the status quo with regard to insurance provision as contained in the Association of British Insurers’ “Statement of Principles”. Under this agreement, the ABI agreed to continue providing insurance to ‘at risk’ households at, in effect, a discounted price. Low risk households would foot the bill, being charged slightly more than their risk profile would warrant. The government, meanwhile, would agree to beef up flood defence, but would not get directly involved in either insurance provision or in back-stopping major catastrophe loss.

The ABI had other ideas. The 2007 floods saw 180,000 claims and insurance payouts of £3 billion. The worry for insurance executives was that if 2007 could produce a £3 billion loss event, then what could happen once climate change really accelerated? The possibility existed of an open-ended loss of inestimable proportions. The insurance industry can absorb large losses by just hiking premiums in future years, but a super-large tail risk loss event is a different matter. At a certain point, insurers can be knocked out of business, and thus not survive long enough to hike future premiums.

Faced with the ABI’s implicit threat to walk away from ‘at risk’ households and leave them uninsured, the government realised Pitt’s idea of a continuation of the status quo was a non-starter—and so was born Flood Re. I will look at this agreement in my next post, along with the cost-benefit choices the government is being forced to make with flood defences.

Flood Risk in the UK: You Ain’t Seen Nothing Yet

Rather frustratingly, I have had little time to blog this week in the midst of unprecedented flooding in some areas of the U.K. However, lots of climate change chickens will come home to roost over the next few decades, and flooding will be prime among them.

I will blog on this topic again as soon as possible (not least on the joint government-insurance industry initiative called Flood Re currently being wheeled out).

Nonetheless, I did a very detailed set of blog posts on UK flood risk last year (here, here, here and here) and I believe the analysis still stands. Flood risk is like the hot potato game at a children’s party: no-one wants to hold the risk and everyone wants to pass it on to someone else.

Moreover, this is a horrible, evolving risk that even industry experts are having trouble pinning down. My advice is that if you can’t quantify a risk, try to avoid it. This is the message of the last blog post in the series which I repeat below:

The National Flood Risk Assessment  (NaFRA) of 2008, conducted by the Environment Agency (EA), calculates that 330,000 properties are at ‘significant’ risk (defined as one in 75 years) of fluvial flood in England. The survey is a bit long in the tooth nowadays, and I expect that if they repeated the assessment exercise today, more houses would fall into the ‘significant’ risk category.

In a similar vein, The Association of British Insurers’ (ABI) submission to the U.K. parliament talks of around 200,000 homes (some 1 to 2 percent of the total housing stock) that would now find it difficult to obtain flood insurance if open market conditions solely determined availability (and if they can’t get insurance, they won’t be able to support a mortgage).

For climate change “skeptics” who believe in free markets, the fact that the British insurance industry takes climate change as a given, and has done so for many years, is a difficult fact to face. In a foreword to a report called “The Changing Climate for Insurers” back in 2004, the ABI’s then Head of General Insurance John Parker was unequivocal:

Climate change is no longer a marginal issue. We live with its effects every day. And we should prepare ourselves for its full impacts in the years ahead. It is time to bring planning for climate change into the mainstream of business life.

What the ABI is doing through requesting the government to create a new insurance arrangement after the expiry of the Statement of Principles agreement in June 2013 is to “prepare ourselves for (climate change’s) full impacts in the years ahead”. We can hardly say we were not warned.

We can also hardly say that climate change is alarmist nonsense or a socialist plot. The insurance industry is about as close to “red in tooth and claw” capitalism as one could get. And the message from Mr. Market in his insurance industry incarnation is very clear: climate change is coming to a place near you very soon—get used to it.

Yet the ABI has blurred the line between uninsurability and unaffordability. Tim Harford, in his Financial Times’ “Undercover Economist” column, sets out three hard-to-insure risks. First, genuine unknown unknowns, where the insurer has no idea of the shape of the frequency or severity distribution. Second, the adverse selection situation, where there is an asymmetry of information acting against the insurer: those who know they are bad risks use their effective insider knowledge to seek out and profit from insurance. Finally, insurance that is just expensive. He puts flood insurance into the final category:

Now the third kind of hard-to-insure risk is stuff that’s expensive and happens quite often. I’m trying to buy a house, I’m nearly 40 and so I’m trying to buy insurance for my family in case I die or become too ill to work. This is perfectly possible: it’s just expensive, because it’s not unusual for middle-aged men to get seriously ill. This sounds like a much better description of allegedly uninsurable homes: if there is a one in five chance of a flood, and a flood is going to cost £50,000, don’t expect to pay less than £10,000 a year for flood insurance.

…..but unaffordability is not uninsurability. It’s insurable but expensive.

If these homes actually were uninsurable the government would need to step in and cut some kind of deal with the insurance industry – exactly the kind of deal that has lasted for the past few years and seems about to unravel. But if the problem is unaffordability, trying to solve it by cutting a deal with the insurance industry is just a way of obscuring what is really going on. The real solution is simple and stark: the government needs to decide whether it wants to pay people thousands of pounds a year to live in high-risk areas or not.

And in austerity Britain, no Chancellor of the Exchequer really wants to shoulder these extra payments. Harford goes on:

If (the government does not want to pay), then people who currently live in flood-risk areas will see the price of their homes collapse…..

……To whatever price would tempt people to live somewhere that was not only prone to distressing and disruptive floods, but was also hugely expensive to insure. Which in extreme cases will be “zero”.
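Harford’s figure is just an expected-loss calculation: a risk-based premium cannot fall much below the annual probability of flooding multiplied by the cost of a flood. Below is a minimal sketch of that arithmetic; the one-in-five chance and £50,000 damage are Harford’s own example, while the expense loading is an assumption of mine for illustration.

```python
def risk_based_premium(annual_flood_prob: float, damage_cost: float,
                       loading: float = 0.0) -> float:
    """Floor for a risk-reflective flood premium: expected annual loss plus
    an optional loading for the insurer's expenses, capital and uncertainty."""
    expected_loss = annual_flood_prob * damage_cost
    return expected_loss * (1 + loading)

# Harford's example: a one-in-five chance of a £50,000 flood
print(risk_based_premium(1 / 5, 50_000))               # 10000.0 -- "don't expect to pay less"
# With a hypothetical 30% loading for expenses and capital
print(risk_based_premium(1 / 5, 50_000, loading=0.3))  # 13000.0
```

Expensive, but it is still a price, which is exactly Harford’s distinction between unaffordability and uninsurability.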

The charity the Joseph Rowntree Foundation, in its March 2012 Viewpoint report entitled “Social Justice and the Future of Flood Insurance”, describes Harford’s preferred approach as ‘individualist, risk-sensitive insurance’, as opposed to its own preferred approach of ‘solidaristic, risk-insensitive insurance’.

Personally, I prefer to call a spade a spade: what we are talking about here is the socialization of risk, or alternatively the socialization of insurance-related losses. The word ‘solidaristic’ just looks like a euphemism to me. But that is not to be judgemental. The right as well as the left appear desperate to keep the word ‘socialization’ hidden in the closet. For example, when the well-heeled Thames Valley occupants of river-front properties write to their local Conservative Party MPs to complain about their treatment at the hands of their insurers, I am sure the letters will contain few passages explicitly calling for the socialization of risk, even though that is what they want. So ‘individualist, risk-sensitive insurance’ has few natural supporters among home-owners.

The Rowntree Foundation, however, goes on to define three types of fairness: 1) actuarial (espoused by Tim Harford and Mr. Market), 2) choice-based, and 3) social justice.

The choice-based distinction appears a bit of a diversion to me: few people consciously choose to buy a house with the potential for repeated episodes of flooding. Perhaps, as with all things climate change, many people choose not to know by failing to deepen their knowledge of the subject, but this seems an impractical basis on which to decide whether any individual property on a flood-prone street gets insurance cover or not. Further, we must always remember the terrifying speed with which climate change has appeared on the horizon as a major social- and economic-policy issue. The Intergovernmental Panel on Climate Change (IPCC) was only created in 1988, and the issue has still not fully made the transition from academia to the general public. So I think few people have knowingly decided to take their chances with climate change when it comes to flood risk.

So given the foundation’s mandate to support society’s disadvantaged, it is not surprising that social-justice fairness is proposed as the governing philosophy, and ‘solidaristic, risk-insensitive’ insurance as the preferred option. Whether the Labour Party will adopt the same approach we do not know. To date, the only utterances I have heard from them on the subject are criticisms of the ruling coalition government for prevarication in reaching an agreement with the ABI. Ironically, a Labour Party victory may be Mr. Angry of Pangbourne on Thames’ best bet for getting insurance cover reinstated at a reasonable cost. But even if the Labour Party does adopt the flood-risk socialization principle (or its euphemism-laden equivalent), I still see trouble ahead.

The trouble comes from the degree to which climate change will distort the insurance companies’ frequency and severity probability distributions as we go out beyond a year or two. Unfortunately, data in this area appears sorely lacking. The best data we have comes from the Climate Change Risk Assessment (CCRA) published by the Department for Environment, Food and Rural Affairs (Defra) in January 2012. (Note that the CCRAs will come out on a five-year cycle, so we will have no update until 2017.)

The CCRA shows the total number of residential properties in England and Wales subject to ‘significant’ risk of flooding rising from 370,000 today (actually as of the last NaFRA, in 2008) to between 700,000 and 1,160,000 by the 2020s:

[Figure: Total properties at ‘significant’ flood risk, CCRA]

You can see these numbers graphically below:

[Figure: Number of properties at significant risk of river or tidal flooding]

Accordingly, the expected annual damage (EAD) will likely rise from £640 million now to between £750 million and £1.6 billion (all at current prices) in the 2020s. However, these numbers have a plethora of holes. First, they cover river and tidal flooding only. The CCRA is quite clear about this:

These figures cover tidal and river flooding, but not surface water flooding….

….There are estimated to be more properties exposed to the risk of surface water flooding than river and tidal flooding. There is therefore an urgent need to develop projections of future surface water flood risk for the next CCRA.

Second, these numbers are based on the 2008 UK Climate Projections, which in turn rest on a whole range of assumptions, some realistic, some not. The most important weakness, to me, is the sea level rise estimate:

[Figure: UK sea level rise estimates]

As with the Intergovernmental Panel on Climate Change’s Fourth Assessment Report (AR4), the analysis excludes dynamic ice sheet melt effects (Greenland and the Antarctic). As a result, the next IPCC report is likely to see a substantial hike in its sea level rise estimates as such dynamic impacts are included, and this will worsen the outlook for coastal flooding in the U.K. The next UK Climate Projections will almost inevitably show sea level rise estimates going up.

So while it is very difficult to put an exact time and amount on it, we appear destined to see annual damages from flooding exceed £1 billion before too long. Moreover, spikes are soon likely to exceed the £5 billion of damages seen in 2007 (the insurance bill was a lower £3 billion, since not all properties were covered by insurance; moreover, even those home-owners who were covered had to stump up their policy excesses).

What part, then, of those £1 billion average losses, or £5 billion-plus spikes, will society decide to ‘socialize’? I think the ABI research brief dated January 2011 that I referred to in my previous post on flood risk gives us some tentative answers. According to the ABI’s numbers, at the time of the survey the average cost of home insurance was £220. Those within the ‘significant risk’ category paid slightly more at £260, but the ABI estimates that the true risk-adjusted premium should have been £690. The £430 difference is the current socialization of risk—a risk transfer from ‘significant’ flood risk home-owners to non-‘significant’ risk home-owners.
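The arithmetic of that cross-subsidy is worth making explicit. Below is a minimal sketch: the per-policy premiums are the ABI figures quoted above, while scaling up by the NaFRA count of at-risk properties is my own illustrative extrapolation, not an ABI estimate.

```python
# ABI research brief (January 2011) premiums quoted above, in pounds per year
average_premium = 220           # nationwide average home insurance premium
significant_risk_premium = 260  # what 'significant'-risk households actually paid
risk_adjusted_premium = 690     # ABI estimate of the true risk-reflective premium

subsidy_per_policy = risk_adjusted_premium - significant_risk_premium
print(f"Premium paid above the national average: £{significant_risk_premium - average_premium}")  # £40
print(f"Cross-subsidy per 'significant'-risk policy: £{subsidy_per_policy}")                       # £430

# Illustrative extrapolation only: scale by the NaFRA 2008 count of 330,000
# properties at 'significant' risk in England (my assumption, not an ABI figure).
properties_at_significant_risk = 330_000
total_transfer = subsidy_per_policy * properties_at_significant_risk
print(f"Implied annual transfer: £{total_transfer / 1e6:.0f} million")  # about £142 million
```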
For holders of buildings and contents policies, ‘significant’ risk home-owners pay £340 against £282 for the average policy nationwide. Among these policy holders, most of the risk is held within the high-risk tail; that is, those properties whose chance of flooding is a lot greater than one in 75 and whose likely depth of flooding is measured in tens of centimetres. This can be seen clearly in the ABI table below:

[Table: ABI estimates of the under-pricing of flood risk]

For this reason, even if we do migrate to a ‘solidaristic, risk-insensitive’ flood insurance regime as advocated by the Joseph Rowntree Foundation, I still think it inevitable that the high-risk tail will get chopped off as climate change progresses.

Home-owners therefore need to take a long, hard look at their properties before the flood risk chickens come home to roost. First, they need to know whether a property is already in the one-in-75-year ‘significant’ risk category once surface water and groundwater are taken into account. The answer to this won’t be found by consulting the Environment Agency’s flood map.

Second, owners need to ask themselves whether their property could migrate into the high end of the one-in-75 category due to climate change. If the home enters the one-in-10 category, then I think flood insurance will ultimately cease to be available whatever new agreement is hashed out between the government and the ABI. And that, in turn, means the property will not be able to act as collateral for a mortgage loan.

Third, if a property does get flooded, how will Mr. Market value it in the future? Stuart Clarke, Principal Engineer of West Berkshire Council, sounded one positive note in his talk at Pangbourne (the talk with which I started this series of posts). According to one study he quoted, the value of properties falls by 12% after a flooding event, but recovers to the market level after five years.

Unfortunately, a review of the literature by the Adam Smith Research Foundation (which sounds like an ‘Astroturf’ libertarian think tank, but in reality is a legitimate academic outpost of Glasgow University) finds such backward-looking analysis of how flooding affects housing values to be all but useless. In short, the ‘bounce-back’ house price effect found in the housing economics literature is dying and will soon be dead. I will quote the abstract:

This paper argues that major gaps exist in the research and policy understanding  of the intersection of flood risk, climate change and housing markets. When extrapolating the research on historical flooding to the effects of future floods – the frequency and severity of which are likely to be affected by global warming – housing economists must be careful to avoid a number of methodological fallacies: (a) The Fallacy of Replication, (b) The Fallacy of Composition, (c) The Fallacy of Linear Scaling, and (d) The Fallacy of Isolated Impacts. We argue that, once these are taken into account, the potential magnitude and complexity of future flood impacts on house prices could be considerably greater than existing research might suggest. A step change is needed in theory and methods if housing economists are to make plausible connections with long-term climate projections.

This all sounds very technical, but in actual fact it isn’t. I urge anyone who thinks they may own a property in the one-in-75-year ‘significant’ risk category, or worse, to read this report. The ‘Four Fallacies’ deserve a quick exposition at least:

a) The Fallacy of Replication: “Properties that currently experience floods are of type x and not type y. Therefore, properties that experience floods in the future will also be of type x, and not type y.”

As an example, the well-heeled owner of a £2 million Thames-side property already knows about flood risk and has defended against it. Further, the property will always retain an amenity value that only a river-front property, with its charming boat house, can provide. The properties 50 metres back are, in effect, subject to ‘virgin risk’; they are sub-prime properties relative to the river-front prime, and are not set up to absorb the impact of a full-blown flooding event that extends beyond the river front (which will inevitably happen).

b) The Fallacy of Composition: “Significant financial safety nets are viable if a single area is flooded. Therefore, significant financial safety nets will be viable if all areas are flooded.”

This has been a recurrent theme of this entire blog series. We have had a system priced off a long-lasting status quo. And now the ABI says the status quo cannot continue. In the words of the Adam Smith Research Foundation:

Existing major UK floods occurred in a financial environment that ensures prospective buyers of flood-liable properties will be able to obtain both insurance and mortgage finance. It is perhaps not surprising that price effects have so far been negligible and temporary. However, once this regime starts to break down, the price effects of floods could be marked, most notably in those flood-prone neighbourhoods abandoned by insurers and mortgage lenders.

c) The Fallacy of Linear Scaling: “The impact of a flood of severity y, is of magnitude z. Therefore, the impact of a flood twice the severity of y, will be twice the magnitude of z.”

The authors pick out three examples under this heading, but I will just highlight one, since I have a keen interest in the psychology of risk:

Humans have a tendency to underestimate risks that appear distant or global, or which others seem to accept without concern (Kousky & Zeckhauser, 2006). Recent studies indicate that buyers may not have full information about properties due to  high search costs and sellers’ unwillingness to reveal information about dis-amenities such as flood risk (Chivers & Flores, 2002; Lammond, 2009).

Disclosure of flood risk is therefore likely to decrease market value (Troy & Romm, 2004; Pope, 2008). While individuals may underestimate flood risk, this may not be true if floods become frequent and ubiquitous, because flood events raise people’s awareness of potential risk (Bin & Polasky, 2004).

Even in years when a particular dwelling is not flooded, the prevalence of flooding elsewhere, communicated via accounts in news reports and social dialogue, will act as constant reminders of the household’s flood risk. People will not forget because the climate and media will not permit them; and the bounce-back effect will become a thing of the past (Pryce et al., 2011).

d) The Fallacy of Isolated Impacts: “The price of house A is reduced because it is flooded. House B is not flooded and, therefore, its price will not be reduced, irrespective of its proximity to A.”

This is really just common sense. You only need one part of the neighbourhood to be impaired to drag down the value of the entire neighbourhood.
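Of the four, the Fallacy of Linear Scaling lends itself to a toy illustration. In the sketch below the depth-damage relationship is convex; the curve and its numbers are entirely hypothetical assumptions of mine, not taken from the report, but they show why a flood twice as deep can do far more than twice the damage, and hence why extrapolating linearly from the modest floods of the historical record can understate future losses.

```python
def flood_damage(depth_m: float, base_cost: float = 10_000, exponent: float = 1.8) -> float:
    """Hypothetical convex depth-damage curve: damage grows faster than depth."""
    return base_cost * depth_m ** exponent

shallow = flood_damage(0.5)   # roughly £2,900 for half a metre of water
deep = flood_damage(1.0)      # £10,000 for a full metre: twice the depth...
print(f"Scaling factor: {deep / shallow:.1f}x")  # ...but roughly 3.5x the damage
```

A real depth-damage curve would come from engineering and claims data; the only point here is the shape.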

The report ends with something that seems obvious to me, but I am amazed by how few have absorbed it:

Unfortunately, the housing economics literature on floods has so far developed largely under the assumption of climatic stability.

This applies to absolutely every single economic, financial and social relationship that has been forged in the modern era. The climate is undergoing a step change away from that of the Holocene—under which civilisation was founded, took root and spread—to the human-generated climate of the Anthropocene. The chart below, from Bart Verheggen’s blog, shows projected temperature change set against the background of temperatures over the last 20,000 years (back to the last ice age). The message for flood-prone home-owners: you ain’t seen nothing yet.

[Figure: Temperature anomaly over the last 20,000 years with projected change, via Bart Verheggen’s blog]

Can we put some numbers on the bad stuff that could happen to the price of flood-prone houses going forward? Well, the Adam Smith Research Foundation certainly doesn’t try. I would sum up their paper with these words: “we used to know some stuff about the value of houses after flooding, but now we don’t—and what we know we don’t know isn’t good”.

One other thing we do know is that when insurance companies know that they don’t know something, they will either not insure at all or insure only at vast cost. Personally, when it comes to flood risk, I don’t like the fact that I know I don’t know. So when buying a house, my first reaction on coming across the merest whiff of flood risk in this new era of the Anthropocene would be ‘don’t buy’. For an existing owner, the same logic applies: which means ‘sell’.

Finally, for those who love their houses—even if they are categorised as ‘significant’ risk—and wish to hand them down to their children, my advice would be to get active in lobbying the government to do more to prevent climate change. If you think that climate change doesn’t exist or is not your problem, then you must understand that Mr. Market disagrees.