Battery Banter 5: The Relevance (or Not) of Moore’s Law

Concurrently with writing this series of blog posts, I have been reading Steve LeVine’s newly published book “The Powerhouse: Inside the Invention of a Battery to Save the World”. The book is a bit of a mess, full of random jumps, wrong turns and dead ends. Perhaps that is appropriate, since it describes a battery development process that is itself full of random jumps, wrong turns and dead ends.

While the back cover blurb tells me that the book reads like a thriller, it is more like Sir Arthur Conan Doyle’s tale of the dog that didn’t bark. We have two questing groups of heroes: the public-sector Argonne National Laboratory battery guys and the plucky private-sector upstarts at Envia Systems. Yet the book peters out at the end, with both teams abjectly failing in their respective quests to find the super battery Holy Grail. Argonne’s new version of nickel manganese cobalt batteries (NMC 2.0) suffers from chronic voltage fade (meaning that the performance of the battery slumps after repeated recharging cycles). Meanwhile, Envia’s super battery is spectacularly flawed, based on a collapsing anode and dodgy intellectual property.

Despite being in need of a good edit, the book is still full of interesting insights into the battery development process. In a chapter recounting conversations with Don Hillebrand, an old-school auto expert working at Argonne, LeVine makes this observation:

Unlike microchips, batteries don’t adhere to a principle akin to Moore’s law, the rule of thumb that the number of switches on a chip–semiconductor efficiency–doubles every eighteen months. Batteries were comparatively slow to advance. But that did not make electronics superior to electric cars.

Consumer electronics typically wear out and require replacement every two or three years. They lock up, go on the fritz, and generally degrade. They are fragile when jostled or dropped and are often cheaper to replace than repair. If battery manufacturers and carmakers produced such mediocrity, they would be run out of business, sued for billions and perhaps even go to prison if anything catastrophic occurred. Automobiles have to last at least a decade and start every time. Their performance had to remain roughly the same throughout. They had to be safe while moving–or crashing–at high speed.

At this point, I want to refer you back to Gordon Moore’s original 1965 article, “Cramming more components onto integrated circuits”, which ushered in Moore’s Law. From this, we have the quintessential exponential chart, which delivers a straight line if you put the y-axis onto a logarithmic scale (click for larger image):

[Figure: chart from Moore’s 1965 paper]

This is the world of Ray Kurzweil’s singularity, which I blogged on a couple of years back in a post called “Singularity or Collapse: Part 1 (For Ever Exponential?)”. As knowledge increases by powers of 10, virtually every challenge faced by mankind dissolves.

The problem here is with the rate of exponential growth. If battery capacity and solar efficiency were increasing by orders of magnitude (powers of 10) every few years, then climate change and energy resource depletion would be solved. But it is a bit more complicated than that: technological transformation is really the product of two variables, scientific progress and price. The electronics industry performed its magic on both of these variables simultaneously. Further, the science was to a degree forgiving: the theoretical constraint governing the number of components that can be placed on a silicon wafer has only recently come into play.

For a solar cell, however, the constraint is visible on day one. In the UK, the average raw power of sunshine is 110 watts per square metre. Current solar panels are somewhere between 10% and 20% efficient. According to David MacKay in his wonderful treatise on all things energy, Sustainable Energy Without the Hot Air, the maximum theoretical efficiency is limited to 60% by fundamental physical laws (here). Therefore, you will never achieve more than about 66 watts per square metre from your solar panel, so the ability to keep doubling is capped. At best, you will achieve an S curve effect from a technological perspective alone.
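To make the arithmetic concrete, here is a back-of-the-envelope sketch (the insolation figure, the 10-20% efficiency range and the 60% cap are the numbers just quoted; the 15% mid-point is my own assumption):

```python
# Back-of-the-envelope solar ceiling calculation, using the figures quoted above.
import math

insolation_w_per_m2 = 110   # average raw power of UK sunshine, watts per square metre
current_efficiency = 0.15   # assumed mid-point of the 10-20% range for current panels
max_efficiency = 0.60       # approximate theoretical limit cited from MacKay

current_output = insolation_w_per_m2 * current_efficiency  # ~16.5 W per square metre
ceiling_output = insolation_w_per_m2 * max_efficiency      # ~66 W per square metre

# How many doublings of efficiency are even possible before the ceiling bites?
doublings_left = math.log2(max_efficiency / current_efficiency)

print(f"Current output:      {current_output:.1f} W/m2")
print(f"Theoretical ceiling: {ceiling_output:.0f} W/m2")
print(f"Efficiency doublings still available: {doublings_left:.1f}")  # exactly 2 from a 15% start
```

Two doublings, and then the physics calls time.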

[Figure: S curve]

Of course, the second variable is price. In theory, once the solar PV panel reaches the top of the S curve above, the price could keep on falling even as efficiency stalls. So while your panels aren’t getting any better, you could plausibly plaster the world with them if they were dirt cheap. But this is a different dynamic from that of silicon chips, which benefited from going up an efficiency curve and down a cost curve at the same time.

And this takes us back to batteries and an article at the Bulletin of the Atomic Scientists by Kurt Zenz House titled “The limits of energy storage technology”. For this series of posts on batteries, I have been working in units of kilowatt-hours (kWh), but the Bulletin article works in units of megajoules (MJ). In case you want to follow my link to the Bulletin article, note that 3.6 MJ equals 1 kWh. Consequently, a top-of-the-range Tesla has a battery capable of storing roughly 306 MJ (85 kWh), the Nissan Leaf 86 MJ (24 kWh) and the BMW i3 68 MJ (19 kWh).
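If you want to check the conversion, it is just a multiplication by 3.6; a quick sketch using the battery capacities quoted above:

```python
# kWh to MJ conversion for the battery capacities quoted above.
MJ_PER_KWH = 3.6

battery_capacity_kwh = {
    "Tesla (top of the range)": 85,
    "Nissan Leaf": 24,
    "BMW i3": 19,
}

for car, kwh in battery_capacity_kwh.items():
    print(f"{car}: {kwh} kWh = {kwh * MJ_PER_KWH:.0f} MJ")
```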

Crucially, House points out that a kilogram of crude oil contains 50 MJ of potential chemical energy, equivalent to almost 14 kWh, not much less than the entire BMW i3 battery, which weighs in at 230 kg. Of course, internal combustion engines aren’t particularly efficient: in a petrol engine only around 20% of the energy makes it from fuel to wheels (the rest is lost as heat), while for an EV the figure is around 90%. So our BMW has a battery capacity of 80 watt-hours per kg (roughly 0.3 MJ per kg), of which 72 watt-hours gets to the wheels, while its conventional cousin translates 1 kg of petrol into perhaps 2,800 watt-hours of motion–so 40 times better.
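The ’40 times better’ figure falls straight out of those numbers; here is a rough sketch of the calculation (all inputs are the rounded figures quoted above):

```python
# Fuel-to-wheels comparison per kilogram, using the rounded figures quoted above.
MJ_PER_KWH = 3.6

# Petrol through an internal combustion engine
oil_mj_per_kg = 50
oil_wh_per_kg = oil_mj_per_kg / MJ_PER_KWH * 1000             # ~13,900 Wh per kg of fuel
ice_fuel_to_wheels = 0.20                                     # ~20% of the energy reaches the wheels
petrol_wheels_wh_per_kg = oil_wh_per_kg * ice_fuel_to_wheels  # ~2,800 Wh per kg

# BMW i3 battery
battery_wh_per_kg = 80                                        # 19 kWh / 230 kg, rounded as above
ev_battery_to_wheels = 0.90                                   # ~90% of the energy reaches the wheels
ev_wheels_wh_per_kg = battery_wh_per_kg * ev_battery_to_wheels  # ~72 Wh per kg

ratio = petrol_wheels_wh_per_kg / ev_wheels_wh_per_kg
print(f"Petrol, at the wheels:     {petrol_wheels_wh_per_kg:,.0f} Wh per kg of fuel")
print(f"EV battery, at the wheels: {ev_wheels_wh_per_kg:,.0f} Wh per kg of battery")
print(f"Advantage to petrol: roughly {ratio:.0f}x")           # ~39x, i.e. the '40 times' above
```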

But what of the theoretical limits posed by battery chemistry as technology improves? House has this to say:

Due to the theoretical limits of lead-acid batteries, there has been serious work on other approaches such as lithium-ion batteries, which usually involve the oxidation and reduction of carbon and a transition metal such as cobalt. These batteries have already improved upon the energy density of lead-acid batteries by a factor of about 6 to around 0.5 mega-joules per kilogram–a great improvement. But as currently designed, they have a theoretical energy density limit of about 2 mega-joules per kilogram. And if research regarding the substitution of silicon for carbon in the anodes is realized in a practical way, then the theoretical limit on lithium-ion batteries might break 3 mega-joules per kilogram.

Therefore, the maximum theoretical potential of advanced lithium-ion batteries that haven’t been demonstrated to work yet is still only about 6 percent of crude oil!

To restate the above numbers, 0.5 MJ per kg equals roughly 140 watt-hours per kg, 2 MJ is about 0.56 kWh and 3 MJ is about 0.83 kWh.

But what about some ultra-advanced lithium battery that uses lighter elements than cobalt and carbon? Without considering the practicality of building such a battery, we can look at the periodic table and pick out the lightest elements with multiple oxidation states that do form compounds. This thought experiment turns up hydrogen-scandium compounds. Assuming that we could actually make such a battery, its theoretical limit would be around 5 megajoules per kilogram.

And this brings us back to Ray Kurzweil’s parable of the doubling lily pads in the lake:

A lake owner wants to stay at home to tend to the lake’s fish and make certain that the lake itself will not become covered with lily pads, which are said to double their number every few days. Month after month, he patiently waits, yet only tiny patches of lily pads can be discerned, and they don’t seem to be expanding in any noticeable way. With the lily pads covering less than 1 percent of the lake, the owner figures that it’s safe to take a vacation and leaves with his family. When he returns a few weeks later, he’s shocked to discover that the entire lake has become covered with the pads, and his fish have perished. By doubling their number every few days, the last seven doublings were sufficient to extend the pads’ coverage to the entire lake. (Seven doublings extended their reach 128-fold.) This is the nature of exponential growth.

So how many doublings does it take to get us from your grandfather’s 0.1 MJ per kg lead-acid battery to an ultra-advanced theoretical lithium-ion battery at 5 MJ per kg? Answer: six. In this lake, the owner can not only take a few weeks’ vacation but a whole year’s sabbatical and not worry about his fish: the lily pads will still be safely confined to one corner. Gordon Moore’s lake, however, will have witnessed around 30 doublings.
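The doubling counts are easy to check (the 0.1 and 5 MJ per kg endpoints are from the text above; for the Moore’s Law side I assume a doubling roughly every eighteen months since the 1965 paper):

```python
import math

# Battery energy density: grandfather's lead-acid vs a theoretical advanced lithium battery
lead_acid_mj_per_kg = 0.1
theoretical_limit_mj_per_kg = 5.0

battery_doublings = math.log2(theoretical_limit_mj_per_kg / lead_acid_mj_per_kg)
print(f"Battery doublings available: {battery_doublings:.1f}")  # ~5.6, i.e. six doublings

# For comparison, transistor counts doubling roughly every 18 months since 1965
chip_doublings = (2015 - 1965) / 1.5
print(f"Moore's Law doublings over the same period: ~{chip_doublings:.0f}")  # ~33, call it 30-odd
```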

Technological efficiency, however, is only one of the two variables that define a disruptive technology, the other being cost, as I mentioned above. In many ways, it is the cost variable that has been the key focus of the approach taken by Elon Musk’s Tesla when it comes to battery design. Indeed, the battery technology is relatively old school, using a nickel-cobalt-aluminium combination. This accounts for the battery’s stunning 550 kg total weight. Preliminary reports on Tesla’s Gigafactory for battery fabrication suggest it will rely principally on efficiency of design rather than cutting-edge new battery chemistry. But with technological change only in first gear, it will be hard to achieve integrated circuit-style disruptive change. Cost-cutting design can only go so far.

At this point, I will conclude this series of posts for fear that I have turned into the ‘Battery Blog’ (although I will return to batteries at a future date, as they are so important). But I will finish with the famous ‘rule of 72’. If you remember, divide 72 by a given annual growth rate (in per cent) to get the approximate number of years needed for a doubling. It works pretty well (click for larger image):

[Figure: Rule of 72]

Tony Seba, Ray Kurzweil and other assorted techno-cornucopians achieve almost instant doublings by assuming growth rates in the high teens or better. Unfortunately, much science progresses in the low to mid single digits, so change is measured in decades–not years.
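A minimal sketch of the rule against the exact compound-growth answer, using a few illustrative growth rates:

```python
import math

def doubling_time_rule_of_72(growth_pct):
    """Approximate years needed to double at a given annual growth rate (rule of 72)."""
    return 72 / growth_pct

def doubling_time_exact(growth_pct):
    """Exact years needed to double under compound growth."""
    return math.log(2) / math.log(1 + growth_pct / 100)

# Illustrative rates: low single digits vs techno-cornucopian high teens and better
for rate in (3, 5, 8, 18, 25):
    approx = doubling_time_rule_of_72(rate)
    exact = doubling_time_exact(rate)
    print(f"{rate:>2}% per year: rule of 72 says {approx:4.1f} years, exact is {exact:4.1f} years")
```

At 3% a year a doubling takes the best part of a quarter-century; at 25% it takes about three years.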

The distinction is important. Under the Kurzweil logic, we don’t really need to tackle climate change or resource depletion because technology is on the case. Just go about your business as usual, tuck up your kids in bed at night, and scientific innovation will do the rest.

But unless Argonne Laboratory’s battery guys and their peers step up the pace (which looks extremely difficult), electric vehicles will not replace conventional internal combustion engines for a couple of decades or more. That translates into no natural near-term carbon emission mitigation in the field of motor transport. And unless we get very lucky with climate sensitivity to CO2, it also means we will get a lot closer to exceedingly dangerous climate change.

Sorry, this also means that a ‘do nothing’ political position at both a national and personal level won’t cut it when it comes to climate change.

Battery Banter 4: Could the Grid Cope with a Next Generation EV?

In my last series of posts, I focussed on the war of attrition between electric vehicles (EVs) and traditional internal combustion engine vehicles (ICEs). Due to the recent slump in oil prices, EVs are on the defensive. They need increased volume to get down their cost curves and punch out of their current redoubt of super cars (Tesla) and green credential statement cars (Nissan Leaf). Low gasoline prices have made such an offensive a lot more tricky to pull off.

But let us suppose that a commercial super battery were to emerge that had high energy density and was cheap. What would happen next? Let’s run this thought experiment in a UK context.

First, let’s look at the UK’s existing fleet. Great Britain has a population of 64 million people, who between them drive around 29 million registered cars (source: here, click for larger image).

[Figure: Registered cars, UK]

And annually each car is driven for an average of 8,000 miles, which translates into 22 miles per day (click for larger image; also remember we are smoothing out weekends and holidays).

[Figure: Annual average miles travelled]

From a previous blog post, I republish the following chart, which shows the kind of mileage per kilowatt-hour (kWh) a battery achieves at present.

[Figure: EV range per kWh]

Currently, the BMW i3 achieves around 5 miles per kWh. However, current-generation EVs spend an awful lot of energy lugging around bloody great big batteries. With a super battery, like the lithium-air (Li/O2) batteries in the chart below (see my last post), you get four times as much energy for a given weight. Let’s suppose that the automakers double the battery capacity to get the required 200-mile range, but still halve the battery weight. Throw in even more use of modern materials and it is not unrealistic to guesstimate that our future car would achieve 10 miles per kWh.

[Figure: Battery technology]

Using these numbers, 22 miles translates into 2.2 kWh per car per day. Next, we find the average number of cars per household in the UK, which is 1.1 (here, click for larger image). So we are looking at an EV energy expenditure of about 2.4 kWh per household per day.

[Figure: Average number of cars per household]

Meanwhile, average domestic electricity consumption per household in the UK is around 4,200 kWh per year, which works out at 11.5 kWh per day (here, click for larger image).

[Figure: Annual average electricity consumption]

We are now in a position to compare the daily EV energy expenditure of our hypothetical future household with current electricity consumption. In short, expending 2.4 kWh per day on the future EV will raise electricity consumption by 21% from the current level of 11.5 kWh. While that is a lot, it is not nearly as much as I would have originally thought.
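For anyone who wants to check the chain of numbers, here is a rough sketch (all the inputs are the figures quoted above; the 10 miles per kWh is the guesstimate for a next-generation EV):

```python
# Back-of-the-envelope UK household EV demand, using the figures quoted above.
miles_per_year_per_car = 8_000
miles_per_day_per_car = miles_per_year_per_car / 365            # ~22 miles
ev_miles_per_kwh = 10                                           # guesstimate for a next-gen EV

kwh_per_day_per_car = miles_per_day_per_car / ev_miles_per_kwh  # ~2.2 kWh
cars_per_household = 1.1
ev_kwh_per_household = kwh_per_day_per_car * cars_per_household # ~2.4 kWh per day

household_kwh_per_year = 4_200
household_kwh_per_day = household_kwh_per_year / 365            # ~11.5 kWh

uplift = ev_kwh_per_household / household_kwh_per_day
print(f"EV demand per household:      {ev_kwh_per_household:.1f} kWh/day")
print(f"Existing household demand:    {household_kwh_per_day:.1f} kWh/day")
print(f"Increase in electricity use: ~{uplift:.0%}")            # ~21%
```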

Battery Banter 3: Gasoline’s Dastardly Energy Density

In my last post, I talked about the challenge that low oil prices pose for the electric vehicle industry. The following chart from a 2012 McKinsey battery study shows the key tipping points (click for larger image):

[Figure: McKinsey battery study]

With US gasoline (petrol) prices currently running at $2.50 per gallon, we are falling into the bottom left corner of the chart. In short, the battery price for battery electric vehicles (BEVs in the chart) must plummet to keep EVs in the game. As stated yesterday, Nissan and Tesla are getting their battery costs down to around $300 per kilowatt-hour (kWh), but this is still far above the current sweet spot of $150-$200.

Previously, I also talked about the ‘learning rate’: the rate at which battery prices fall, through accumulated manufacturing experience, for every doubling of production volume. The industry is in the ‘Catch-22’ position of not being able to crank up volume sufficiently to get down its cost curve, since EVs are just too far adrift from internal combustion engine vehicles price-wise to secure volume sales. So what is to be done?
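As a rough illustration of how a learning rate compounds (the 15% rate per doubling and the $300 per kWh starting point are illustrative assumptions for this sketch, not figures from the studies mentioned in these posts):

```python
# Illustrative learning-curve sketch: cost falls by a fixed fraction for every
# doubling of cumulative production volume. The 15% learning rate and $300/kWh
# starting cost are assumptions for illustration only.
def cost_after_doublings(initial_cost, learning_rate, doublings):
    return initial_cost * (1 - learning_rate) ** doublings

start_cost_per_kwh = 300  # assumed current battery pack cost, $/kWh
learning_rate = 0.15      # assumed cost reduction per doubling of cumulative volume

for d in range(5):
    cost = cost_after_doublings(start_cost_per_kwh, learning_rate, d)
    print(f"After {d} doublings of cumulative volume: ${cost:.0f}/kWh")
```

On those illustrative numbers, it takes roughly four doublings of cumulative volume to get from $300 down to the $150-200 sweet spot, which is precisely the volume the industry is struggling to generate.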

What would break this logjam is if the auto battery industry could make the next technological leap. The problem for batteries is that oil is so damn energy-dense. A litre of gasoline (petrol) can deliver 10 kWh of energy; the Nissan Leaf battery holds, per litre, only a hundredth of that. As the chart below shows, even the top-of-the-line Tesla battery is far inferior (source: here; click for larger image).

[Figure: Battery technology]

Once the next generation of batteries arrives, however, things will get more interesting. The irony of both traditional vehicles and EVs is that not much energy is actually used to move humans. For current cars, most gasoline is burnt in order to carry a heavy internal combustion engine around; for EVs, the energy is used to transport the battery. Nonetheless, the energy density of gasoline means that traditional cars get the better of EVs in this particular trade-off. But once a new generation of batteries arrives, EVs can push into the top right-hand corner of the chart above. At that point things will change dramatically–a transition that I will tackle in my next post.

Battery Banter 2: Sliding Down the Electric Vehicle Cost Curve

With impeccable timing (for my current blogging theme), Nature Climate Change has just published a commentary by Bjorn Nykvist and Mans Nilsson reviewing the falling cost of battery packs for electric vehicles (source: here, but apologies as the article is behind a paywall). Bottom line: costs have been falling faster than predicted a few years ago (click for larger image).

[Figure: Battery electric vehicle costs]

In line with Tony Seba’s estimates I blogged on two days ago (here), Nykvist and Nilsson saw total battery pack costs fall by about 14% per annum between 2007 and 2014, from above $1,000 per kilowatt-hour (kWh) to around $410. The market leaders in auto battery technology, Tesla and Nissan, saw a slightly lower rate of decline of 6 to 9%, since they have been at the cutting edge of improvements and have had less potential for catch-up than the industry as a whole. However, their costs are now seen at around $300 per kWh of battery capacity. Note that a BMW i3 has a battery capacity of approximately 19 kWh, a Nissan Leaf 24 kWh and a top-of-the-range Tesla 85 kWh.
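As a quick sanity check on that rate of decline, the compound annual fall implied by the two round endpoints can be sketched as follows (a back-of-the-envelope check, not the paper’s own regression):

```python
# Rough check on the compound annual decline implied by the round endpoints quoted above.
start_cost, end_cost = 1_000, 410   # $/kWh, approximate 2007 and 2014 figures
years = 2014 - 2007

implied_annual_decline = 1 - (end_cost / start_cost) ** (1 / years)
print(f"Implied annual decline: {implied_annual_decline:.1%}")  # ~12% from these round numbers
```

That comes out a little below the reported ~14%, which presumably reflects a fit across the full set of cost estimates rather than just two round endpoints, but the order of magnitude is the same.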

Battery Banter 1: Are Internal Combustion Engines Going the Way of the Horse?

A few days ago, a good friend of mine pointed me toward a presentation on disruptive technologies given by Tony Seba. A YouTube video is available here:

The entire video is worth watching, but today I will restrict myself to the issues he raises relating to battery technology.

Seba stresses that technological change in the transport sector could happen at breakneck speed. With a pair of compelling photos of early-last-century New York, we are asked to remember that a grand disruption in transport has happened before. In the first photo, dating from April 1900, we play a game of spot the car (click for larger image).

[Figure: Where is the car?]

In the second, a mere 13 years later, the challenge is to spot the horse.

[Figure: Where is the horse?]

The lesson here is that once a disruptive technology reaches a particular tipping point, it doesn’t just take market share from the incumbent industry but rather completely replaces it. For Seba, we are close to reaching that point with electric vehicles.


Charts du Jour, 18 March 2015: Shale and Seneca’s Cliff

In the words of the Roman philosopher Seneca:

Increases are of sluggish growth, but the way to ruin is rapid

Lucius Annaeus Seneca was musing on the accelerated rate of decline and fall of empires a couple of thousand years ago. The chemist and scholar of the post-growth world Ugo Bardi has borrowed the philosopher’s name for his idea of a Seneca Cliff–the precipice over which our complex society will likely (according to him) tip and fall.

While such ideas gained considerable traction a few years ago (fanned by rocketing fossil fuel prices and the impact of the Great Recession), they are now deeply out of fashion. Doesn’t Bardi know that we live in an age of abundance? Or so the shale oil and gas story goes.

Befitting the name of his blog, Bardi remains a committed Cassandra, warning all those who will listen. To my shale oil production chart of yesterday, Bardi responds with this first (all is well in the world of cod):

[Figure: Cod landings]

And then this (perhaps it was not as well as it seemed):

[Figure: US cod landings, latest]

Bardi’s full blog post on this theme is here. But does the argument “so goes cod, so will go shale” hold true?

This is certainly the view of the geoscientist J. David Hughes, who maintains a website called “shalebubble.org”. On it, you will find a number of Hughes’ reports published under the imprint of the Post Carbon Institute, the latest going under the title “Drilling Deeper”. The full report is 300 pages long, but Hughes concludes that the US Energy Information Administration has built its production forecast on the back of three false premises. Further, based on these, the US economy has taken as truisms a series of false promises (click for larger image).

[Figure: False premises and promises]

Should Hughes’ analysis be correct, then Seneca’s Cliff may beckon. Within a decade we will know one way or another. Never forget: Cassandra was proved right in the end.

Charts du Jour, 17 March 2015: Pump Baby Pump (but Don’t Drill)

I regularly report on the Energy Information Administration’s monthly US oil production statistics, which show no slowdown in output as yet (see here for the latest numbers). Bloomberg, however, has a series of multimedia offerings giving more colour as to what is going on.

First, a nice chart juxtaposing production and rig count numbers (source: here).

[Figure: Active oil rigs]

And for a great animated graphic showing rig count through time and space, this offering (again from Bloomberg) is superb. Below is my screenshot, but to get the full effect click the link here.

[Figure: Watch Four Years (screenshot)]

Finally, an animation explaining why the crashing rig count has yet to stop production rising. In Bloomberg’s view, the divergence between rig count and production has many months to run.

National Geographic recently had an article titled “How Long Can the US Oil Boom Last?”, which emphasises the longer view. They argue that the US fracking boom is a multi-year phenomenon, not a multi-decade one.

But in the long term, the U.S. oil boom faces an even more serious constraint: Though daily production now rivals Saudi Arabia’s, it’s coming from underground reserves that are a small fraction of the ones in the Middle East.

Both the EIA and the International Energy Agency see US oil production peaking by the end of the decade regardless of short-term oil price fluctuations. Nonetheless, both organisations have underestimated the upswing in tight oil production to date. Overall, it is very difficult to gauge where US production will be in five years’ time. This is a bigger story than the current spectacular rig count crash, and one I intend to return to in future posts.