
Climate ‘uncertainty’ is no excuse for climate inaction





Former environment minister Owen Paterson has called for the UK to scrap its climate change targets.

In a speech to the Global Warming Policy Foundation, he cited “considerable uncertainty” over the impact of carbon emissions on global warming – a line that was displayed prominently in coverage by the Telegraph and the Daily Mail.

Paterson is far from alone: climate change debate has been suffused with appeals to ‘uncertainty’ to delay policy action. Who hasn’t heard politicians or media personalities use uncertainty associated with some aspects of climate change to claim that the science is ‘not settled’?

Over in the US, this sort of thinking pops up quite often in the opinion pages of The Wall Street Journal. Its most recent article, by Professor Judith Curry, concludes that the ostensibly slowed rate of recent warming gives us “more time to find ways to decarbonise the economy affordably.”

What we do know – in spite of ‘uncertainty’

At first glance, avoiding interference with the global economy may seem advisable when there is uncertainty about the future rate of warming or the severity of its consequences.

But delaying action because the facts are presumed to be unreliable reflects a misunderstanding of the science of uncertainty.

Simply because a crucial parameter such as the climate system’s sensitivity to greenhouse gas emissions is expressed as a range – for example, that under some emissions scenarios we will experience 2.6°C to 4.8°C of global warming or 0.3m to 1.7m of sea level rise by 2100 – does not mean that the underlying science is poorly understood. We are very confident that temperatures and sea levels will rise by a considerable amount.

Perhaps more importantly, just because some aspects of climate change are difficult to predict (will your county experience more intense floods in a warmer world, or will the floods occur down the road?) does not negate our wider understanding of the climate.

We can’t yet predict the floods of the future but we do know that precipitation will be more intense because more water will be stored in the atmosphere on a warmer planet.

Uncertainty may be embedded deep within science, but it is no one’s friend and should be minimised to the greatest extent possible. It is an impetus for mitigative action rather than a reason for complacency.

Uncertainty means more risk – not less

There are three key aspects of scientific uncertainty surrounding climate change projections that exacerbate rather than ameliorate the risks to our future.

First, uncertainty has an asymmetrical effect on many climatic quantities. For example, a quantity known as Earth system sensitivity, which tells us how much the planet warms for each doubling of atmospheric carbon dioxide concentration, has been estimated to be between 1.5°C and 4.5°C.

However, it is highly unlikely, given the well-established understanding of how carbon dioxide absorbs long-wave radiation, that this value can be below 1°C. There is a possibility, however, that sensitivity could be higher than 4.5°C.

For fundamental mathematical reasons, the uncertainty favours greater, rather than smaller, climate impacts than a simple range suggests.
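That asymmetry can be illustrated with a short calculation. The sketch below models sensitivity with a right-skewed lognormal distribution purely for illustration; the distribution and its parameters are assumptions, not published estimates.

```python
import math
import random

# Illustrative assumption: model climate sensitivity as a right-skewed
# lognormal distribution roughly spanning the 1.5-4.5 degC range, with a
# hard physical floor near 1 degC but a long upper tail.
random.seed(0)
mu, sigma = math.log(2.6), 0.35   # assumed parameters: median ~2.6 degC
samples = [random.lognormvariate(mu, sigma) for _ in range(100_000)]

mean = sum(samples) / len(samples)
median = sorted(samples)[len(samples) // 2]

# For any right-skewed distribution the mean exceeds the median: values
# far above the median pull the expected impact upwards, while the lower
# bound limits how far values can fall.
print(f"median sensitivity: {median:.2f} degC")
print(f"mean sensitivity:   {mean:.2f} degC  (mean > median)")
```

The point of the sketch is simply that when the tail extends upwards but not downwards, the expected impact sits above the most likely value.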

Uncertainty also makes adaptation harder

Second, the uncertainty in our projections makes adaptation to climate change more expensive and challenging. Suppose we need to build flood defences for a coastal English town.

If we could forecast a 1m sea level rise by 2100 without any uncertainty, the town could confidently build flood barriers 1m higher than they are today. However, although sea levels are most likely to rise by about 1m, we’re really looking at a range between 0.3m and 1.7m.

Therefore, flood defences must be at least 1.7m higher than today – 70cm higher than they could be in the absence of uncertainty. And as uncertainty increases, so does the required height of flood defences for non-negotiable mathematical reasons.

And the problem doesn’t end there, as there is further uncertainty in forecasts of rainfall occurrence, intensity and storm surges. This could ultimately mandate a 2 to 3m-high flood defence to stay on the safe side, even if the most likely prediction is for only a 1m sea-level rise.

Even then, as most uncertainty ranges are for 95% confidence, there is a 5% chance that those walls would still be too low. Maybe a town is willing to accept a 5% chance of a breach, but a nuclear power station cannot take such risks.
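The flood-defence arithmetic above can be sketched directly. Assuming, as the text does, a most likely rise of 1m with a 95% range of 0.3m to 1.7m, and modelling that range (an assumption for illustration) as a normal distribution:

```python
from statistics import NormalDist

# Assumed numbers from the text: most likely sea level rise ~1 m by 2100,
# 95% range 0.3-1.7 m. The half-width of 0.7 m then corresponds to 1.96
# standard deviations, giving sigma ~0.357 m.
rise = NormalDist(mu=1.0, sigma=0.7 / 1.96)

# Barrier height needed to hold the chance of under-sizing to a given risk:
for risk in (0.50, 0.05, 0.025, 0.001):
    height = rise.inv_cdf(1 - risk)
    print(f"accepting a {risk:.1%} chance of under-sizing -> build {height:.2f} m higher")
```

Tolerating only a 0.1% chance of a breach pushes the required height above 2m even before rainfall and storm-surge uncertainty are added, which is how the 2-3m figure in the text arises.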

Systemic uncertainties may be hiding the gravest of risks

Finally, some global warming consequences are associated with deep, so-called systemic uncertainty. For example, the combined impact on coral reefs of warmer oceans, more acidic waters and coastal run-off that becomes more silt-choked from more intense rainfalls is very difficult to predict.

But we do know, from decades of study of complex systems, that those deep uncertainties may camouflage particularly grave risks. This is particularly concerning given that more than 2.6 billion people depend on the oceans as their primary source of protein.

Similarly, warming of Arctic permafrost could promote the growth of CO2-sequestering plants, the release of warming-accelerating methane, or both.

Warm worlds with very high levels of carbon dioxide did exist in the very distant past and these earlier worlds provide some insight into the response of the Earth system; however, we are accelerating into this new world at a rate that is unprecedented in Earth history, creating additional layers of complexity and uncertainty.

Uncertainty is not the same as ignorance

Increasingly, arguments against climate mitigation are phrased as “I accept that humans are increasing CO2 levels and that this will cause some warming but climate is so complicated we cannot understand what the impacts of that warming will be.”

This argument is incorrect – uncertainty does not imply ignorance. Indeed, whatever we don’t know mandates caution. No parent would argue:

“I accept that if my child kicks lions, this will irritate them, but a range of factors will dictate how the lions respond; therefore I will not stop my child from kicking lions.”

The deeper the uncertainty, the more greenhouse gas emissions should be perceived as a wild and poorly understood gamble.

By extension, the only unequivocal tool for minimising climate change uncertainty is to decrease our greenhouse gas emissions.

 


 

Richard Pancost is Professor of Biogeochemistry, Director of the Cabot Institute at the University of Bristol. He receives funding from the NERC, the EU and the Leverhulme Trust.

Stephan Lewandowsky is Chair of Cognitive Psychology at the University of Bristol. He receives funding from the Australian Research Council, the World University Network, and the Royal Society.

This article was originally published on The Conversation. Read the original article.



Keeping the lights on





As a member of the Cabinet for four years I supported Coalition energy policy. However, I have become increasingly aware, from my own constituency and from widespread travel around the UK, of intense public dissatisfaction with heavily subsidised renewable technologies, in particular onshore wind.

I have used the last three months since leaving the Cabinet to learn more about the consequences of this policy. And what I have unearthed is alarming.

Our current policy will cost £1,300bn up to 2050. It fails to achieve the very emissions targets it was designed to meet. And it fails to supply the UK’s energy requirements.

I will argue that current energy policy is a slave to flawed climate action. It neither reduces emissions sufficiently, nor provides the energy we need as a country.

I call for a robust, common sense energy policy that would encourage the market to choose affordable technologies to reduce emissions, and give four examples:

  • promotion of indigenous shale gas
  • large scale localised Combined Heat and Power (CHP)
  • small modular nuclear reactors
  • rational demand management


The vital importance of affordable energy

But first, let us consider what is at stake. We now live in an almost totally computer-dependent world. Without secure power the whole of our modern civilisation collapses: banking, air traffic control, smart phones, refrigerated food, life-saving surgery, entertainment, education, industry and transport.

We are lucky to live in a country where energy has been affordable and reliable. Yet we cannot take this for granted.

While most public discussion is driven by the immediacy of the looming 2020 EU renewables target, policy is actually dominated by the EU’s long-term 2050 target.

The 2050 target is for a reduction in greenhouse gas emissions by 80% relative to 1990 levels. The target has been outlined by the European Commission. But it is only the UK that has made it legally binding through the Climate Change Act – a piece of legislation that I and virtually every other MP voted for.

The 2050 target of cutting emissions by 80% requires the almost complete decarbonisation of the electricity supply within 36 years.

In the short and medium term, costs to consumers will rise dramatically, and the lights will eventually go out. Not because of a temporary shortfall, but because of structural failures, from which we will find it extremely difficult and expensive to recover.

We must act now. The purpose of my address today is to set out how.

The 2050 Target – what it means in practice

By 2050, the aim is to produce virtually all of our electricity with ‘zero carbon’ emissions. Yet at the moment over 60% of our electricity is produced by carbon-based fossil fuel – mainly gas and coal. And the emissions of this “carbon” portion have to be removed almost completely.

Yet cutting carbon out of electricity production isn’t enough. Heating, transport and industry also use carbon based fuels.

In fact, to hit the 80% reduction target, we will have to abolish natural gas in most of our homes. No more cooking or central heating using gas. Our homes must become all-electric.

Much of the fuel used for transport will have to be abolished too. 65% of private cars will have to be electric.

This is a point that is little understood. The 2050 target commits us to a huge expansion of electricity generation capacity, requiring vast investment.

The EU’s suggested route to meet this target – and how it doesn’t work

So where does such a supply of zero-carbon electricity come from? The European Commission offers several possibilities, but its particular enthusiasm is for renewable energy, under what it calls its “High RES” (Renewable Energy Sources) scenario. In this scenario, most of the electricity comes from wind power.

This is regrettably entirely unrealistic.

The investment costs of generation alone are prohibitive. They are admitted by the EU to be staggering. The High RES scenario alone would require a cumulative investment, between the years 2011 and 2050, of €3.2 trillion.

Even if you could find such sums from investors, they will require a return and a large premium to de-risk a very hazardous investment. The margins will be astonishing. As Peter Atherton of Liberum argues, the public will not readily accept profits that large for the energy companies.

But if investment is tricky, we only need to consider the scale of construction.

Wind capacity in the EU 27 must rise from 83 GW in 2010 to 984 GW in 2050. That means an increase from 42,000 wind turbines across Europe to nearly 500,000. It would require an acreage of turbines vast enough to carpet Northern Ireland, Wales, Belgium, Holland and Portugal combined, wall to wall.
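The build rate implied by those figures is easy to check:

```python
# Back-of-envelope check of the High RES figures quoted above:
# 83 GW in 2010 -> 984 GW in 2050, and 42,000 -> ~500,000 turbines.
gw_2010, gw_2050 = 83, 984
turbines_2010, turbines_2050 = 42_000, 500_000
years = 2050 - 2010

gw_per_year = (gw_2050 - gw_2010) / years
turbines_per_year = (turbines_2050 - turbines_2010) / years

print(f"capacity added:  {gw_per_year:.1f} GW per year")
print(f"turbines added:  {turbines_per_year:,.0f} per year "
      f"(~{turbines_per_year / 365:.0f} per day, every day, for 40 years)")
```

Roughly 22.5 GW and over 11,000 turbines a year, sustained for four decades, is the scale of construction in question.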

There, at the heart of the Commission’s “high RES” decarbonisation policy, is the fatal flaw. At any practical level, it cannot be achieved. It simply will not happen. Yet, as far as EU policy goes, it is the most promising option, on which considerable development resource has been expended.

UK’s plans to meet the targets are no better

Knowing this to be unrealistic, no other country in the European Union apart from the UK has made the 2050 target legally binding.

So having signed up to it, how does the UK hope to deliver all this carbon neutral electricity? The target is, in theory, technology-neutral. The Coalition Government acknowledges wind’s shortcomings by committing to make only “significant use” of the UK’s wind resources, taking into account its ecological and social sensitivities.

But if wind doesn’t make up the bulk of zero-carbon electricity supply, then that would mean building new nuclear at the rate of 1.2GW a year for the next 36 years. Put simply, that’s a new Hinkley Point every three years.
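The arithmetic behind that claim, taking Hinkley Point C’s planned capacity as roughly 3.2GW (an assumed figure for illustration):

```python
# Building 1.2 GW of new nuclear a year for 36 years, measured against an
# assumed Hinkley Point C capacity of ~3.2 GW.
build_rate_gw_per_year = 1.2
years = 36
hinkley_c_gw = 3.2   # assumption for illustration

total_gw = build_rate_gw_per_year * years
years_per_hinkley = hinkley_c_gw / build_rate_gw_per_year

print(f"total new nuclear needed: {total_gw:.0f} GW")
print(f"one Hinkley-sized station every {years_per_hinkley:.1f} years")
```

About 43GW of new nuclear in total, or one Hinkley-sized station roughly every two and a half to three years.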

In addition, UK policy requires building Carbon Capture and Storage (CCS) plants, which capture the CO2 emissions from gas and coal and bury them in the ground, at the rate of 1.5GW a year. While nascent, this technology is known to cut efficiency by a third and treble capital costs.

So the British nuclear-led option is no more realistic than the Commission ‘high RES’ scenario or any other of the decarbonisation options. There is simply no plausible scenario by which the British government can conceivably meet its 80% emission cut by 2050.

And yet, despite this doomed policy, we provide subsidies for renewables of around £3 billion a year – and rising fast. This is a significant cost burden on our citizens.

In fact it amazes me that our last three energy secretaries, Ed Miliband, Chris Huhne and Ed Davey, have merrily presided over the single most regressive policy we have seen in this country since the Sheriff of Nottingham: the coerced increase of electricity bills for people on low incomes to pay huge subsidies to wealthy landowners and rich investors.

Furthermore the cost is rising, not falling. DECC wrongly assumed that the price of gas would only rise. Four years ago the Energy Secretary confidently argued that renewables would be cheaper than gas by 2020. But this was based on a DECC forecast that gas prices would double.

Instead gas prices have fallen. DECC has revised downwards its forecasts of 2020 gas prices to roughly what they were in 2011 – just 60p a therm. Wind power just isn’t competitive with gas. But the drop in gas prices raises the costs of renewable subsidies, already ‘capped’ at £7.6 billion in 2020, by 20%. This is unaffordable.

Climate science

Before I go on to outline an alternative, let me say a few words about climate science and the urgency of emissions reduction.

I readily accept the main points of the greenhouse theory. Other things being equal, carbon dioxide emissions will produce some warming. The question always has been: how much? On that there is considerable uncertainty.

I also accept the unambiguous failure of the atmosphere to warm anything like as fast as predicted by the vast majority of climate models over the past 35 years, when measured by both satellites and surface thermometers – and indeed, according to some sources, its failure to warm at all over the past 18 years. Many policymakers have still to catch up with the facts.

I also note that the forecast effects of climate change have been consistently and widely exaggerated thus far.

The stopping of the Gulf Stream, the worsening of hurricanes, the retreat of Antarctic sea ice, the increase of malaria, the claim by UNEP that we would see 50m climate refugees before now – these were all predictions that proved wrong.

For example the Aldabra Banded Snail which one of the Royal Society’s journals pronounced extinct in 2007 has recently reappeared, yet the editors are still refusing to retract the original paper.

It is exactly this sort of episode that risks inflicting real harm on the reputation and academic integrity of the science.

Despite all this, I remain open-minded to the possibility that climate change may one day turn dangerous. So, it would be good to cut emissions, as long as we do not cause great suffering now for those on low incomes, or damage today’s environment.

The inadequacies of renewable energy to meet demand

Let me briefly go through all the renewable energy options and set out why they cannot supply the zero-carbon electricity needed to keep the lights on in 2050.

Onshore wind is already at maximum capacity as far as available subsidy is concerned. Ed Davey recently confirmed that, if current approval trends in the planning system continue, the UK is likely to have 15.25 GW of onshore wind by 2020. This is higher than the upper limit of 13 GW intended by DECC.

This confirms what the Renewable Energy Foundation has been pointing out for some time – that DECC is struggling to control this subsidy-drunk industry. Planning approval for renewables overall, including onshore wind, needs to come to a halt, or it will massively over-run the subsidy limits set by the Treasury’s Levy Control Framework.

However, this paltry supply of onshore wind, nowhere near enough to hit the 2050 target, has devastated landscapes, blighted views, divided communities, killed eagles, carpeted the countryside and the very wilderness that the “green blob” claims to love, with new access tracks cut deep into peat, boosted production of carbon-intensive cement, and driven up fuel poverty, while richly rewarding landowners.

Offshore wind is proving a failure. Its gigantic costs, requiring more than double the subsidy of onshore wind, are failing to come down as expected, operators are demanding higher prices, and its reliability is disappointing, so projects are being cancelled as too risky in spite of the huge subsidies intended to make them attractive. There is a reason we are the world leader in this technology – no other country is quite so foolish as to plough so much public money into it.

Hydro is maxed out. There is no opportunity to increase its contribution in this country significantly; the public does not want any more flooded valleys. Small-scale in-stream hydro might work for niche applications – isolated Highland communities for example – but the plausible potential for extra hydro is an irrelevance for the heavy lifting needed to support UK demand for zero-carbon electricity.

Tidal and wave power despite interesting small-scale experiments is still too expensive and impractical. Neither the astronomical prices on offer from the government, nor huge research and development subsidies have lured any commercial investors to step into the water. Even if the engineering problems could be overcome, tidal and wave power, like wind, will not always be there when you need it.

Solar power may one day be a real contributor to global energy in low latitudes and at high altitudes, and in certain niches. But it is a non-starter as a significant supplier to the UK grid today and will remain so for as long as our skies are cloudy and our winter nights long. Delivering only 10% of capacity, it is an expensive red herring for this country, and today’s solar farms are a futile eyesore and a waste of land that could be better used for other activities.

Biomass is not zero carbon. It generates more CO2 per unit of energy even than coal. Even DECC admits that importing wood pellets from North America to turn into hugely expensive electricity here makes no sense, if only because a good proportion of those pellets come from whole trees.

The fact that trees can regrow is of little relevance: they take decades to replace the carbon released in their combustion, and then they are supposed to be cut down again. If you want to fix carbon by planting trees, then plant trees! Don’t cut them down as well. We are spending ten times as much to cut down North American forests as we are to stop the cutting down of tropical forests.

Meanwhile, more than 90% of the renewable heat incentive (RHI) funds are going to biomass. That is to say, we are paying people to stop using gas and burn wood instead. Wood produces twice as much carbon dioxide as gas.

Waste to energy is the one renewable technology we should be investing more in. It is a missed opportunity. We don’t do enough anaerobic digestion of sewage; we should be using AD plants to convert into energy more of the annual 15 million tonnes of food waste. But this can only ever provide a small part of the power we need.

So these technologies do not provide enough power. But they also don’t cut the emissions. And if you’ll bear with me I want to explain why.

Emissions reduction in practice

We know that Britain’s dash for wind, though immensely costly, regressive and damaging to the environment, has had very little impact on emissions.

DECC assumes that every MWh of wind replaces a MWh of conventionally generated power. But we know and they know that this is probably wrong at present, and is all but certain to be wrong in the future, when wind capacities are planned to be much higher.

According to an Irish study, because wind cannot always supply electricity when it is needed, backup from gas and coal power plants is required. When the carbon footprint of wind is added to that of the backup generators, the impact on the environment is actually greater.

System costs incurred by the grid in managing the electricity system, especially given the remoteness of many wind farms, make it worse still. And a wind-dominated system affects the investment decisions other generators make.

So the huge investment we have made in wind power, with all its horrendous impacts on our most precious landscapes, has not saved much in the way of carbon dioxide emissions so far. And what savings there have been, if any, have been bought at the most astonishing cost per tonne.

Four possibilities – achieving emissions targets, supplying energy

So what is achievable? If we are to get out of the straitjacket of current policy, what can be done? I want to explore four technologies which, combined, would both reduce emissions and keep the power supply on.

The shale gas opportunity

In contrast to Britain’s dash for wind, America’s dash for shale gas has had a huge impact on emissions.

Thanks largely to the displacement of coal-fired generation by cheap gas, US emissions in power generation are down to the level they were in the 1990s and in per capita terms to levels last seen in the 1960s. Gas has on average half the emissions of coal.

It has cut US gas prices to one-third of European prices, which means that we risk losing many jobs in chemical and manufacturing industries to our transatlantic competitors. We are sitting on one of the richest shale deposits in the world. Just 10% of the Bowland shale gas resource alone could supply all our gas needs for decades and transform the North West economy.
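As a rough check of that claim, take the British Geological Survey’s central estimate of roughly 1,300 trillion cubic feet (tcf) of gas in place in the Bowland shale, and UK annual gas consumption of roughly 3 tcf – both assumed figures here, and how much is actually recoverable is highly uncertain:

```python
# Assumed figures for illustration: ~1,300 tcf of gas in place (BGS central
# estimate for the Bowland shale) and UK gas demand of ~3 tcf per year.
# The recoverable fraction is highly uncertain; 10% is the text's premise.
bowland_gas_in_place_tcf = 1300
uk_annual_demand_tcf = 3

recovered = 0.10 * bowland_gas_in_place_tcf
years_of_supply = recovered / uk_annual_demand_tcf

print(f"10% of the resource: {recovered:.0f} tcf, "
      f"roughly {years_of_supply:.0f} years of UK gas demand")
```

On those assumptions, a 10% recovery would indeed cover UK gas demand for several decades.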

The environmental impact of shale would be far less than wind. For the same output of energy, a wind farm requires many more truck movements, takes up hundreds of times as much land and kills far more birds and bats. Above all, shale gas does not require regressive subsidy. In fact, it would bring energy prices down.

Not only does shale gas have half the emissions of coal; it could increase energy security. Currently 40% of the coal we burn in this country comes from Russia. Far better to burn Lancashire shale gas than Putin’s coal.

So the first leg of my suggested policy would be an acceleration of shale gas exploitation. As Environment Secretary I did everything I could to speed up approval of shale gas permits having set up a one-stop-shop aiming to issue a standard permit within two weeks. But I was up against the very powerful “green blob” whose sole aim was to stop it.

Combined Heat and Power

But there is another advantage of bringing abundant gas on stream. We could build small, local power stations, close to where people live and work. This would allow us to use not just the electricity generated by the power station, but its heat also.

Combined heat and power, or CHP, cuts emissions, cuts costs and creates jobs.

The generous EU estimate of the current efficiency in conventional power stations is about 50%. The best of the CHP plants deliver 92% efficiencies.
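The fuel saving implied by those two efficiency figures can be worked out directly:

```python
# Fuel needed to deliver one unit of useful energy at the two efficiencies
# quoted above: ~50% for a conventional plant, ~92% for the best CHP.
conventional_eff = 0.50
chp_eff = 0.92

fuel_conventional = 1 / conventional_eff   # units of fuel per unit delivered
fuel_chp = 1 / chp_eff
saving = 1 - fuel_chp / fuel_conventional

print(f"conventional: {fuel_conventional:.2f} units of fuel per unit of useful energy")
print(f"CHP:          {fuel_chp:.2f}")
print(f"fuel (and emissions) saving: {saving:.0%}")
```

For the same useful output, the best CHP plants burn a little over half the fuel of a conventional station – a saving of roughly 45%.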

Yet despite these attributes, CHP is treated as the Cinderella to the European Commission’s favoured High RES strategy.

Renewables – especially wind – have been showered with lucrative guarantees, in the form of doubled or trebled electricity prices – thereby absorbing available investment capital.

The Commission, for its part, attributes CHP’s failure to the “limited” efficiency and effectiveness of its CHP Directive.

I am a realist. CHP does have high capital cost and limited returns with payback periods longer than normally considered viable. Given the commercial risks, dividends from energy efficiency alone have not been sufficient to drive a large-scale CHP programme.

But the Coalition Government recognises this too, in seeking to promote energy efficiency in the NHS.

Its buildings consume over £410 million worth of energy and produce 3.7 million tonnes of CO2 every year. Energy use contributes 22% of the total carbon footprint and, in its own terms, the NHS says that this offers many opportunities for saving and efficiency, allowing these savings to be directly reinvested into further reductions in carbon emissions and improved patient care.

In 2013, therefore, it decided to kick-start its energy saving programme with a £50 million fund, aiming to deliver savings of £13.7 million a year. CHP comprised a substantial part of this spending.

To kick-start a broader national programme, providing state aid or financial incentives would be appropriate, especially as the effect would be more cost-effective than similar amounts spent on renewables.

In the United States, the value of CHP is beginning to be recognised as the most efficient way of capitalising on the shale gas bonanza. One state – Massachusetts – has delivered large electricity savings in recent years through CHP. CHP capacity in the United States is currently 83.3GW compared with about 9GW here.

Actually, between 2005 and 2010, the production of both electricity and heat from CHP installations in the UK fell, a dreadful indictment of the last Labour government’s energy policy. The installed capacity of wind increased by over 500%, despite a massively inferior cost-benefit ratio.

But I do want to highlight how revolutionary CHP technology can be in allowing the localisation of the electricity supply system. Transmission losses can account for 5-7% of national electricity production. A 20% reduction in transmission losses would be the equivalent of saving the output of another large nuclear installation. This is why CHP can deliver efficiency ratings of up to 90%: the heat is produced where it can be used.
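The arithmetic behind these figures can be sketched as follows. This is an illustrative calculation only: the 50% and 92% efficiencies and the 5-7% transmission loss are the numbers quoted in the text, while the choice of a 6% loss figure and the comparison itself are assumptions made for the example.

```python
# Illustrative comparison of useful energy delivered per unit of fuel for a
# remote conventional plant versus a local CHP unit. The 50%/92% efficiencies
# and 5-7% grid losses are the figures quoted in the text; the 6% loss value
# chosen here is an assumption for the example.

def delivered_fraction(conversion_efficiency, transmission_loss):
    """Fraction of fuel energy that reaches the user as useful energy."""
    return conversion_efficiency * (1 - transmission_loss)

conventional = delivered_fraction(0.50, 0.06)  # remote plant, ~6% grid losses
chp = delivered_fraction(0.92, 0.0)            # local CHP, heat used on site

print(f"Conventional: {conventional:.0%} of fuel energy delivered")
print(f"CHP:          {chp:.0%} of fuel energy delivered")
print(f"Ratio: {chp / conventional:.1f}x useful energy per unit of fuel")
```

On these assumed numbers, a local CHP unit delivers roughly twice the useful energy per unit of fuel of a remote conventional station once grid losses are counted.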

For instance, Leeds Teaching Hospital and the University of Leeds together have financed their own dedicated power station, comprising CHP units and an electricity generation capacity of 15MW.

With this model, it is easy to imagine office buildings, supermarkets and other installations operating CHP units of 1.5MW or less.

In fact, results from Massachusetts show that 40% of total energy supply could be CHP. Freiburg in Germany is already producing 50% of its energy from CHP, up from 3% in 1993.

Implemented nationally, this revolutionary programme of localised electricity production would massively increase the resilience of the system, considerably improve energy efficiency overall, and ease pressure on the distribution system. In total, we would save the equivalent of nine Hinkley Cs.

Small modular nuclear

The third technology is an innovative approach with small nuclear reactors integrated with CHP.

Our policy has consistently favoured huge nuclear and coal plants, remote from their customers. Given that 40% or more of the total energy production from a nuclear plant is waste heat, such plants would in principle be ideal for CHP, but at such distances there is no economic way of using the waste heat.

I think there is a further massive obstacle to achieving 40 GW of capacity from large nuclear plants: there are simply not enough suitable sites and not enough time to build them.

Small nuclear plants have been running successfully in the UK for the last thirty years. Nine have been working on and off without incident and the technology is proven.

Factory built units at the rate of one a month could add to the capacity at a rate of 1.8 GW per year according to recent select committee evidence from Rolls-Royce.

Small, factory-built nuclear plants could be located closer to users, say within 20 to 40 miles, and provide a CHP function. Installed near urban areas, they can deliver electricity and power district heating schemes or, in industrial areas, provide a combination of electricity and process heat.

I welcome the Government’s feasibility study into this technology. What is holding up full commercial exploitation is the cost of regulatory approval, which is little different from a large-scale reactor.

I also note that the US Department of Energy has commissioned the installation of three different modular reactors at its Savannah River test facility, with a view to undertaking generic or “fleet” licensing. We should learn from them as a key priority.

Demand management

The fourth leg of my proposal is demand management. The government is tentatively investigating smart meters and using our electric cars as a form of energy storage for the grid as a whole. That is to say, in the future, on cold, windless nights, people might wake to find that their electric cars have been automatically drained of juice to keep their electric central heating on. This is crazy stuff!

It is both impractical and not nearly bold enough. Dynamic demand would be a better, and cheaper, policy for demand management.

It requires the fitting of certain domestic appliances, such as refrigerators, with low-cost sensors coupled to automated controls. These measure the frequency of the current supplied and switch off their appliances when the system load temporarily exceeds supply, causing the current frequency to drop.

Since appliances such as refrigerators do not run continuously, switching them off for short periods of 20 to 30 minutes is unlikely to be noticed and will have no harmful effects on the contents. Yet the cumulative effect on the generating system of millions of refrigerators simultaneously switching themselves off is dramatic – as much as 1.2GW, the equivalent of a large nuclear plant.
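The switching logic described above is simple enough to sketch in a few lines. This is an illustrative sketch, not a real appliance controller: only the 50 Hz nominal grid frequency and the 20-30 minute off-period come from the text; the 49.8 Hz trip threshold and the control function itself are assumptions for the example.

```python
# Minimal sketch of a dynamic-demand controller for a refrigerator.
# When mains frequency sags below a threshold (a sign that load exceeds
# generation), the appliance defers its compressor; it resumes once the
# frequency recovers or a maximum off-period elapses.

NOMINAL_HZ = 50.0        # UK grid nominal frequency
LOW_THRESHOLD_HZ = 49.8  # assumed trip point: grid under stress
MAX_OFF_MINUTES = 30     # text: 20-30 minutes off is unnoticeable

def should_run(mains_hz, minutes_off):
    """Decide whether the compressor may run this control cycle."""
    if minutes_off >= MAX_OFF_MINUTES:
        return True                      # never defer longer than the cap
    return mains_hz >= LOW_THRESHOLD_HZ  # defer while frequency is sagging

# A sagging grid sheds the load; a recovered grid (or the cap) restores it.
assert should_run(50.0, 0) is True
assert should_run(49.7, 10) is False
assert should_run(49.7, 30) is True
```

Because every fridge independently observes the same mains frequency, millions of such controllers act in concert without any communications network, which is what makes the aggregate effect comparable to a large power station.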

In addition, we can imagine a future in which supermarkets’ chillers switch off, and hospitals’ emergency generators switch on, when demand is high, thus shaving the peaks off demand. We have started this and we need to do much more.

For this reason, I think the Short Term Operational Reserve (STOR), a somewhat notorious scheme whereby costly diesel generators are kept on stand-by in case the wind drops, is not as foolish as it sounds. It would be even more useful in a system without wind power. At the moment it has to cope with unpredictable variation in supply as well as demand.

With as much as a 25GW variation during a day and with a winter peak load approaching 60GW, significant capacity has to be built and maintained purely to meet short-duration peaks in demand. The use and extension of STOR and like facilities can make a significant contribution to reducing the need for peak generation plants.

According to one aggregator, removing 5-15% of peak demand is realistic, as part of the new capacity market. This could be worth up to 9GW, effectively the output of seven major nuclear plants, or their equivalent which would otherwise have to be built. As it stands Ofgem has already estimated that demand management could save the UK £800 million annually on transmission costs and £226 million on peak generation capacity.

Four pillars of energy policy

And there you have it. Four possible common sense policies: shale gas, combined heat and power, small modular nuclear reactors and demand management. That would reduce emissions rapidly, without risking power cuts, and would be affordable.

In the longer term, there are other possibilities. Thorium as a nuclear fuel, sub-critical, molten-salt reactors, geothermal plants connected to CHP systems, fuel made in deserts using solar power, perhaps even fusion one day – all these are possible in the second half of the century.

But in the short term, we have to be realistic and admit that solar, wind and wave are not going to make a significant contribution while biomass does not help at all.

What I have wanted to demonstrate to you this evening, is that it is possible to reduce emissions, while providing power.

But what is stopping this programme? Simply, the 2050 legally binding targets enshrined in the Climate Change Act.

The 80% decarbonisation strategy cannot be achieved: it is an all-or-nothing strategy which does not leave any openings for alternatives.

It requires very specific technology, such as supposedly ‘zero carbon’ windfarms, and electric vehicles. Even interim solutions can never be ‘zero carbon’, so these too must be replaced well before 2050.

In guzzling up available subsidies and capital investment, ‘zero carbon’ technology blocks the development of more modest but feasible and affordable low carbon options.

Thus, in pursuing the current decarbonisation route, we end up with the worst of all possible worlds. When there is a shortfall in electricity production, emergency measures will have to be taken – what in Whitehall is known as ‘distressed policy correction’. Bluntly, building gas or even coal in a screaming hurry.

The UK ends up worse off than if it adopted less ambitious but achievable targets. Reining in unrealistic green ambitions allows us to become more ‘green’ than the Greens.

We are the only country to have legally bound ourselves to the 2050 targets – and certainly the only one to bind ourselves to a doomed policy.

In the absence of a legally binding international agreement, which looks unlikely given disagreement within EU member states and the position of the BRIC countries, the Climate Change Act should be effectively suspended and eventually repealed.

Clause 2 of the Climate Change Act 2008 enables the Secretary of State by order to amend, subject to affirmative resolution procedure, the 2050 target which could have the immediate effect of suspending it.

Then, energy efficiency becomes a realistic and viable option. Investment in energy efficiency, including the Government’s very welcome initiatives on insulation, offers considerable advantages over wind energy.

It does not raise overall electricity costs, and may even cut them because the investment costs are matched by the financial savings delivered.

The moral case for abandoning the 2050 targets

We have to remember too that the people who suffer most from a lack of decent energy are the poor.

I have already mentioned that we are redistributing from those with low incomes to wealthy landowners through generous subsidies collected in high energy bills.

The sight of rich western film stars effectively telling Africa’s poor that they should not have fossil fuels, but should continue to die at the rate of millions each year from the smoke of wood fires in their homes, frankly disgusts me. The WHO estimates that 4.3 million lose their lives every year through indoor air pollution.

The sight of western governments subsidizing the growing of biofuels in the mistaken belief that this cuts emissions, and in the full knowledge that it drives up food prices, encourages deforestation and tips people into hunger, leaves me amazed.

The lack of affordable and reliable electricity, transport and shelter to help protect the poor from cyclones, droughts and diseases, is a far greater threat to them than the small risk that those weather systems might one day turn a bit more dangerous.

Growth is the solution, not the problem

Among most of those who marched against climate change last month, together with many religious leaders, far too many academics and a great many young people, the myth has taken hold that growth and prosperity are the problem, and that the only way to save the planet is to turn our backs on progress.

They could not be more wrong. The latest Intergovernmental Panel on Climate Change assessment report states that the scenario with the most growth is the one with the least warming. The scenario with the most warming is one with very slow economic growth.

Why?

Because growth means invention and innovation and it is new ideas, new technology that generates solutions to our problems. The IPCC’s RCP2.6 scenario projects that per capita GDP will be 16 times as high as today by the end of the century, while emissions will have stabilized and temperature will have stopped rising well before hitting dangerous levels.

The history of the last century shows that dramatic technical breakthroughs are possible where incentives are intelligently aligned – but it’s impossible to know in advance where these will come from. Who predicted 30 years ago that the biggest breakthrough would come from horizontal drilling?

We have some of the finest scientists and universities in the world. A fraction of the money spent on renewables subsidies should go towards research and development and specific, well defined goals with prizes for scientists and companies.

Energy efficiency will develop very rapidly if encouraged to do so, cutting emissions.

A common sense policy climate for climate policy

The fundamental problem with our electricity policy over the last two decades has been that successive governments have attempted to pick winners.

Pet technologies introduce price distortions that destroy investment in the rest of the market, with disastrous consequences.

Even Nigel [Lawson] would admit that the liberalisations he introduced to transform the electricity industry in the consumer interest were frustrated. Sadly, the policies of the last decade or so, have undone many of his reforms.

But like him, I would reliberalise the markets and allow the hidden hand to reach out for technologies that can in practice reduce emissions.

Conclusion

To summarise, we must challenge the current groupthink and be prepared to stand up to the bullies in the environmental movement and their subsidy-hungry allies.

Paradoxically, I am saying that we may achieve almost as much in the way of emissions reduction, perhaps even more if innovation goes well, using these four technologies or others, and do so much more cheaply, but only if we drop the 2050 target, which is currently being used to drive subsidies towards impractical and expensive technologies.

This is a really positive, optimistic vision that would allow us to reinvigorate the freedom of the science and business communities to explore new technologies. I am absolutely confident that by doing this we can reduce our emissions and keep the lights on.

 


 

This speech was delivered to the Global Warming Policy Foundation on 15th October 2014. The GWPF has placed it in the public domain.

Owen Paterson is MP for North Shropshire and a former environment secretary. His website is at owenpaterson.org.

 

 





Future NOW





Featuring pioneering eco-spiritual presenters: Peter Owen Jones, Satish Kumar, Chloe Goodchild, Tim Freke and Joe Hoare.

The Future NOW conference and charity fundraiser brings leading eco and wellbeing thinkers, writers, performers and activists to Bristol’s Trinity Centre on Saturday 8th November (10am-5pm). It aims to raise the debate about the future and explore urgent solutions and mindful steps for sustaining the Earth, so we can secure bright, happy and sustainable future lives for our children and grandchildren on this planet.

Peter Owen Jones, maverick 21st Century priest, BBC TV explorer and keynote speaker for Future NOW, says:
Humanity is in the process of bequeathing a poisonous and broken planet to the next generation. The systems we have inherited from the past are simply unable to create a sustainable future. Whilst we are doubtless approaching an end of some sort we are also beginning at last to dream of what a new humanity and new Earth might contain.  Future NOW will explore all that we need to sustain a future for all the myriad of life on this beautiful planet.

Organised by leading edge speakers, communications and events agency Conscious Frontiers together with celebrated Laughter Yoga expert and author Joe Hoare, Future NOW was inspired by the burgeoning Spiritual Ecology movement which seeks a spiritual response to our current ecological crisis, urging us to reconnect with Mother Earth as a sacred living being to which we all belong, and to recognise the Earth as the source of all life, not a resource to be plundered.
 
Featuring groundbreaking presentations and powerful performances from ‘Extreme Pilgrim’ Peter Owen Jones, ‘Earth Pilgrim’ Satish Kumar, ‘Big Love Philosopher’ Tim Freke, ‘Sacred Voice Pioneer’ Chloe Goodchild and ‘Laughing Yogi’ Joe Hoare – as well as interactive breakout sessions exploring and reflecting on the question, “What can I do differently?” – Future NOW is a call to become more mindful, more peaceful, more connected and more loving to ourselves, to each other and to the Earth.

With our planet approaching tipping point, we are faced with potentially devastating climate change and environmental meltdown caused by our unsustainable, materialistic way of life, threatening us with natural disasters, famine, diseases, mass social upheaval and loss of life. World renowned Zen master Thich Nhat Hanh refers to these calamities as “Bells of Mindfulness” warning us to wake up and urgently consider our impact on the planet before it’s too late.

Will Gethin, Director of Conscious Frontiers says:
“Future NOW is a response to this call of the Earth. It’s an invitation to take an active role in shaping a more sustainable and harmonious future – a future where our outmoded Western material dream is replaced by a new dream of mindfulness, kindness, interconnectedness and community.”

50% of the proceeds from Future NOW will be donated to the benevolent charities/causes of the keynote speakers: The Resurgence Trust, The Life Cairn Project, The Naked Voice and The Alliance for Lucid Living, all of which further the event’s aim to create a happier and more harmonious future for our planet (for further information visit the FutureNOW charity page).

Joe Hoare, co-organiser of Future NOW says:
“Throughout the conference, participants are invited to explore how we can each make a difference and take urgent action to be the change in our daily lives. Future NOW is an invitation to join the New Consciousness Revolution.”

Event details:
Date: Saturday 8th November, 10am-5pm
Venue: Trinity Centre, Trinity Road, Bristol, BS2 0NW

BOOKING INFORMATION:
Future NOW tickets cost £55 (£65 on the door). A limited number of Early Bird tickets are currently available. For further information and bookings visit FutureNow

Future NOW speakers and organisers are available for interview
For Media Enquiries please contact Will Gethin at Conscious Frontiers
07795 204 833 or email Will Gethin

 





Greenpeace victory – LEGO ends Shell promotion link





Following a Greenpeace campaign attracting over a million supporters, LEGO published a statement this morning promising that its promotion deal with Shell will lapse:

“We continuously consider many different ways of how to deliver on our promise of bringing creative play to more children. We want to clarify that as things currently stand we will not renew the co-promotion contract with Shell when the present contract ends.”

This decision comes a month after Shell submitted plans to the US administration showing it’s once again gearing up to drill in the melting Arctic next year, and as it argues with US authorities to lower environmental standards in the Arctic.

During Greenpeace’s three month campaign, more than one million people signed a petition calling on LEGO to stop promoting Shell’s brand because of its plans to drill for oil in the pristine Arctic.

Ian Duff, Arctic campaigner at Greenpeace, said: “This is a major blow to Shell. It desperately needs partners like LEGO to help give it respectability and repair the major brand damage it suffered after its last Arctic misadventure. LEGO’s withdrawal from a 50-year relationship with Shell clearly shows that strategy will not work.”

“The tide is turning for these fossil fuel dinosaurs that see the melting Arctic as ripe for exploitation rather than protection. The message should be clear: your outdated, climate-wrecking practices are no longer socially acceptable, and you need to keep away from the Arctic or face being ostracised by society.”

LEGO committed to renewable energy

In stark contrast to Shell, LEGO’s policies include a commitment to produce more renewable energy than it uses, phase out oil in its products and, in cooperation with its partners, leave a better world for future generations.

In its statement, LEGO argued the dispute was between Greenpeace and Shell. “The Greenpeace campaign uses the LEGO brand to target Shell. As we have stated before, we firmly believe Greenpeace ought to have a direct conversation with Shell.

“The LEGO brand, and everyone who enjoys creative play, should never have become part of Greenpeace’s dispute with Shell.”

However, Greenpeace insists that while LEGO is doing the right thing under public pressure, it should choose its partners more carefully when it comes to the threats facing our children from climate change.

Due to contractual obligations, LEGO’s current co-promotion with Shell will be honoured.

The fossil fuel industry is losing friends, fast

LEGO is the latest in a line of leading global companies to walk away from a relationship with the fossil fuel industry.

In late 2012 Waitrose announced it had put its partnership with Shell on ice, and in the last month Microsoft, Google and Facebook all made commitments to end their support for ALEC, a controversial lobby group that campaigns against climate change legislation.

Only weeks ago, the Rockefeller Foundation announced it will begin pulling its investments in the fossil fuel industry.

Shell has also come under pressure for its sponsorship links to UK arts organisations including the Southbank Centre.

“LEGO’s decision couldn’t have come soon enough”, said Duff. “The iconic and beautiful Arctic, and its incredible wildlife, like polar bears and narwhals, is under threat like never before. Arctic sea ice is melting at an unprecedented rate, but instead of seeing the huge risks, oil companies like Shell are circling like vultures.

“Only weeks ago Shell gave us the clearest indication yet that it’s planning to go back to the Arctic as soon as next summer.”

Shell’s Arctic ambitions plagued with difficulties

Shell’s past attempts to drill in the Arctic have been plagued with multiple operational failings culminating in the running aground of its drilling rig, the Kulluk.

The extreme Arctic conditions, including giant floating icebergs and stormy seas, make offshore drilling extremely risky. And scientists say that in the Arctic an oil spill would be impossible to clean up, meaning devastation for the Arctic’s unique wildlife.

But on 28 August 2014 Shell submitted new plans to the US administration for offshore exploratory drilling in the Alaskan Arctic, meaning it’s on course to resurrect its Arctic drilling plans as early as summer 2015.

In the past two years, a massive global movement has emerged calling for a sanctuary around the North Pole, to protect the Arctic and its unique wildlife from the onslaught of oil drilling and industrial fishing.

More than six million people have joined the movement, and more than 1,000 influential people have signed an Arctic Declaration, including Archbishop Desmond Tutu, Emma Thompson and Sir Paul McCartney.

On 19 September UN Secretary General Ban Ki-moon met with Arctic campaigners to receive a global petition and said he would consider convening an international summit to discuss the issue of Arctic protection.

 

 





Nuclear power trumps democracy





Why is our democracy failing to tackle the horrific urgency of the climate crisis and the decimation of our ecosystems?

And why are all the main political parties betting the farm on nuclear power in spite of its madhouse economics – and against all their promises to either oppose nuclear power altogether, or to refuse subsidies for it?

In my new book, The Prostitute State – How Britain’s Democracy Has Been Bought, I set out my view that there is a single problem at the root of our nation’s difficulties.

A corporate elite have hijacked the pillars of Britain’s democracy. The production of thought, the dissemination of thought, the implementation of thought and the wealth arising from those thoughts, are now controlled by a tiny, staggeringly rich elite.

As a result, the UK is no longer a functioning democracy but has become a ‘Prostitute State’ built on four pillars: a corrupted political system, a prostituted media, a perverted academia and a thieving tax-haven system.

This has disastrously resulted in a flood of wealth from the poor and middle classes to the top 1%. This stolen wealth is built on the destruction of the planet’s ecosystems, which are essential for humanity’s survival.

Nuclear power defeats democracy

The reversal of government policy on nuclear power is a classic example of how the Prostitute State trumps democracy. Betrayed environmental activists must understand that – notwithstanding the noble form of democratic structures – what they are really up against is a corrupt corporate state.

The concept of lobbying is reasonably well known, but few of us understand how far lobbying has penetrated and hijacked the political parties themselves.

For example, most people are perplexed at how the nuclear industry managed to persuade the UK’s previous Labour government to build a fleet of hugely expensive experimental nuclear power stations on land prone to flooding from rising sea levels.

They also struggle to comprehend why Labour’s shadow energy and climate change minister, Caroline Flint MP, having stated that she would only support nuclear power if built without public subsidies, now supports the £15-20 billion subsidy package for Hinkley C nuclear power station.

Labour managed this policy U-turn despite the Three Mile Island, Chernobyl and Fukushima nuclear catastrophes; the failure to find safe waste-disposal sites capable of protecting radioactive waste for over 100,000 years; and insurance companies’ point-blank refusal to provide nuclear accident insurance.

It’s the money, stupid

My simple answer is that the nuclear industry has poured millions of pounds year after year into a massive political lobbying campaign.

They bought a whole swathe of senior ex-politicians to work as nuclear lobbyists, spent a fortune on trying to manipulate public opinion through media and advertising, and even funded school trips to their nuclear plants.

As the nuclear lobby managed to persuade a Labour government to abandon Labour’s 1997 election manifesto commitment to oppose new nuclear power stations, it is crucial to understand how deeply that lobby is embedded in the Labour party.

My personal belief is that a complex web of financial interests ensured that the Labour government served the nuclear industry – no matter what Labour party members or the British public wanted.

Just consider for example the following list of Labour Party politicians:

  • Former Energy Minister Brian Wilson became a non-executive director of Amec Nuclear, a client of BNFL, a nuclear operator.
  • Former Energy Minister Helen Liddell was hired to provide “strategic advice” by the nuclear corporation British Energy.
  • Former Secretary of State John Hutton, who as Business Secretary published the government White Paper announcing government plans to build new nuclear stations, was appointed Chair of the Nuclear Industry Association in 2011. He also joined the advisory board of US nuclear corporation Hyperion Power Generation in July 2010.
  • Colin Byrne, the Labour Party’s former chief press officer, headed up lobbying giant Weber Shandwick’s UK arm, which BNFL hired to lobby for new nuclear plants.
  • Gordon Brown’s brother, Andrew, was nuclear giant EdF’s head of media relations in the UK.
  • Yvette Cooper was the Planning Minister who introduced fast-track planning for nuclear power stations. Her father was chair of nuclear lobbyists The Nuclear Industry Association and is director of the Nuclear Decommissioning Authority.
  • Alan Donnelly, former leader of the Labour MEPs, runs the lobbying company Sovereign Strategy, which represented US nuclear engineering giant Fluor. His website promised “pathways to the decision makers in national governments”.
  • Former Labour Minister Jack Cunningham was legislative chair of the Transatlantic Nuclear Energy Forum, an organisation founded by lobbyist Alan Donnelly to foster “strong relationships” between nuclear power companies and governments.
  • The Tory Peer Lady Maitland was a paid member of Sovereign Strategy’s board.
  • Donnelly funded Labour leadership contender David Miliband’s constituency office refurbishment.
  • David Sainsbury, Labour Minister for Science from 1998 to 2006, told the House of Lords that he regarded nuclear power as a form of renewable energy.
  • Ed Miliband’s barrister wife Justine Thornton advised EdF Energy on its Development Consent Order for a new nuclear plant at Hinkley Point.

Of course I cannot say that the financial links of any individual with the nuclear industry had any bearing on the party’s change in policy. However, this wholesale hiring of senior Labour Party figures by the nuclear lobby may help explain why a number of key aims were achieved over the last ten years:

  • the reversal of Labour’s commitment to rule out new nuclear power stations.
  • Labour ministers’ introduction of a fast-track planning process for new nuclear plants without lengthy inquiries.


The saintly Lib Dems …

It is also noteworthy that whilst governments across the world were abandoning nuclear power after the Fukushima disaster, the new Tory / Lib Dem coalition abandoned its manifesto commitments to provide no public subsidy for new nuclear, by guaranteeing multi-billion-pound annual subsidies.

The Tory / Lib Dem government also made the taxpayer liable for nuclear disaster costs, after the private insurers refused to do so – as just one catastrophic accident would bankrupt most global insurance companies.

    To understand the comparative power of political lobbying versus voting at elections, you need to realise that the final two aims above were achieved despite the Lib Dems having for decades supposedly opposed nuclear power and the Tories having opposed nuclear subsidies in the 2010 general election.

    I was never convinced by the Lib Dem leadership’s opposition to nuclear power after it successfully, in the late ’90s, squashed the adoption in policy papers of the phrase “a renewable energy economy” that I had proposed to replace “a low carbon economy” which they favoured.

    The latter of course allowed the switch to a pro-nuclear policy once the Lib Dems were in government.

    The prominent Lib Dem MP Ed Davey stood for election opposing nuclear energy, but as Secretary of State for Energy and Climate Change, he became nuclear power’s chief cheerleader – announcing that the government’s entire industrial strategy was now based on new nuclear!

    The UK government is already spending the equivalent of 93% of the Department of Energy and Climate Change’s entire annual budget on nuclear subsidies! This was achieved despite polls indicating overwhelming support by the public for renewable energy over nuclear power.

    Lib Dem nuclear links

    Ed Davey’s brother, Henry Davey, works for the global law firm Herbert Smith Freehills which has advised EdF on its purchase of nuclear plants and the development application for a new nuclear plant at Hinkley Point.

    Also Lib Dem peer Tim Clement-Jones, Nick Clegg’s Party Treasurer at the last general election and the Party’s spokesman on culture and sport in the House of Lords, is founder and chairman of Global Government Relations, the lobbying arm of the huge multinational law firm DLA Piper, and serves as DLA Piper’s London Managing Partner.

    DLA Piper is listed as a member of the Nuclear Industry Association, and boasts of its widespread experience with many nuclear industry companies. According to its website it

    • advised AREVA SA on their investment in New Nuclear Build at Hinkley Point C including the new Contract for Difference regime, waste management strategy and HM Treasury Infrastructure Guarantee Scheme.
    • advised Sellafield Limited on all aspects of their waste management and decommissioning programme covering annual capital spend of £1 billion.
    • is advising the Nuclear Decommissioning Authority on the application of the International Nuclear Liability Conventions in respect of the marine transport of high level radioactive waste from Europe to Japan.
    • is advising the nuclear supply chain on tendering exercises in support of new nuclear build in the UK.
    • is advising Westinghouse, Nuclear Decommissioning Authority, Magnox Limited and International Nuclear Services Limited on all aspects of fuel supply contracts, enrichment, waste management and radioactive transportation in support of activities in UK and globally.

    Of course this could all be complete coincidence and we cannot conclude that Lord Clement-Jones had any influence on Lib Dem policy changes as regards nuclear power.

    But what we do know is that Davey won the battle yesterday at the European Commission to overthrow the Commission’s previous ban on state aid for new nuclear power, following intense political and industry lobbying of the 28 Commissioners.

    Thus the Lib Dems’ legacy will be to have thrown open the floodgates to new nuclear power right across Europe, despite their election manifesto having promised to oppose it.

     


     

    Donnachadh McCarthy FRSA is a former Deputy Chair of the Liberal Democrats. He can be reached via his website 3acorns.

    This article is based on an extract from Donnachadh McCarthy’s new book ‘The Prostitute State – How Britain’s Democracy Has Been Bought‘. 

    Copies of ‘The Prostitute State – How Britain’s Democracy Has Been Bought‘ are available from theprostitutestate.co.uk.

    E-book version available from www.Lulu.com.

     

     





Extreme Inequality





In January 2014 Oxfam revealed that the richest 85 people in the world had the same amount of wealth as the bottom half of the world’s population: over 3 billion people. This attracted global media interest. As usual, our claim was challenged, but not in the usual way. When Forbes magazine updated the data just a few months later, they found that we were wrong. It now took just the richest 69 people to equal the wealth of the poorest half!

The disparities between the rich and the poor are increasing. Just a few feet of wall in Rio separates the have-nots living in slums from the have-it-alls in the penthouse apartments next door. In the UK, newspaper articles on bankers’ billions sit alongside those documenting the rising number of people forced to rely on food banks.

Does this really matter? Some say that economic growth benefits and creates opportunities for all and that this must involve some getting richer than others; that attacking the very rich is an ideological position that helps no one. Oxfam’s interest is not about the rights and wrongs of wealth per se. It is about the fact that extreme inequality of wealth leads to extreme inequalities in all forms of power, policy and wellbeing, so that poor people do not benefit from improved health, education or opportunity, even in an economy that seems to be growing.

Over the last year there has been widespread recognition that increasing inequality of income and wealth cannot go on unabated. President Obama promised in his State of the Union address to tackle inequality of opportunity. Pope Francis tweeted to warn that inequality is “the root of social evil”. Even the global institutions with orthodox economic outlooks – including the IMF and the World Bank – have been warning of the dangers of inequality, and, in the case of the IMF’s Christine Lagarde, quoting Oxfam.

Leaders and institutions are beginning to challenge inequality head-on and people are paying attention to this debate. Not only was Thomas Piketty’s book Capital in the Twenty-First Century, a study of wealth concentration and rising inequality, a massive publishing success, but it also sparked a flood of soul-searching about the state of modern capitalism (see review page 58). That an economics tome of graphs and data can top best-seller lists on both sides of the Atlantic clearly demonstrates the resonance of this issue.

Why does this matter for development and wellbeing?
Over the last two decades we have seen impressive reductions in poverty and improvements in health, education and other key indicators in many of the poorest countries around the world. The rapid economic growth of emerging economies has seen many countries improve their prospects dramatically. While this is hugely encouraging, looking through the lens of simple averages masks the unequal fate of those left behind. A baby born into a rich family in prospering Nigeria will live a longer life with far greater opportunities than a baby born into a poor family.

Gender inequalities will exacerbate these discrepancies even further, with a boy likely to spend more than 10 years in school, compared to the three years of schooling that a girl can expect. These disparities are not just a phenomenon in developing countries. Here in the UK a child born in leafy Richmond, South West London can expect to live 15 years longer than one born in Tower Hamlets in the east of the city. That is a year of extra life for every mile covered as you travel across London.

Whilst the marginalised are falling behind, the elite are moving further ahead. In the US, the richest 10% have captured over 90% of economic growth since the recession, while the poor have got poorer. Money yields money and power.

This massive concentration of economic resources in the hands of a few people presents a significant threat to democracy and wider wellbeing. Those with money can use it to buy power and to set the rules, regulations and policies in their favour, creating a cycle of growing inequality and poverty and undermining opportunity.

Politicians and institutions that should represent citizens and keep inequality in check are instead being influenced by the rich and powerful, resulting in policies and actions that further widen the gap between rich and poor. Society becomes a vicious circle where wealth (income, assets and access to resources) and power (particularly political decision-making) are increasingly concentrated in the hands of a few, reinforcing the continued marginalisation and exclusion of the many. We saw this in the response to the financial crisis, with the banks and bankers bailed out whilst the poorest in society were left to suffer the costs of their risk-taking.

Everywhere I travel I see evidence of this. Women’s low status in society means that the issue of maternal health is neglected in budget allocations. The wives, sisters and daughters of the rich and powerful give birth safely in sparkling new private hospitals, so policymakers have very little incentive to care about the health-care provisions for the half of all women in sub-Saharan Africa who give birth in unsafe conditions without trained support.

It is clear that to eliminate poverty and achieve social justice we need to look beyond the country-level average and understand and address how resources, wealth, power and voice are distributed.

Breaking the cycle of inequality
We know change is possible when governments make the right choices and are accountable to the many, not the few. Countries like Bolivia and Brazil, for example, have in the last decade managed to grow their economies whilst making them more equal. Brazil has achieved this through targeted policies, including an increase in the minimum wage that has seen the poorest 10% receive an income growth above the national average, compared to the rich, who have had income growth below the average.

Bolivia has seen a much sharper fall in inequality, with its government introducing a range of new progressive spending programmes while, crucially, funding them by renegotiating the country’s oil and gas tax revenue. Conversely, robust growth in Zambia, averaging 4.6% between 2000 and 2006, was almost entirely captured by the richest 10%, who increased their share of the country’s wealth by more than 9% while poverty rates increased by almost 4%. When I visited Zambia last year for the first time in a decade, it had moved from low to middle income status. The economy had grown but there were actually more poor people.

Extreme inequality is not inevitable but is the result of policy choices. Different choices can reverse it: free public health services that help everyone while ensuring the poor are not left behind; decent wages that end working poverty; and progressive taxation so that the rich pay their fair share. Governments also need to ensure that there is space for people to have their voices heard to rebalance the power of political influence.

Whilst the Pope tweets and the World Bank blogs about inequality, and as new data raises even louder alarm bells, governments and policymakers around the world can choose to seize this opportunity and be leaders in challenging inequality and restoring social and economic justice. Governments everywhere must commit to a more progressive agenda for redistribution and for a fairer world. Power and special interests must not be allowed to push us to the alternative of being tipped irrevocably into a world that caters only for the privileged.

 


Mark Goldring is the Chief Executive of Oxfam GB. He will be speaking at the Resurgence & Ecologist Festival of Wellbeing on 11 October 2014 in London. For more information and bookings: Festival of Wellbeing bookings

 

 





‘New’ reactor types are all nuclear pie in the sky





Some nuclear enthusiasts and lobbyists favour non-existent Integral Fast Reactors, others favour non-existent Liquid Fluoride Thorium Reactors, others favour non-existent Pebble Bed Modular Reactors, others favour non-existent fusion reactors. And on it goes.

Two to three decades ago, the nuclear industry promised a new generation of gee-whiz ‘Generation IV’ reactors in two to three decades. That’s what they’re still saying now, and that’s what they’ll be saying two to three decades from now. The Generation IV International Forum website states:

“It will take at least two or three decades before the deployment of commercial Gen IV systems. In the meantime, a number of prototypes will need to be built and operated. The Gen IV concepts currently under investigation are not all on the same timeline and some might not even reach the stage of commercial exploitation.”

The World Nuclear Association notes that “progress is seen as slow, and several potential designs have been undergoing evaluation on paper for many years.”

Integral Fast Reactors … it gets ugly moving from blueprint to backyard

Integral Fast Reactors (IFRs) are a case in point. According to the lobbyists they are ready to roll, will be cheap to build and operate, couldn’t be used to feed WMD proliferation, etc. The US and UK governments have been analysing the potential of IFRs.

The UK government found that:

  • the facilities have not been industrially demonstrated;
  • waste disposal issues remain unresolved and could be further complicated if it is deemed necessary to remove sodium from spent fuel to facilitate disposal; and
  • little could be ascertained about cost since General Electric Hitachi refuses to release estimates of capital and operating costs, saying they are “commercially sensitive”.

The US government has also considered the use of IFRs (which it calls Advanced Disposition Reactors – ADR) to manage US plutonium stockpiles and concluded that:

  • the ADR approach would be more than twice as expensive as all the other options under consideration;
  • it would take 18 years to construct an ADR and associated facilities; and
  • the ADR option is associated with “significant technical risk”.

Unsurprisingly, the IFR rhetoric doesn’t match the sober assessments of the UK and US governments. As nuclear engineer Dave Lochbaum from the Union of Concerned Scientists puts it:

“The IFR looks good on paper. So good, in fact, that we should leave it on paper. For it only gets ugly in moving from blueprint to backyard.”

Small Modular Reactors … no-one actually wants to buy one

In any case, IFRs are yesterday’s news. Now it’s all about Small Modular Reactors (SMRs). The Energy Green Paper recently released by the Australian government is typical of the small-is-beautiful rhetoric:

“The main development in technology since 2006 has been further work on Small Modular Reactors (SMRs). SMRs have the potential to be flexibly deployed, as they are a simpler ‘plug-in’ technology that does not require the same level of operating skills and access to water as traditional, large reactors.”

The rhetoric doesn’t match reality. Interest in SMRs is on the wane. Thus Thomas W. Overton, associate editor of POWER magazine, wrote in a recent article:

“At the graveyard wherein resides the “nuclear renaissance” of the 2000s, a new occupant appears to be moving in: the small modular reactor (SMR). … Over the past year, the SMR industry has been bumping up against an uncomfortable and not-entirely-unpredictable problem: It appears that no one actually wants to buy one.”

Overton notes that in 2013, MidAmerican Energy scuttled plans to build an SMR-based plant in Iowa. This year, Babcock & Wilcox scaled back much of its SMR program and sacked 100 workers in its SMR division. Westinghouse has abandoned its SMR program. As he explains:

“The problem has really been lurking in the idea behind SMRs all along. The reason conventional nuclear plants are built so large is the economies of scale: Big plants can produce power less expensively per kilowatt-hour than smaller ones.

“The SMR concept disdains those economies of scale in favor of others: large-scale standardized manufacturing that will churn out dozens, if not hundreds, of identical plants, each of which would ultimately produce cheaper kilowatt-hours than large one-off designs.

“It’s an attractive idea. But it’s also one that depends on someone building that massive supply chain, since none of it currently exists. … That money would presumably come from customer orders – if there were any. Unfortunately, the SMR “market” doesn’t exist in a vacuum.

“SMRs must compete with cheap natural gas, renewables that continue to decline in cost, and storage options that are rapidly becoming competitive. Worse, those options are available for delivery now, not at the end of a long, uncertain process that still lacks [US Nuclear Regulatory Commission] approval.”

Can’t find customers, can’t find investors

Dr Mark Cooper, Senior Fellow for Economic Analysis at the Institute for Energy and the Environment, Vermont Law School, notes that two US corporations are pulling out of SMR development because they cannot find customers (Westinghouse) or major investors (Babcock and Wilcox). Cooper points to some economic constraints:

“SMR technology will suffer disproportionately from material cost increases because they use more material per MW of capacity. Higher costs will result from: lost economies of scale; higher operating costs; and higher decommissioning costs. Cost estimates that assume quick design approval and deployment are certain to prove to be wildly optimistic.”

Academics M.V. Ramana and Zia Mian state in their detailed analysis of SMRs: “Proponents of the development and large scale deployment of small modular reactors suggest that this approach to nuclear power technology and fuel cycles can resolve the four key problems facing nuclear power today: costs, safety, waste, and proliferation.

“Nuclear developers and vendors seek to encode as many if not all of these priorities into the designs of their specific nuclear reactor. The technical reality, however, is that each of these priorities can drive the requirements on the reactor design in different, sometimes opposing, directions.

“Of the different major SMR designs under development, it seems none meets all four of these challenges simultaneously. In most, if not all designs, it is likely that addressing one of the four problems will involve choices that make one or more of the other problems worse.”

The future is in … decommissioning

Likewise, Kennette Benedict, Executive Director of the Bulletin of the Atomic Scientists, states: “Without a clear-cut case for their advantages, it seems that small nuclear modular reactors are a solution looking for a problem.

“Of course in the world of digital innovation, this kind of upside-down relationship between solution and problem is pretty normal. Smart phones, Twitter, and high-definition television all began as solutions looking for problems.

“In the realm of nuclear technology, however, the enormous expense required to launch a new model as well as the built-in dangers of nuclear fission require a more straightforward relationship between problem and solution.

“Small modular nuclear reactors may be attractive, but they will not, in themselves, offer satisfactory solutions to the most pressing problems of nuclear energy: high cost, safety, and weapons proliferation.”

And as Westinghouse CEO Danny Roderick said in January: “The problem I have with SMRs is not the technology, it’s not the deployment – it’s that there’s no customers.”

Instead of going for SMRs, IFRs, Pebble Bed Reactors or thorium technologies, Westinghouse is looking to triple the one area where it really does have customers: its decommissioning business. “We see this as a $1 billion-per-year business for us”, Roderick said.

With the world’s fleet of mostly middle-aged reactors inexorably becoming a fleet of mostly ageing, decrepit reactors, Westinghouse is getting ahead of the game.

The writing is on the wall

Some SMR R&D work continues but it all seems to be leading to the conclusions mentioned above. Argentina is ahead of the rest, with construction underway on a 27 MWe reactor – but the cost equates to an astronomical US$15.2 billion per 1,000 MWe. Argentina’s expertise with reactor technology stems from its covert weapons program from the 1960s to the early 1980s.
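That per-1,000-MWe figure is just a linear normalisation of capital cost to generating capacity. A minimal sketch of the arithmetic, using only the numbers quoted above (the ~US$410 million implied capital cost is back-calculated from them, not an independently sourced project figure):

```python
# Normalise a reactor's capital cost to a per-1,000-MWe basis so plants
# of very different sizes can be compared on cost.

def cost_per_1000_mwe(capex_usd: float, capacity_mwe: float) -> float:
    """Capital cost scaled linearly to 1,000 MWe of capacity."""
    return capex_usd / capacity_mwe * 1000

# Implied capital cost of the 27 MWe Argentine reactor, back-calculated
# from the US$15.2 billion per 1,000 MWe figure quoted in the text
# (an illustrative estimate, not an official project cost).
implied_capex = 15.2e9 * 27 / 1000  # roughly US$410 million

print(round(cost_per_1000_mwe(implied_capex, 27) / 1e9, 1))  # → 15.2
```

The same normalisation makes the contrast with the large EPRs below direct: the two 1,600 MW Hinkley Point reactors at a quoted US$26 billion work out at roughly US$8 billion per 1,000 MWe, around half Argentina’s small-reactor figure but still extraordinarily high.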

So work continues on SMRs but the writing’s on the wall and it’s time for the nuclear lobby to come up with another gee-whiz next-gen fail-safe reactor type to promote … perhaps a giant fusion reactor located out of harm’s way, 150 million kilometres from Earth.

And while the ‘small is beautiful’ approach is faltering, so too is the ‘bigger is better’ mantra. The 1,600 MW Olkiluoto-3 European Pressurized Reactor (EPR) under construction in Finland is nine years behind schedule (and counting) and US$6.9 billion over-budget (and counting).

The UK is embarking on a hotly-contested plan to build two 1,600 MW EPRs at Hinkley Point with a capital cost of US$26 billion and mind-boggling public subsidies.

Economic consulting firm Liberum Capital said Hinkley Point will be “both the most expensive power station in the world and also the plant with the longest construction period.”

 


 

Dr Jim Green is the national nuclear campaigner with Friends of the Earth Australia and editor of the Nuclear Monitor newsletter. Nuclear Monitor is published 20 times a year. It has been publishing deeply researched, often strongly critical articles on all aspects of the nuclear cycle since 1978. A must-read for all those who work on this issue!

 

 





Billionaires against fossil fuels





The latest fund to announce its divestment from fossil fuels is none other than the heir to the Rockefeller fortune, built on oil and coal.

Coinciding with today’s UN Climate Change Summit in New York, the Rockefeller Brothers Fund said that not only would it pull vast sums of money out of fossil fuels, but that it would funnel the money into clean energy.

This latest announcement is further evidence that the divestment movement is gaining traction and snowballing, fast.

Institutions across the globe have begun to pledge to divest from fossil fuels in support of the climate change campaign. This list includes the British Medical Association and the Church of Sweden.

The combined asset size of the 837 institutions and individuals committing to divest amounts to more than $50 billion, campaign group 350.org has calculated. 

$50 billion moving out of fossil fuels

The move towards rapid divestment by individuals and institutions reflects growing support for the climate change movement.

The demand for climate change action was evident on Sunday when an estimated 40,000 people took to the streets of London for the People’s Climate March, part of over 2,000 protests around the world in a bid to make world leaders take solid action towards stopping climate change.

The movement also took New York by storm with an estimated 400,000 marchers, as well as Rio, Jakarta, Brisbane and hundreds of cities around the world.

In New York, many of the 50,000 students, faith groups, state contingents, and groups carrying banners representing cities or towns, also wore orange squares representing fossil fuel divestment.

Records show that 181 institutions and local governments and 656 individuals, representing over $50 billion, have pledged to divest to date.

That number includes the $860 million which will be redirected from fossil fuels by the Rockefeller Brothers Fund. The report indicates that divestment commitments have doubled in the eight months since January 2014.

But emissions keep on increasing

Yet carbon dioxide emissions, the main contributor to global warming, are set to rise again in 2014 – reaching a record high of 40 billion tonnes, according to research from the University of East Anglia (UEA).

The 2.5% projected rise in burning fossil fuels has been revealed by the Global Carbon Project, which is co-led in the UK by researchers at the Tyndall Centre for Climate Change Research at UEA and the College of Engineering, Mathematics and Physical Sciences at the University of Exeter.

The latest annual update of the Global Carbon Budget shows that total future CO2 emissions cannot exceed 1,200 billion tonnes – for a likely 66% chance of keeping average global warming under two degrees Celsius.

At the current rate of CO2 emissions, this 1,200 billion tonne CO2 'quota' would be used up in around 30 years. This means that there is just one generation before the two-degree limit may be breached.
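The 30-year figure follows directly from the numbers above: dividing the remaining budget by the current annual emissions rate. A minimal sketch, using only the figures reported in this article:

```python
# Back-of-the-envelope carbon budget arithmetic, using the article's figures:
# a remaining budget of ~1,200 billion tonnes of CO2 (for a likely 66% chance
# of staying under 2°C) and annual emissions of ~40 billion tonnes (2014).
remaining_budget_gt = 1200  # billion tonnes CO2
annual_emissions_gt = 40    # billion tonnes CO2 per year

years_left = remaining_budget_gt / annual_emissions_gt
print(f"Budget exhausted in about {years_left:.0f} years")  # → about 30 years
```

Note this assumes emissions stay flat at the 2014 rate; with continued growth, as the researchers warn, the budget would be exhausted sooner.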

‘Unburnable’ carbon

To avoid this, a team of international climate scientists have said that more than half of all fossil fuel reserves may need to be left in the ground and are essentially ‘unburnable’.

Professor Corinne Le Quéré, Director of the Tyndall Centre at UEA, said: "The human influence on climate change is clear. We need substantial and sustained reductions in CO2 emissions from burning fossil fuels if we are to limit global climate change.

"We are nowhere near the commitments necessary to stay below two degrees Celsius of climate change, a level that will be already challenging to manage for most countries around the world, even for rich nations."

Professor Pierre Friedlingstein, from the University of Exeter, said: "The time for a quiet evolution in our attitudes towards climate change is now over. Delaying action is not an option – we need to act together, and act quickly, if we are to stand a chance of avoiding climate change not long into the future, but within many of our own lifetimes."

He added: “We have already used two-thirds of the total amount of carbon we can burn, in order to keep warming below the crucial two degrees Celsius level. If we carry on at the current rate we will reach our limit in as little as 30 years’ time – and that is without any continued growth in emission levels.

“The implication of no immediate action is worryingly clear – either we take a collective responsibility to make a difference, and soon, or it will be too late.”

This article was originally published by Trillion Fund.