Monthly Archives: May 2016

Climate Negotiations: tackling the big questions before COP22

The first week of the negotiations started slowly, and ended even more slowly. Negotiators look as if they have some sort of hangover, still celebrating the Paris Agreement. And while discussions take place inside the UN building in Bonn, Sri Lanka tries to recover from the worst floods in its history, India reports the hottest day ever recorded in the country, and Carbon Brief warns that we have only five years until the 1.5°C carbon budget is blown.


As the international community prepares for November’s COP22 in Morocco, this is the first round of technical discussions since COP21 in December, which resulted in an historic agreement signed by almost 200 nations. As the dust settles after the victory of the Paris Agreement, the burning questions now concern its implementation.


The negotiations opened last week with a plenary welcome from COP president Ségolène Royal, who has replaced Laurent Fabius. Christiana Figueres, the outgoing UNFCCC Executive Secretary, reminded everyone to put human rights at the centre of development, while the incoming COP22 president, Salaheddine Mezouar, said that the upcoming COP in Marrakech will be one of action, emphasising the importance of preparing ‘the roadmap’ to mobilise $100bn by the year 2020. On Wednesday, the UN Secretary-General, Ban Ki-moon, officially announced the appointment of Patricia Espinosa of Mexico as the next UNFCCC Executive Secretary.


The major political issue now being discussed in Bonn is financing: how to find and allocate the $100bn (by 2020) agreed in Paris. Unsurprisingly, this has been the key concern, especially for the developing countries that are most affected by climate change.


Adaptation is, of course, another issue for frontline nations, whose argument is that these talks cannot focus solely on mitigation, especially when so many countries are already feeling the impacts of climate change. There is also a sense that countries must now realistically review their pledges and adopt much more ambitious commitments to get to 1.5 degrees.


Most important, perhaps, is the creation of the APA – a new Ad Hoc Working Group which will define the modalities of implementing the Paris Agreement, working together with the Subsidiary Bodies for Implementation and for Scientific and Technological Advice.


Sarah Baashan of Saudi Arabia and New Zealand’s Jo Tyndall are the two newly nominated female co-chairs of the APA. Defining the agenda of this new group proved difficult in the course of the first week. The G77 and China proposed amendments to ensure that it will not focus only on mitigation measures; what they want is attention to other elements such as adaptation, loss and damage, finance, technology transfer and capacity building. Because of this, the APA negotiations were suspended until an agenda was agreed on Friday. Outside the negotiations, France ratified the Paris Agreement.


The developing countries Ecuador, Guatemala and Bolivia have called for rules on the participation of observers at COP22, as some have “commercial interests”. The move was supported by civil society organisations such as Corporate Accountability International, which highlighted the heavy presence of the fossil fuel industry during the climate negotiations in Paris.


As we enter the second week of the technical talks, everything indicates that the world’s governments have a long way to go and, in truth, must return to the negotiating table. With more and more citizens around the world pressing for immediate action, it remains to be seen whether our representatives can and will intensify their efforts to achieve the goal of a cleaner, more sustainable future.


Pavlos Georgiadis is an ethnobotanist, climate tracker and agrifood author. He tweets at: @geopavlos


The problem is not glyphosate, or DDT, or BPA – we must challenge the entire system!

Piecemeal, and at long last, chemical manufacturers have begun removing the endocrine-disrupting plastic bisphenol-A (BPA) from products they sell.

Sunoco no longer sells BPA for products that might be used by children under three. France has a national ban on BPA food packaging. The EU has banned BPA from baby bottles.

These bans and associated product withdrawals are the result of epic scientific research and some intensive environmental campaigning. But in truth these restrictions are not victories for human health. Nor are they even losses for the chemical industry.

For one thing, the chemical industry now profits from selling premium-priced BPA-free products. These are usually made with the chemical substitute BPS, which current research suggests is even more of a health hazard than BPA. But since BPS is far less studied, it will likely take many years to build a sufficient case for a new ban.

But the true scandal of BPA is that such sagas have been repeated many times. Time and again, synthetic chemicals have been banned or withdrawn only to be replaced by others that are equally harmful, and sometimes are worse.

Neonicotinoids, which the International Union for the Conservation of Nature (IUCN) holds responsible for a global ecological catastrophe, are modern replacements for long-targeted organophosphate pesticides. Organophosphates had previously supplanted DDT and the other organochlorine pesticides, from whose effects many bird species are only now recovering.

The same is likely to happen with glyphosate – whose authorisation the EU notably failed to renew yesterday. If the EU does ban the herbicide in the next few months, the most likely outcome by far is that farmers will simply reach for another bottle, spraying 2,4-D, dicamba and glufosinate (phosphinothricin) instead.

The ‘complex illusion’ of risk assessment

So the big and urgent question is this: if chemical bans are ineffective (or worse), what should anyone who wants to protect themselves and everyone else from flame retardants, pesticides, herbicides, endocrine disruptors, plastics and so on – but who doesn’t expect much help from their government or the polluters themselves – do?

What would an effective grassroots strategy for the protection of people and ecosystems from toxic exposures look like?

  • Ought its overarching goal to be a reduction in total population exposures and/or fewer chemical sales?
  • Or should it aim for sweeping bans, such as of entire chemical classes?
  • Or bans on specific usages (e.g. in all food or in all of agriculture)?
  • Or on chemical use in particular geographic locations (e.g. in or around all schools)?
  • Or perhaps a better demand would be the dismantling (with or without replacement) of existing regulatory agencies, such as the culpable EPA?
  • Or should chemical homicide be made a statutory crime? Or all of these together?
  • And last, but not least, how can such goals be achieved given the finances and politics of our age?

The first task of chemical campaigning is to strip away the mythologies which currently surround the science of toxicology and the practice of chemical risk assessment. When we do this we find that chemical regulations don’t work.

The chief reason, which is easy to demonstrate, is that the elementary experiments performed by toxicologists are incapable of generating predictions of safety that can usefully be applied to other species, or even to the same species when it exists in other environments or if it eats other diets. Numerous scientific experiments have shown this deficiency, and consequently that the most basic element of chemical risk assessment is scientifically invalid.

For this reason, and many others too, the protection chemical risk assessments claim to offer is a pretense. As I will show, risk assessment is not a reality, it is a complex illusion.

This diagnosis may seem improbable and also depressing, but instead it reveals promising new political opportunities to end pollution and create a sustainable world. Because even in the world of chemical pollution, the truth can set you free.

The ensuing discussion, it should be noted, makes no significant effort to distinguish human health effects from effects on ecological systems. While these are often treated under separate regulatory jurisdictions, in practice, risks to people and ecosystems are difficult if not impossible to separate.

The story of the toxicological alarms surrounding BPA, which are diverse and scientifically extremely well substantiated, makes an excellent starting point for this task.

Ignoring the full toxicity of BPA

According to the scientific literature, exposure to BPA in adulthood has numerous effects. It leads to stem cell and sperm cell defects (humans), prostate cancer (humans), risk of breast cancer (human and rats), blood pressure rises (humans), liver tumours and obesity (humans and mice) (Grun and Blumberg 2009; Bhan et al., 2014; Prins 2014).

However, foetuses exposed to BPA suffer from a significantly different spectrum of harms. These range from altered organ development (in monkeys) to food intolerance (in humans) (Ayyanan et al., 2011; Menard 2014; vom Saal et al., 2014).

Also in humans, early BPA exposures can lead to effects that are nevertheless delayed until much later in life, including psychiatric, social and behavioural abnormalities indicative of altered brain functions (Braun et al., 2011; Perera et al., 2012; Evans et al., 2014).

The above examples are just a representative handful. They are drawn from a much larger body of at least 200 publications (some have estimated a thousand publications) finding harms from BPA. The sheer quantity of results, the diversity of species tested, of consequences found, and of scientific methodologies used, represent a massive accumulation of scientific evidence that BPA is harmful (reviewed in Vandenberg et al., 2012). The evidence against BPA being safe, in short, is as close to unimpeachable as science can manage.

Set against such a large evidence base, however, anti-BPA campaigning has been only partially successful. All the bans and the commercial withdrawals still ignore the implications of some of the most alarming scientific findings of all. For example, bans on baby bottles will not prevent foetal exposure. Nor will they prevent harms that result even from very low doses of BPA.

Ignoring the toxicity of BPS

The chemical most frequently used to make BPA-free products is called BPS. As its name implies, BPS is very similar in chemical structure to BPA (see Fig. 1). However, BPS appears to be absorbed by the human body significantly more readily than BPA and is already detectable in 81% of Americans (Liao et al., 2012).

Research into the toxicology of BPS is still at an early stage, but BPS is now looking likely to be even more toxic than BPA (Rochester and Bolden 2015). Like BPA, BPS has been found to interfere with mammalian hormonal activity. To a greater extent than BPA, BPS alters nerve cell creation in the zebrafish hypothalamus and causes behavioral hyperactivity in exposed zebrafish larvae (Molina-Molina et al., 2013; Kinch et al., 2015).

These latter results were observed at the extremely low chemical concentration of 0.0068 µM. This is 1,000-fold lower than the official U.S. level of acceptable human exposure. The researchers chose this dose because it is the concentration of BPA in the river that passes their laboratory.

Chemical substitutions are business as usual

The substitution of one synthetic chemical for another, wherein the substitute later turns out to be hazardous, is not a new story. Indeed, a great many of the chemicals that environmental campaigners nowadays oppose (such as Monsanto’s best-selling herbicide Roundup) are still considered by many in their industries to be ‘newer’ and ‘safer’ substitutes for chemicals (such as 2,4,5-T) that are no longer widely used.

Thus, when the EU banned the herbicide atrazine, Syngenta replaced it with terbuthylazine. Terbuthylazine is chemically very similar and, according to University of California researcher Tyrone Hayes, it appears to have similar ecological and health effects.

The chemical diacetyl was forced off the market for causing ‘popcorn lung’. However, it has been largely replaced by dimers and trimers of the same chemical. Unfortunately, the safety of these multimers is highly dubious since it is believed that, in use, they break down into diacetyl.

The Bt pesticides produced inside GMO crops are considered (by farmers and agribusiness) to be safer substitutes for organochlorine, carbamate, and organophosphate insecticides. These chemicals replaced DDT, which was banned in agriculture following Rachel Carson’s Silent Spring. DDT was itself the replacement for lead-arsenate. Many other examples of what are sometimes called regrettable substitutions can be found.

Chemical bans (or often manufacturer withdrawals) that precede such substitutions are nevertheless normally celebrated as campaigning victories. But the chemical manufacturers know that substitution is an ordinary part of business. Because weeds and pests become resistant and patents run out, they are usually looking for substitutes irrespective of any environmental campaigning.

Manufacturers also know that, since approvals and permits initially rely primarily on data supplied by the applicant (and which is often anyway incomplete), problems with safety typically manifest only later, as independent data and practical experience accumulate. Given this current system it is almost inevitable that older (or more widely used) chemicals typically have a dubious safety record while newer ones are considered ‘safer’.

‘Bad actors’: the rotten apple defence in toxicology

In these cycles of substituting one toxin for another, BPA is likely to become a classic.

Environmental health non-profits become active participants in this toxic treadmill when they implicitly treat certain chemicals as rotten apples. Some even explicitly refer to particular chemicals as ‘bad actors’. The chemical ‘bad actor’ framing strongly implies that the methods and institutions of chemical regulation are not at fault.

But we can ask the question, in what chemical or biological sense can BPA be termed a bad actor? Is there, for example, a specific explanation for how it slipped through the safety net?

The very short answer to this question is to recall the results noted above: BPA impairs mammalian hormonal and reproductive systems; it disrupts brain function; it impacts stem cell development; it causes obesity and probably cancer; it causes erectile dysfunction. Many hundreds of research papers attest that BPA’s harmful effects are numerous, diverse, prolonged, reproducible and found in many species. In short, they are easy to detect (e.g. vom Saal et al., 2014).

So while hundreds of scientists outside the regulatory loop have found problems, the formal chemical regulatory system has never flagged BPA. This is astonishing given that, long before it was thought of as a plastic, BPA first came to the attention of science in specific searches for estrogen-mimicking (i.e. hormone-disrupting) compounds.

And despite the overwhelming nature of the published evidence regulators still resist concluding that BPA is a health hazard. And so the clear answer to the ‘bad actor’ question is that there is no special reason why BPA should have slipped through the regulatory process; instead, the case of BPA strongly suggests a dysfunctional regulatory system.

Framing the problem of pollution as being caused by a few ‘bad actor’ chemicals is equally inconsistent with the facts in other cases. Chemical regulatory systems initially approved, but have sometimes later banned or restricted (and always under public pressure), chemicals including atrazine, endosulfan, Roundup (glyphosate), lindane, methyl bromide, methyl iodide, 2,4,5-T, chlorpyrifos, DDT and others.

Many other chemicals are strongly implicated as harmful by extensive and compelling independent scientific evidence that has so far not been acted on. And of course, chemical regulators have graduated whole classes of ‘bad actors’: the organophosphate pesticides, PCBs, organochlorine pesticides, chlorofluorocarbons, neonicotinoids, phthalates, flame retardants, perfluorinated compounds, and so on.

How many bad actors ought it to take before we instead indict the whole show?

Chemical regulation in theory and practice: the limits of toxicology

An alternative to judging regulatory systems by their results is to analyse them directly and assess their internal logic and rigour. Thus one can ask: what is known about the technical limitations of toxicology and the overall scientific rigour of chemical risk assessment?

And, secondly, one can direct attention to the social and institutional practices of chemical regulation. Are chemical risk assessments, for example, being applied by competent and well-intentioned institutions?

The technical limitations of chemical risk assessment are rarely discussed in detail (but see Buonsante et al., 2014). A full discussion would be lengthy, but some of the most important limitations are outlined in the paragraphs below.

The standard assays of toxicology involve the administration (usually oral feeding) of chemicals in short term tests of up to 90 days to defined strains of organisms (most often rats or mice). These test organisms are of a specified age and are fed standardised diets.

The results are then extrapolated to other doses, other age groups and other environments. Such experiments are used to create estimates of harm. Together with estimates of exposure they form the essence of chemical risk assessment. When specific chemicals are flagged as being worthy of further interest, other techniques may be brought to bear. These may include epidemiology, cell culture experiments, and biological modeling, but the basis of risk assessment is always the estimation of exposure and the estimation of harm.

To say that both estimates are prone to error, however, is an understatement.

Part I: limits to estimating chemical exposures

Fifty years ago no one knew that many synthetic chemicals would evaporate at the equator and condense at the poles, from where they would enter polar ecosystems.

Neither did scientists appreciate that all synthetic fat-soluble compounds that were sufficiently long-lived would bio-accumulate as they rose up the food chain and thus reach concentrations inside organisms sometimes many millions of times above background levels.
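The scale of bio-accumulation follows from simple multiplicative arithmetic. A minimal sketch, assuming a purely illustrative concentration factor of ten per trophic level (real factors vary widely by compound, organism and ecosystem):

```python
# Illustrative sketch: bio-accumulation compounds multiplicatively up a
# food chain. The ten-fold factor per level is a hypothetical round
# number, not a measured value for any real compound.

def biomagnify(background_conc: float, factor_per_level: float, levels: int) -> float:
    """Concentration after a compound passes up `levels` trophic levels."""
    return background_conc * factor_per_level ** levels

# A long-lived, fat-soluble compound at 1 unit in water, concentrated
# ten-fold at each of six trophic levels (plankton up to top predator):
print(biomagnify(1.0, 10.0, 6))  # 1000000.0, i.e. a million times background
```

Six ten-fold steps already reach the ‘millions of times above background’ order of magnitude, which is why persistence and fat solubility matter more than the initial concentration in water.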

Nor until recently was it understood that sea creatures such as fish and corals would become major consumers of the plastic particles flushed into rivers. These misunderstandings are all examples of historic errors in estimating real world exposures to toxic substances.

A broad limitation of these estimates is that real-world exposures are very complex. For instance, commercial chemicals are often impure or not well defined. Thus PVC plastics are a complex mixture of polymers and may be further mixed with cadmium or lead (in varied concentrations).

One implication of this is that it is impossible for experiments contributing to risk assessment to be ‘realistic’. The reason is that actual exposures are always unique to individual organisms and vary enormously in their magnitude, duration, variability, and speed of onset, all of which influence the harm they cause. Whose specific reality would realism mimic?

Additionally, many regulatory decisions do not recognise that exposures to individual chemicals typically come from multiple sources. This failing is often revealed following major accidents or contamination events. Regulatory agencies will assert that actual accident-related doses do not exceed safe limits. However, such statements usually ignore that, because regulations function in effect as permits to pollute, many affected people may already be receiving significant exposures for that chemical prior to the accident.

Returning to the specific case of BPA, no one appreciated until 2013 that the main route of exposure to BPA in mammals is absorption through the mouth and not the gut. The mouth is an exposure route whose venous blood supply bypasses the liver, which allows BPA to circulate unmetabolised in the bloodstream (Gayrard et al., 2013).

Before this was known, many toxicologists explicitly denied the plausibility of measurements showing high BPA concentrations in human blood. They had assumed that BPA was absorbed via the gut and rapidly degraded in the liver.

Part II: limits to estimating harms

Similarly significant obstacles are faced in estimating harm. Many of these obstacles originate from the obvious fact that organisms and ecosystems are enormously biologically diverse.

The solution adopted by chemical risk assessment is to extrapolate. Extrapolation allows the results of one or a few experiments to ‘cover’ other species and other environmental conditions.

Most of the assumptions required for such extrapolations, however, have never been scientifically validated. Lack of validation is most obvious for species not yet discovered or those that are endangered. But in other cases they are actively known to be invalid (e.g. Seok et al., 2013).

For example, in their responses to specific chemicals, rats often do not extrapolate to humans. Indeed, they often do not extrapolate even to other rats. Thus individual strains of rats respond differently (which, of course, is why they get used); young and old rats also give different responses, as do male and female rats (vom Saal et al., 2014), and rats fed non-standard diets (Mainigi and Campbell, 1981).

Even more extreme extrapolations are employed in ecological toxicology. For example, data on adult honey bees is typically extrapolated to every stage of the bee life cycle, to all other bee species, and sometimes to all pollinators, without the experimenters citing any supporting evidence. Such extrapolations may seem absurd but they are the primary basis of the claim that chemical risk assessment is comprehensive.

There are many other limits to estimating harm. Until it was too late, scientists were not aware that a human with an 80-year lifespan could have a window of vulnerability to a specific chemical as short as four days. Neither was it known that the effects of chemicals could be strongly influenced by the time of day they are ingested.

Another crucially important limitation is that, for budgetary and practical reasons, toxicologists necessarily focus on a limited number of specific ‘endpoints’. An endpoint is whatever characteristic the experimenter chooses to measure. Typical endpoints are death (mortality), cancers, organism weight, and organ weights; but endpoints can even be more subtle measures like neurotoxicity.

There is a whole politics associated with the choice of endpoints, which reflects their importance in toxicology, including allegations that endpoints are sometimes chosen for their insensitivity rather than their sensitivity; but the inescapable point is that no matter what endpoints are chosen, there is a much vaster universe of unmeasured endpoints.

These typically include: learning defects, immune dysfunction, reproductive dysfunction, multigenerational effects, and so on. Ultimately, most potential harms don’t get measured by toxicologists and so are missing from risk assessments.

Another example of the difficulty of estimating real life harms is that organisms are exposed to mixtures of toxins (Goodson et al., 2015). The question of toxin mixtures is extremely important (Kortenkamp, 2014).

All real life chemical exposures occur in combinations, either because of previous exposure to pollutants or because of the presence of natural toxins. Many commercial products moreover, such as pesticides, are only available as formulations (i.e. mixtures) whose principal purpose is to enhance the potency of the product. Risk assessments, however, just test the ‘active ingredient’ alone (Richard et al., 2005).

Consider too that all estimates of harm depend fundamentally on the assumption of a linear (or at least simple) dose-response relationship for the effect of each chemical. This assumption is necessary to estimate the harms of doses that are higher or lower than, or in between, the tested doses.

The assumption of a linear response is rarely tested, yet for numerous toxins (notably endocrine disrupting chemicals) a linear dose-response relationship has been disproven. Thus the question for any risk assessment is whether the assumption is reliable for the novel compound under review (reviewed in Vandenberg et al., 2012).
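To make the stakes of the linearity assumption concrete, here is a minimal numerical sketch; the response values below are invented for illustration and do not describe any real chemical:

```python
# Risk assessment's working assumption: the effect observed at a tested
# dose can be scaled proportionally to any other dose.

def linear_extrapolation(effect_at_test_dose: float, test_dose: float, dose: float) -> float:
    return effect_at_test_dose * dose / test_dose

# Extrapolating from a hypothetical effect of 50 units at a tested dose
# of 100 down to an environmental dose of 1 predicts a tiny effect:
print(linear_extrapolation(50.0, 100.0, 1.0))  # 0.5

# But for a non-monotonic (e.g. endocrine-disrupting) response, the true
# low-dose effect can be far larger. An invented U-shaped response:
def true_effect(dose: float) -> float:
    if dose < 5:        # strong effect at very low doses
        return 40.0
    elif dose < 50:     # weak effect at intermediate doses
        return 5.0
    return 50.0         # strong effect at high doses

print(true_effect(1.0))  # 40.0: the linear prediction is off 80-fold
```

The point is not the invented numbers but the shape: once a dose-response curve is non-monotonic, no amount of proportional scaling from a single tested dose can recover the low-dose effect.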

Replacing doubts with false certainty

To summarise, the process of chemical risk assessment relies on estimating real world exposures and their potential to cause harm by extrapolating from one or a few simple laboratory experiments. The resulting estimates come with enormous uncertainty. In many cases the results have been extensively critiqued and shown to be either dubious or actively improbable (Chandrasekera and Pippin, 2013).

Yet extrapolation continues – even though we know that the various errors must multiply – because the alternative is to actually measure these different species, using different mixtures and under different circumstances. Given the challenges this would entail, the continued reliance on simplistic assumptions is understandable.
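The multiplication of errors can itself be illustrated with hypothetical numbers: if each extrapolation step carries its own independent uncertainty factor, the factors compound.

```python
# Illustrative only: hypothetical uncertainty factors for independent
# extrapolation steps. The step names and values are invented, not
# taken from any regulatory guideline.
steps = {
    "one rat strain -> other strains": 3.0,
    "rat -> human": 10.0,
    "standard lab diet -> real diets": 3.0,
    "single chemical -> real-world mixtures": 10.0,
}

combined = 1.0
for factor in steps.values():
    combined *= factor  # uncertainties multiply across independent steps

print(combined)  # 900.0: the point estimate could be off ~900-fold
```

Even with modest per-step factors, a chain of four extrapolations leaves the final estimate uncertain by nearly three orders of magnitude.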

Nevertheless, one might have thought that such important limitations and assumptions would be frequently noted as caveats to risk assessments. They should be, but they are not.

Following the UK’s traumatically disastrous outbreak of BSE (mad cow disease) in the 1980s, during which most of the UK population was exposed to infectious prions following highly questionable scientific advice, this exact recommendation was made in the Phillips report. Lord Phillips proposed that such caveats should be specifically explained to non-scientific recipients of scientific advice. In practice, however, Phillips changed nothing.

When an unusual scientific document does discuss the limitations of chemical risk assessment (such as this description of the failure of interactions between pesticides to extrapolate between closely related species), it rapidly becomes obvious just how much the knowledge and understanding available to us are dwarfed by actual biological and system complexities. As any biologist ought to expect, the errors multiplied and the standard assumptions of risk assessment were overwhelmed even by ordinary life situations.

For good reason many scientific experts are therefore concerned about the number and quantity of man-made chemicals in our bodies. Recently, the International Federation of Gynecology and Obstetrics linked chemical exposure to the emergence of new diseases and disorders.

They specifically mentioned obesity, diabetes, hypospadias and reproductive dysfunction and noted: “The global health and economic burden related to toxic environmental chemicals is in excess of millions of deaths” (Di Renzo et al., 2015). The Federation acknowledged this to be an underestimate. Nor does it count disabilities.

Conflicts of interest in chemical risk assessment

In addition to the technical difficulties, there is also the problem that the scientists who produce scientific knowledge often have financial (and other) conflicts of interest. Conflicts, we know, lead to biases that impact on science well before it is incorporated into risk assessment (e.g. Lesser et al., 2007).

A fascinating example of apparent unconscious bias comes from a recent survey of scientific publications on the non-target effects of pesticidal GMO (Bt) crops in outdoor experiments. It was commissioned by the Dutch government (COGEM 2014). The report observed that researchers who found negative consequences of GMO Bt crops were disregarding their own findings, even when these were statistically significant.

Even more interesting to the Dutch authors was that the rationales offered for doing so were oftentimes illogical. Typically, researchers were using experimental methods specialised for detecting ecotoxicological effects that were “transient or local”, but when such effects were found the researchers were dismissing the significance of their own results for being either transient or local.

The COGEM report represented prima facie evidence that researchers within a whole academic discipline were avoiding conclusions that would throw doubt on the wisdom of using GMO Bt crops. Apparently the Bt researchers had a prior ideological commitment to finding no harm, of precisely the kind that scientists are supposed not to have.

Corporate capture and institutional dysfunctionality

Chemical regulation occurs primarily within a relatively small number of governmental or ‘independent’ regulatory institutions.

Of these, the United States Environmental Protection Agency (EPA) is the most prominent and widely imitated example. The EPA has a variety of institutional and procedural defects that prevent it from being an effective regulator.

Perhaps the best known of these is allowing self-interested chemical corporations to conduct the experiments and provide the data for risk assessment. This lets them summarise (or even lie about) the results. As Melvin Reuber, a former EPA consultant, once pointed out, it is extraordinarily easy for an independent commercial testing operation to bias or fix the results of a typical toxicology study for the benefit of a client.

How the EPA first allowed corporations to generate and submit their own regulatory data is a story well worth knowing.

In the 1970s, Industrial Bio-Test Laboratories (IBT) was the largest independent commercial testing laboratory in the United States. FDA scientist Adrian Gross discovered that IBT (and other testing companies) were deliberately, consistently and illegally misleading both the EPA and the FDA about their results.

Aided by practices such as hiring a chemist from Monsanto, the manufacturer of PCBs, to test PCBs, IBT created an illusion of chemical safety for numerous pesticides and other chemicals. Many are still in use, including Roundup, atrazine and 2,4-D, all commonly used in US agriculture.

Between them, Canadian regulators drew up a list of 106 questionable chemical registrations, and the FDA identified 618 separate animal studies as invalid due to “numerous discrepancies between the study conduct and data”. Both regulators suppressed their findings.

Senior IBT managers were jailed, but what the scandal had revealed was that whenever results showed evidence of harm – which was often – misleading regulators was standard practice.

More remarkable even than the scandal was EPA’s response. Instead of bringing testing in-house, which would seem the logical response to a system-wide failure of independent commercial testing, EPA instead created a Byzantine system of external reporting and corporate summarising.

The resulting bureaucratic maze ensures that no EPA employee ever sets eyes on the original experiments or the primary data, and only a handful can access even the summarised results. This system has the consequence of excluding any formal possibility that whistleblowing on the part of Federal employees or FOIA requests (from outsiders) might reveal fraudulent or otherwise problematic tests.

EPA calculatedly turned a blind eye to any potential future wrongdoing in the full knowledge that the chemical regulatory system it oversaw was systemically corrupt.

You don’t have to be MAD to work at the EPA, but it helps

Probably more familiar to readers is what is called ‘regulatory capture’. This takes many forms, from offering public servants favours and future jobs to encouraging top-down political interference with regulatory agencies. The cumulative effect is to ensure that the political will within agencies to protect the public is diluted or lost.

Regulatory capture can become a permanent feature of an institution. For example, OECD member countries have an agreement called the Mutual Acceptance of Data (MAD). MAD is appropriately named. It has the effect of explicitly excluding from regulatory consideration most of the peer-reviewed scientific literature (Myers et al., 2009a).

The purported goal of MAD was to elevate experimental practices by requiring certification via Good Laboratory Practice (GLP), a procedure introduced after the IBT scandal (Wagner and Michaels, 2004). GLP is a mix of management and reliability protocols that are standard in industrial laboratories but rare in universities and elsewhere. However, the consequence of accepting MAD has been specifically to exclude from regulatory consideration evidence and data not produced by industry.

The MAD agreement explains much of the regulatory inaction over BPA. Because of MAD, FDA (and also its European equivalent the European Food Safety Authority) have ignored the hundreds of peer-reviewed BPA studies – since they are not GLP – in favor of just two by industry.

These two industry studies, whose credibility and conclusions have been publicly challenged by independent scientists, showed no ill effects of BPA (Myers et al., 2009b).

Whistleblowing at the EPA

Various EPA whistleblowers have described in detail the specifics of their former organisation’s capture by branches of the chemical industry.

Whistleblower William Sanjour has described how regulatory failure was ensured by the organisational structure imposed on the EPA at its Nixon-era inception. The structure of EPA is inherently conflicted since it has the dual functions of both writing and enforcing regulations. Unwillingness to enforce high standards led Sanjour’s superiors to order him to write deliberate loopholes into those regulations. More recently, the EU’s EFSA was similarly caught proposing loopholes for new regulations on endocrine disrupting chemicals. Inserting loopholes is standard practice in the writing of chemical safety regulations.

In the same article, Sanjour also proposed that since corporate capture renders them useless, the public would be better off with no regulatory agencies. In a similar vein, former EPA pesticide scientist Evaggelos Vallianatos called his former employer, at book length, the polluter’s protection agency.

Another EPA whistleblower, David Lewis, this time at EPA’s Office of Water, has shown in court-obtained documents that EPA scientists buried evidence and even covered up deaths so as to formulate regulations that would permit land application of sewage sludge. This sludge was routinely contaminated with pathogens, heavy metals, industrial chemicals, pharmaceuticals, flame retardants, and other known hazardous substances.

The corruption around sewage sludge regulations extended well beyond the EPA. It encompassed other federal agencies, several universities, the National Academy of Science, and municipalities. David Lewis eventually obtained a legal judgement that the City of Augusta, Ga, had fudged the toxicity testing of its own sewage sludge in order to meet EPA guidelines. The city had done so at the request of EPA.

In another recent case, DeSmogBlog obtained, through a Freedom of Information Act request (FOIA), internal documents showing how EPA offered access to its fracking study plans:

“‘[Y]ou guys are part of the team here,’ one EPA representative wrote to Chesapeake Energy as they together edited study planning documents in October 2013, ‘please write things in as you see fit.'”

Even more recently, EPA whistleblower and chemist Dr Cate Jenkins and the non-profit Public Employees for Environmental Responsibility (PEER) successfully sued EPA for suppressing information about toxic effects on 9/11 first responders. The case ended with a judgement showing that EPA had, among numerous egregious acts, created fake email accounts (including for EPA head Lisa Jackson) to evade accountability. According to Judge Chambers, EPA:

“Failed, and failed miserably, over an extended course of time in complying with its discovery obligations and … Court discovery orders”

Judge Chambers also found that EPA worked a “fraud on the Court” through numerous “false claims” and inaccurate claims of privilege which upon examination applied to “none of the documents provided”. The judge also found that EPA deliberately and illegally destroyed an unknown number of documents which should have been under a litigation hold.

The ultimate effect of these institutional defects is that chemical risk assessments in the US and the EU set the bar for approval so low that regulators virtually never decline to approve a chemical. In contrast, the very same institutions set the bar for taking a chemical off the market so high that such an event almost never happens. Yet if both standards were based purely on science, as they claim to be, both bars would be the same height.

This double standard represents the overwhelming bias in the system. At every stage of chemical risk assessment, from the funding of research to the ultimate decision to approve a chemical, the process is dominated by commercial concerns and not by science (as was recently shown yet again).

Beyond any conceivable doubt, inappropriate external influences swamp the scientific content and protective mission of chemical risk assessment.

Chemical risk assessment: can the show be salvaged?

It therefore seems clear that to frame individual chemicals as ‘bad actors’ is incorrect. Chemical risk assessment itself is the problem. This explains both why approved chemicals accumulate red flags when exposed to the scientific process and why the chemicals that replace them are no less harmful.

Specific chemicals like glyphosate and BPA are thus the messengers, and shooting them one by one is not only pointless, it is counterproductive. It distracts and detracts from the infinitely more important truth: that the institutions, the methods, and thus the entire oversight of chemical regulation are failing in what they claim to do, which is to protect us from harm.

Importantly, chemical regulatory systems are not just broken, they are unfixable. Even with the best intentions, such as the full cooperation of all the institutions mentioned here and of the entire academic research community, remedying the technical problems would be a task that is beyond Herculean.

Consider just one of these: the testing of a chemical in combination with others. The testing of mixtures is an improvement often suggested by NGOs, and thousands of scientific studies show that it is an important consideration. The pesticide chlordecone, for example, increases the toxicity of an “otherwise inconsequential” dose of the common contaminant carbon tetrachloride by 67-fold in rats (Curtis et al., 1979).

To test mixtures properly, however, would be astonishingly expensive and enormously costly in experimental animals. According to the US National Toxicology Program, standard 13-week studies of the interactions between just 25 chemicals would require 33 million experiments costing $3 trillion. This is because each chemical needs to be tested against every possible combination of the others.

To study mixtures of all 11,000 chlorinated chemicals in commerce would require 10^3311 experiments. This is more experiments than there are atoms in the universe. Our entire planet would have to devote itself to animal experimentation and the work would still not be done by the end of time itself (Yang 1994). Even then we would only know the toxicity of organochlorines towards a single test species. Would the results be extrapolable to any other species? Well, we could buy another planet and test it!
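The combinatorics behind these figures can be checked directly. Treating each subset of n chemicals as one candidate mixture gives 2^n − 1 possible experiments; a minimal sketch (the 25-chemical and 11,000-chemical counts are the ones cited above; the subset-counting formula is my reading of "all possible combinations"):

```python
import math

def mixture_experiments(n: int) -> int:
    """Number of distinct non-empty combinations among n chemicals.

    Treating each subset of the n chemicals as one candidate mixture
    gives 2**n - 1 experiments (every subset except the empty one).
    """
    return 2 ** n - 1

# 25 chemicals: roughly the NTP's "33 million experiments" figure
print(mixture_experiments(25))  # 33554431

# 11,000 chlorinated chemicals: the count has ~3311 digits, i.e. ~10^3311
exponent = 11_000 * math.log10(2)
print(f"about 10^{exponent:.0f} combinations")  # about 10^3311
```

For comparison, the observable universe is usually estimated to contain around 10^80 atoms, which is why 10^3311 experiments is not merely impractical but physically impossible.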

Imagine also that an adequate test for synthetic chemicals were devised and run by competent institutions. Would any chemical pass? The multiple harms of the single chemical BPA, the frequency with which chemical substitutes later turn out to be harmful, and plenty of other data suggest it is possible that few chemicals would pass.

This conclusion, of course, contradicts the presumption of innocence that underlies all chemical regulation. But we should be clear that the presumption is arbitrary and therefore may be wrong. What is so unbelievable, after all, about proposing that all man-made chemicals cause dysfunction at low doses in a significant subset of all the biological organisms on earth?

Strategising for success

Obviously, the implications of this knowledge are many, but the one of specific importance to environmental health campaigners is that organising for a ban on a specific hazardous chemical, such as the herbicide atrazine, is likely to be a strategic error.

If chemical risk assessment is ineffective then demanding a ban is pointless because achieving it will result only in the substitution of a chemical that is no better. But even worse, if chemical risk assessment is ineffective, such campaigns undermine the wider cause because they falsely imply that chemical regulations protect the public and limit pollution.

Messaging is extremely important. If the public heard only from the chemical industry that chemical regulations were effective, they would probably disbelieve it. But because they hear it from the entire environmental movement, chemical risk assessment acquires credibility. Why, they no doubt reason, would the environment movement pretend chemical testing was effective if it wasn’t? And indeed the environment movement traditionally reinforces this message still further whenever it calls for more testing.

In the light of this understanding, if they accept the accumulated scientific evidence, environmental and public health advocates who campaign for bans or restrictions on single chemicals have an opportunity to substantively rethink their strategies and reframe their activities. This doesn’t necessarily mean abandoning any discussion of individual chemicals, but at the very least it does mean explicitly framing those specific chemicals not as ‘bad actors’ but as symptoms of a much bigger problem of incompetent and dysfunctional regulation, with all that implies.

This challenge is also a tremendous opportunity. Having facts that are more stark and analysis that is more scientific and more rigorous creates a superior and more powerful basis upon which to organise and strategise. Thus it brings more ambitious environmental health goals within reach.

Advocates can choose from a broader range of possible approaches and engage a broader segment of the population. They can place clear intellectual distance between their own realistic strategies for protecting the public and the planet and the plainly inadequate approach of the chemical industry.

For example, it is surely easier to explain to a layperson the generic absurdities of chemical risk assessment (and thus gain their support) than it is to explain the toxicological niceties of glyphosate (Roundup) or 2,4-D, especially one chemical (of 80,000) at a time. They say the truth can set you free, but in the world of toxic campaigning it is a strategy that has hardly been tried yet. I am optimistic, therefore, that the tide can be turned.

In the late 1990s Greenpeace USA adopted the novel campaigning position that all chlorinated hydrocarbons should be banned, in part on the grounds that every one so far investigated had proven toxicologically problematic.

In doing so they took chemical campaigning to a new level. Greenpeace was threatening thousands of products of the chemical industry with a strategic goal that had a realistic chance of significantly enhancing the quality of our environment. If they had succeeded, neonicotinoids would not now be ubiquitous in the environment, DDT would never have been allowed, and nor would 2,4-D. Yet it is unlikely that your material standard of living would be any lower; it might even be higher.

Greenpeace was hit by a campaign of corporate espionage. Their offices were bugged and their computers were hacked, they were infiltrated by phony volunteers and more. The chemical industry was spooked. Greenpeace eventually backed off, but by raising the stakes and making their case with science, they had shown a way.

The book Pandora’s Poison elaborates on some of the ambitious ideas for eradicating pollution that Greenpeace tried but never, in the end, adequately road-tested. It is time to learn the lessons of the past and move chemical safety campaigning outside the comfort zone of the chemical industry, which is where it belongs.

Dr Jonathan R. Latham is editor of Independent Science News.

This article was originally published by Independent Science News under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 License. Its creation was supported by The Bioscience Resource Project.

References

Ayyanan A, Laribi O, Schuepbach-Mallepell S, Schrick C, Gutierrez M, Tanos T, Lefebvre G, Rougemont J, Yalcin-Ozuysal O, Brisken C (2011) Perinatal Exposure to Bisphenol A Increases Adult Mammary Gland Progesterone Response and Cell Number. Molecular Endocrinology DOI: 10.1210/me.2011-1129

Buonsante VA, Muilerman H, Santos T, Robinson C, Tweedale AC (2014) Risk assessment’s insensitive toxicity testing may cause it to fail. Environmental Research 135: 139-147.

Bae S., Hong Y-C. (2014). Exposure to Bisphenol A From Drinking Canned Beverage Increases Blood Pressure: Randomized Crossover Trial. Hypertension 10.1161/HYPERTENSIONAHA.114.04261

Bhan A, Hussain I, Ansari KI, Bobzean SAM, Perrotti LI, Mandal SS (2014) Bisphenol-A and diethylstilbestrol exposure induces the expression of breast cancer associated long noncoding RNA HOTAIR in vitro and in vivo. The Journal of Steroid Biochemistry and Molecular Biology 141: 160.

Braun JM, Kalkbrenner AE, Calafat AM, Yolton K, Ye X, Dietrich KN, Lanphear BP (2011)Impact of Early Life Bisphenol A Exposure on Behavior and Executive Function in Children. Pediatrics, 128: 873-882.

Chandrasekera PC, Pippin JJ (2013). Of rodents and men: species-specific glucose regulation and type 2 diabetes research. ALTEX 31:157-176.

Curtis LR, Williams WL, Mehendale HM (1979) Potentiation of the hepatotoxicity of carbon tetrachloride following preexposure to chlordecone (Kepone) in the male rat. Toxicology and Applied Pharmacology 51: 283-293.

Evans SF, Kobrosly RW, Barrett ES, Thurston SW, Calafat AM, Weisse B, Stahlhut R, Yolton K, Swan SH (2014) Prenatal bisphenol A exposure and maternally reported behavior in boys and girls. NeuroToxicology 45: 91-99.

Gayrard V, Lacroix MZ, Collet SH, Viguié C, Bousquet-Melou A, Toutain P-L, and Picard-Hagen N (2013) High Bioavailability of Bisphenol A from Sublingual Exposure. Environ Health Perspect. 121: 951-956.

Goodson WH et al. (2015) Assessing the carcinogenic potential of low-dose exposures to chemical mixtures in the environment: the challenge ahead. Carcinogenesis 36 (Supplement 1): S254-S296.

Kinch C, Ibhazehiebo K, Jeong J-H, Habibi HR, and Kurrasch DM (2015) Low-dose exposure to bisphenol A and replacement bisphenol S induces precocious hypothalamic neurogenesis in embryonic zebrafish. PNAS doi: 10.1073/pnas.1417731112

Kortenkamp, A (2014) Low dose mixture effects of endocrine disrupters and their implications for regulatory thresholds in chemical risk assessment. Current Opinion in Pharmacology Volume 19, December 2014, Pages 105-111.

Lesser LI, Ebbeling CB, Goozner M, Wypij D, Ludwig DS (2007) Relationship between Funding Source and Conclusion among Nutrition-Related Scientific Articles. PLoS Medicine DOI: 10.1371/journal.pmed.0040005

Liao C, Liu F, Alomirah H, Duc Loi V, Ali Mohd M, Moon H-B, Nakata H, and Kannan K (2012)Bisphenol S in urine from the United States and seven Asian countries: occurrence and human exposures. Environ. Sci. Technol. 46: 6860-6866.

Mainigi, KD. and T.C Campbell (1981) Effects of low dietary protein and dietary aflatoxin on hepatic glutathione levels in F-344 rats. Toxicology and Applied Pharmacology 59: 196-203.

Melzer D; Nicholas J. Osborne; William E. Henley; Ricardo Cipelli; Anita Young; Cathryn Money; Paul Mccormack; Robert Luben; Kay-Tee Khaw; Nicholas J. Wareham; Tamara S. Galloway (2012) Urinary Bisphenol: A Concentration and Risk of Future Coronary Artery Disease in Apparently Healthy Men and Women. Circulation 125: 1482-1490.

Menard S. , L. Guzylack-Piriou, M. Leveque, V. Braniste, C. Lencina, M. Naturel, L. Moussa, S. Sekkal, C. Harkat, E. Gaultier, V. Theodorou, E. Houdeau. (2014) Food intolerance at adulthood after perinatal exposure to the endocrine disruptor bisphenol A. The FASEB Journal 28: 4893-4900.

Molina-Molina J-M, Esperanza Amaya, Marina Grimaldi, José-María Sáenz, Macarena Real, Mariana F. Fernández, Patrick Balaguer, Nicolás Olea (2013) In vitro study on the agonistic and antagonistic activities of bisphenol-S and other bisphenol-A congeners and derivatives via nuclear receptors. Toxicology and Applied Pharmacology 272: 127-136.

Myers JP et al (2009a) Why Public Health Agencies Cannot Depend on Good Laboratory Practices as a Criterion for Selecting Data: The Case of Bisphenol A. Environ Health Perspect 117:309-315.

Myers JP, Zoeller TH and vom Saal F (2009b) A Clash of Old and New Scientific Concepts in Toxicity, with Important Implications for Public Health. Environ Health Perspect 117: 1652-1655.

Perera F, Julia Vishnevetsky, Julie B Herbstman, Antonia M Calafat, Wei Xiong, Virginia Rauh and Shuang Wang (2012) Prenatal Bisphenol A Exposure and Child Behavior in an Inner-City Cohort. Environ. Health Perspect. 120: 1190-1194.

Prins, Wen-Yang Hu, Guang-Bin Shi, Dan-Ping Hu, Shyama Majumdar, Guannan Li, Ke Huang, Jason Nelles, Shuk-Mei Ho, Cheryl Lyn Walker, Andre Kajdacsy-Balla, and Richard B. van Breemen (2014) Bisphenol A Promotes Human Prostate Stem-Progenitor Cell Self-Renewal and Increases In Vivo Carcinogenesis in Human Prostate Epithelium. Endocrinology doi.org/10.1210/en.2013-1955.

Richard S, Moslemi S, Sipahutar H, Benachour N, and Seralini G-E (2005) Differential Effects of Glyphosate and Roundup on Human Placental Cells and Aromatase. Environ Health Perspect. 113: 716-720.

Seok et al (2013) Genomic responses in mouse models poorly mimic human inflammatory diseases. Proc Natl Acad Sci USA 110: 3507-3512.

Tarapore P, Jun Ying, Bin Ouyang, Barbara Burke, Bruce Bracken, Shuk-Mei Ho. (2014)Exposure to Bisphenol A Correlates with Early-Onset Prostate Cancer and Promotes Centrosome Amplification and Anchorage-Independent Growth In Vitro. PLoS ONE DOI: 10.1371/journal.pone.0090332

Rubin B S, M K Murray, D A Damassa, J C King, and A M Soto (2001) Perinatal exposure to low doses of bisphenol A affects body weight, patterns of estrous cyclicity, and plasma LH levels. Environ Health Perspect. 109: 675-680.

Vandenberg Laura N. , Theo Colborn, Tyrone B. Hayes, Jerrold J. Heindel, David R. Jacobs, Jr., Duk-Hee Lee, Toshi Shioda, Ana M. Soto, Frederick S. vom Saal, Wade V. Welshons, R. Thomas Zoeller, and John Peterson Myers (2012) Hormones and Endocrine-Disrupting Chemicals: Low-Dose Effects and Nonmonotonic Dose Responses. Endocrine Reviews DOI: http://dx.doi.org/10.1210/er.2011-1050

Rochester and AL Bolden (2015) Bisphenol S and F: A Systematic Review and Comparison of the Hormonal Activity of Bisphenol A Substitutes. Environmental Health Perspectives DOI:10.1289/ehp.1408989

vom Saal F., Catherine A. VandeVoort, Julia A. Taylor, Wade V. Welshons, Pierre-Louis Toutain and Patricia A Hunt. (2014) Bisphenol A (BPA) pharmacokinetics with daily oral bolus or continuous exposure via silastic capsules in pregnant rhesus monkeys: relevance for human exposures. Reproductive Toxicology 45: 105-116.

Wagner W, Michaels D (2004) Equal Treatment for Regulatory Science: Extending the Controls Governing the Quality of Public Research to Private Research. American Journal of Law & Medicine 30: 119.

Yang RSH (1994) Toxicology of chemical mixtures derived from hazardous waste sites or application of pesticides and fertilizers. In Yang RSH, ed: Toxicology of Chemical Mixtures. Academic Press 99-117.

Yuan Z, Courtenay S, Chambers RC, Wirgin I (2006) Evidence of spatially extensive resistance to PCBs in an anadromous fish of the Hudson River. Environmental Health Perspectives 114: 77-84.

 

World must end slavery – for the environment as well as human rights

It touches the food we eat and the air we breathe, the clothes we wear and possibly the device you’re using to read these words.

But slavery today is a paradox. It is hidden away as never before, but its effects are everywhere.

If slavery were a country it would have the population of Canada and the GDP of Kuwait, but its CO2 emissions would rank third globally after China and the US.

The latest measures of global slavery conservatively estimate there are about 36m slaves worldwide, spread across virtually all countries. The UN says slavery generates some US$150 billion annually.

These numbers seem immense, but slaves represent only a small fraction of the global population, and US$150 billion is a tiny fraction of the global economy, spread across several million local criminal enterprises.
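The scale comparisons above are easy to sanity-check with rough arithmetic. A sketch, using the article’s slave count and revenue figures together with two outside assumptions (a world population of about 7.3 billion and a gross world product of roughly US$78 trillion, both approximate 2015-era estimates that are not in the article):

```python
# The slave count and revenue are the article's figures; the world
# totals are outside assumptions used only to establish scale.
slaves = 36e6                 # estimated people in slavery
slavery_revenue = 150e9       # US$ generated annually (UN estimate)
world_population = 7.3e9      # assumption: ~2015 world population
gross_world_product = 78e12   # assumption: ~2015 gross world product, US$

print(f"slaves as share of humanity: {slaves / world_population:.2%}")
print(f"slavery revenue as share of world economy: "
      f"{slavery_revenue / gross_world_product:.2%}")
```

On these assumptions, slavery accounts for about half a percent of humanity and roughly a fifth of a percent of the world economy, which is what the paragraph above means by "a small fraction".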

In all of human history, slavery has never been such a small part of our shared existence. Slavery is illegal in every country, it is condemned by every faith, and business and government leaders are unanimous in rejecting it.

Slavery has been pushed to the very edges of our global society, but it is still destroying lives and the natural world at an alarming rate because the criminal gangs who employ slave labour are often involved in pollution and deforestation as part of their work.

Bosses pit slaves against forests and tigers

As part of my research, I met 19-year-old Shumir in a village in southern Bangladesh. Just the night before, he had escaped from slavery by stowing away in a fishing boat. He explained how he was lured into slavery:

“A recruiter told my parents he would give them 2000 taka [US$29] if they’d let me come and work on a fish camp … The man said the work was easy, and there was plenty of food to eat. My parents needed the money and I wanted to help, so I left with the recruiter.”

It was a lie. Shumir and dozens of other boys would often work 24 hours straight. “The longer I worked”, he told me, “I’d get exhausted and clumsy. Sometimes I’d cut myself with the gutting knife or slip and fall from the drying rack. Whenever I made a mistake the boss would hit me.”

Yet more feared than the bosses were the tigers. Every boy I met from the fish camps reported having witnessed, or having known, another child slave who had been eaten by a tiger. These encounters happen because criminal slaveholders have carved fish processing camps out of the forests of the Sundarbans UNESCO World Heritage Site, a natural habitat for the area’s protected Bengal tigers.

This vast area of protected mangrove forest is also the largest carbon sink in Asia, meaning it absorbs more carbon than it releases, which makes it particularly important to the climate. The forest also functions as a crucial buffer protecting coastal towns from cyclones.

Criminal gangs have invaded the forest, and their slaves clear-cut it, feeding CO2 into the atmosphere and setting child slaves on a collision course with protected tigers. The profits driving this destruction come from the global market for shrimp and pet food, but it doesn’t have to be this way.

New beginnings

The fish camp where Shumir was enslaved is just one of thousands of slave-based enterprises in agriculture, mining, brick making, timbering, charcoal making, and other businesses spread around the world’s equatorial belt, the vast majority of which cause pollution and deforestation as part of their everyday work.

But understanding the link between slavery and climate change can help to solve both problems.

In Brazil I met a former slave named Jose Barros, who had previously worked down the mines. He turned his life around in 2005 when he was granted access, but not ownership, to 100 acres of Amazonian forest. This was arranged by a local cooperative of small farmers organised by the Pastoral Land Commission, an offshoot of the Catholic Church in Brazil.

As part of their project to tackle both slavery and climate change they provided Barros with cocoa seedlings that he planted beneath the big canopy trees. In return he had to leave 60% of the forest intact.

When the cocoa pods began to grow, “that’s when our lives began to change”. He told me how he sold about 1,000 kilos of cocoa and for the first time ever was able to buy more than just the food his family needed. His children went to school and their lives improved dramatically.

The forest is now preserved and guarded. Barros plants new trees and harvests other cash crops like Brazil nuts that grow naturally.

The multiple benefits of enforcing anti-slavery laws – let’s do it!

A concerted effort to end slavery around the world is a big investment, but one that can have a huge global impact. Enforcement of the anti-slavery laws that are on the books in every country would immediately diminish CO2 emissions and species loss.

In developing countries ending slavery can stimulate the economy, ward off the threat of rising sea levels or destructive deforestation, and preserve endangered species. Freed slaves can also be paid to replant the forests they were forced to cut.

This would not only help to rehabilitate the land, but it would also help to give work and a wage to some of the people who need this most in the world.

 


 

Kevin Bales is Professor of Contemporary Slavery, University of Hull.The Conversation

This article was originally published on The Conversation. Read the original article.

 

Bovine TB Part II: rotten data, dodgy science, bad politics.

My first article published on this site under this same heading attracted comment from a wide spectrum. Perhaps surprisingly, and helpfully, both ends of that spectrum now seem to agree that the notion of the Randomised Badger Culling Trial (RBCT) perturbation effect increasing badger transmission of bTB has been shown to be an unsafe concept. However, many of the commentators appear to have missed the relevance of the bTB spike in two of the ten central England control areas.

It is the unexpected, roughly 10%, contribution from each of these that makes up around 20% of the difference between the control and the proactive cull areas; in other words, most of the supposed 25% proactive cull benefit.

Why did breakdowns spike in two of the control areas? Perhaps it was because they had the two highest ‘starting’ levels of bTB herd infection (37% & 28% respectively) in the years immediately before the trials.

To spell out the significance of this, the supposed benefit from the proactive cull appears not to be real and not real by a very big margin. The claimed bTB reduction seems instead to be a function of other factors occurring in the field that have not been adequately controlled.

In fact the data from the ISG final report in 2007, Tables 5.1 and 5.7, indicate that in four of the ten trial areas (Cornwall/Devon, Devon, Devon/Somerset and Gloucestershire), the paired comparisons of new herd breakdowns were lower in the control area than in its respective proactive cull area. There was no benefit trend in the proactive cull areas. This does not seem to have been widely appreciated, yet it is an obvious cause for alarm bells.

On 5 May, DEFRA Minister George Eustice was asked in Parliament by Neil Parish, chair of the EFRA scrutiny sub-committee: “When will the Minister be able to give the scientific figures for the badger cull areas to show the reduction in the amount of disease in cattle?” Yet DEFRA announced around the end of last year that any bTB reduction from culling would not be measurable. Eustice answered: “The reality is that the programme is a long-term commitment and it will be several years before we can see the impact of the culls.”

Again, not true, in his own department’s view. 

http://www.theyworkforyou.com/debates/?id=2016-05-05a.290.5&s=OLIVER+COLVILE

Eustice continued on timing: “As my Hon. Friend knows, the randomised badger culling trials a decade or more ago found that the benefits of the culling of badgers were only seen some four years after the conclusion of the culls.” But the fact that any supposed benefit cannot be measured renders any such implication sterile.

How is it possible that such senior figures are so misinformed?

Much was asked of the RBCT, but like the many field experiments before it, it has proved a near-impossible task to control for the huge number of variables in field conditions. This has been compounded by inadequate scientific disease investigations relating to wildlife.

In cattle, bovine TB needs detecting with the gamma interferon (IFN-γ) blood test (for early-stage bTB) in tandem with the SICCT skin test. Further tests, either a fast blood antibody test or a PCR test, complete the minimum three-stage checking needed to include the later-stage ‘sleepers’ and provide a comprehensive approach to detection. Whole-herd depopulation may be minimised with this approach in some areas too.

A few weeks ago George Eustice released figures following a Parliamentary Question by Kerry McCarthy MP revealing that gamma interferon testing is all but abandoned in the High Risk Areas. 

http://www.parliament.uk/business/publications/written-questions-answers-statements/written-question/Commons/2016-04-13/33912/

This is further evidence that the UK strategy has been only to slow bTB’s proliferation lightly, rather than to tackle bTB head-on in all areas, as it must to attempt eradication. Public funds are being poured into bTB slaughter compensation in a way that can only perpetuate the spread of bTB and result in higher levels of crisis and cost.

The point is that any available money now needs to be placed urgently but expertly into effective testing, in order to help beef and dairy farmers properly tackle the disease. Otherwise the other spending is wasted.

The cost is perhaps up to £100 per animal, and our veterinarians should surely be pressing to get the job done. Why would the public continue to pour compensation payments into a system that fails to address the problem adequately? Good money frittered on the now exposed and dodgy badger-killing policy is a scandalous waste; the money would be better spent on testing. The even bigger scandal is how long the avoidance of any real solution to the wider problem has been allowed to persist.

Tom Langton has been a consulting ecologist to government, business and industry, and a volunteer in the voluntary sector, more recently assisting small pressure groups in their legal opposition to the destruction of species and habitats in Europe.

Searing heat may spark Middle East, North Africa climate exodus

Parts of the Middle East and North Africa could become unbearably hot if greenhouse gas emissions continue to rise.

New research predicts that, by mid-century, summer temperatures will stay above 30°C at night and could rise to 46°C during the day.

By the end of the century, maximum temperatures could reach 50°C, and extreme heat would strike far more often: instead of 16 days a year, there could be 80.

“In future, the climate in large parts of the Middle East and North Africa (MENA) could change in such a manner that the very existence of its inhabitants is in jeopardy”, says Jos Lelieveld, director of the Max Planck Institute for Chemistry in Mainz, Germany.

He and colleagues report in the journal Climatic Change that they used computer models to explore changes in temperature patterns in the MENA region in the 21st century. Global warming happens unevenly, and many regions are experiencing warmer winters – with earlier growing seasons – but not necessarily many more extremes in summer heat.

200 ‘unusually hot’ days a year by 2100?

But the pattern around the Eastern Mediterranean and in the landscapes of Egypt, Libya, Algeria, Tunisia and Morocco is one of increasing summer heat.

Between 1986 and 2005, the average number of ‘very hot’ days was 16. By mid-century, this could reach 80 days a year. By the end of the century, even if greenhouse gas emissions decline after 2040, the number of sweltering days could soar to 118.

“If mankind continues to release carbon dioxide as it does now, people living in the Middle East and North Africa will have to expect about 200 unusually hot days [per year], according to the model projections”, says Panos Hadjinicolaou, associate professor at the Cyprus Institute and a co-author of the report.

Prof Lelieveld and another co-author from the Cyprus Institute took part in a study of changing atmospheric conditions, to see what aerosol concentrations in the atmosphere could tell climate science about soil moisture trends in the region’s arid landscapes.

They report in the journal Atmospheric Chemistry and Physics that as soils have dried, dust emissions have increased – by 70% over Saudi Arabia, Iraq and Syria since the beginning of this century.

Climate researchers have repeatedly warned that extremes of heat will become the ‘new normal’ at most latitudes. However, those countries that already experience the most relentless summer heat could become increasingly unhealthy and unstable.

Near-lethal conditions

One research team recently took a close look not just at heat but at potential humidity levels around the Gulf, and found that conditions could in some circumstances one day become near-lethal. So the latest studies are more confirmation than revelation.

The researchers considered what would happen if the world adopted the notorious ‘business-as-usual’ scenario and did nothing significant to control greenhouse gas emissions.

They also considered a scenario in which the world tried to contain global warming to 2°C above historic levels, and in which global emissions began to decrease by 2040. But even under this scenario, summer temperatures in the region would reach 46°C by mid-century.

“Climate change will significantly worsen the living conditions in the Middle East and in North Africa”, Professor Lelieveld says. “Prolonged heatwaves and desert dust storms can render some regions uninhabitable, which will surely contribute to the pressure to migrate.”

Was Syria’s war triggered by climate change?

Extremes of drought have been linked to the fall of ancient civilisations in the region, as well as to the present conflict in Syria and to the growth in the refugee population in Europe and the Middle East.

A 2015 study by scientists from Lamont-Doherty Earth Observatory at Columbia University, US, published in the Proceedings of the National Academy of Sciences, proposes that Syria’s devastating civil war, which has forced millions of refugees to seek safety in Lebanon, Jordan, Turkey, Europe and elsewhere, was caused in part by climate change.

The drought that devastated parts of Syria from 2006 to 2010 was probably the result of climate change driven by human activities, according to the paper. “We’re not saying the drought caused the war”, says Richard Seager, one of the co-authors.

“We’re saying that, added to all the other stressors, it helped kick things over the threshold into open conflict. And a drought of that severity was made much more likely by the ongoing human-driven drying of that region.”

Agricultural production fell by a third. In the northeast, livestock was practically wiped out, cereal prices doubled, and nutrition-related diseases among children increased steeply. Adding to the problem was Syria’s huge increase in population from four million in the 1950s to 22 million in recent years.

The drought also forced 1.5 million people to flee from the countryside to cities already strained by waves of refugees from the war in neighbouring Iraq, increasing tensions in urban communities.

Tim Radford writes for Climate News Network, where this article was originally published (CC BY-ND). This version contains additional reporting by The Ecologist.

Also on The Ecologist

 

Glyphosate in the EU: product promoters masquerading as regulators in a ‘cesspool of corruption’?

On 13th April, the EU Parliament called on the European Commission to restrict certain permitted uses of the toxic herbicide glyphosate, best known in Monsanto’s ‘Roundup’ formulation. Glyphosate was last year determined to be ‘probably carcinogenic‘ by the WHO.

The parliament’s resolution called for no approval for many uses now considered acceptable, including use in or close to public parks, playgrounds and gardens, use where integrated pest management systems are sufficient for necessary weed control, and as a pre-harvest ‘desiccant’ on arable crops.

The resolution, however, fell short of calling for an outright ban. Due to various political manoeuvrings, a disappointing compromise was reached that called for the renewal of the glyphosate licence to be limited to just seven years instead of the 15 proposed by the Commission.

The resolution and the vote to re-approve glyphosate for seven years are non-binding, and tomorrow, on Wednesday 18 May, the European Commission’s Standing Committee on Plants, Animals, Food and Feed will meet to decide whether glyphosate is to be re-registered for use in the EU.

Rosemary Mason, indomitable campaigner for truth

In addition to the World Health Organisation classifying glyphosate as being probably carcinogenic to humans, various peer-reviewed studies have indicated strong links between its use and a range of serious diseases and deleterious environmental impacts, as presented by Rosemary Mason in the documents that are attached to this article (see below).

Rosemary Mason has been campaigning about the harmful effects of glyphosate for many years. She has sent various open letters accompanied by in-depth, fully-referenced reports to key figures in both Britain and the EU who are responsible for regulating the use of glyphosate and for setting the official narrative about this substance.

In the downloads provided at the end of this text, you can access some of the documents she has sent to the EFSA, European Commission and other key bodies / figures since November 2015. They provide detailed descriptions of the impacts of glyphosate along with the ongoing saga of deception and duplicity that results in an ultimate failure to regulate.

It would be an understatement to say that Mason smells a rat: the kind of rat recently discussed on the Corporate Europe Observatory website, which describes the strategic position the biotech lobby has gained within the heart of policy / decision-making processes in the EU. And the kind of rat that underlies the collusion between this lobby and regulatory / policy bodies in Europe, which has been described many times over the years.

For example, see William Engdahl’s recent piece on the “cesspool of corruption” that underpins relations between the EU, EFSA and the major pesticide companies; read how scientific evidence was sidelined in the EU to get the use of glyphosate sanctioned.

And – just to highlight the type of companies public officials and bodies are all too willing to jump into bed with – read how Monsanto appears to have hidden evidence of the glyphosate-cancer link for decades.

With reports emerging that the EC plans to relicense glyphosate for nine years, should we be too surprised about this when glyphosate sales account for $5.1 billion of Monsanto’s revenue (2014 figure)? The level of collusion between the biotech lobby and public officials suggests that the line between product promoting and regulating was crossed long ago.

Whose interests are being served here?

The EFSA responded to the WHO’s reclassification of glyphosate as probably carcinogenic to humans with its own review, which concluded a cancer link was unlikely. The way the review was manipulated to reach that conclusion has been roundly condemned by dozens of scientists.

Mason notes that there is currently a legal case in process against EU regulators, and, if anyone were to be found to be colluding with the pesticides industry over the licensing of glyphosate, there are likely to be severe penalties. Environmentalists have launched the case against Monsanto and EU regulators over glyphosate assessment.

Details about this action are provided on the GMWatch website, where Viennese lawyer Dr Josef Unterweger states: “If there has been deliberate manipulation of the new licensing procedure for glyphosate with the intention of approving a carcinogenic substance, then this would be defrauding 508 million EU citizens.”

For this reason Dr Unterweger is pressing charges against the European Food Safety Authority and the German regulator BfR on behalf of the Munich Environmental Institute and seven environmental organisations: Global 2000, Pesticide Action Network (PAN) Europe, PAN Germany, PAN UK, Générations Futures (France), WeMove Europe, and Nature & Progrès Belgique. A report will also be submitted to OLAF, the European anti-fraud office.

In addition 94 respected scientists wrote an open letter to the EU commission that criticised the BfR / EFSA assessment as “scientifically unacceptable”, “fundamentally flawed” and “misleading”.

The ongoing scenario surrounding glyphosate raises the question: whose interests are ultimately being served? Those of 500 million Europeans, or those of Monsanto, a corporation that will be put ‘on trial’ as part of a civil society initiative for crimes against nature and humanity and ecocide in The Hague on World Food Day, October 16, 2016 (see Monsanto’s track record)?

The International Criminal Court in The Hague has determined that prosecuting ecocide as a criminal offence is the only way to guarantee the rights of humans to a healthy environment and the right of nature to be protected.

As for the symbolic trial, the tribunal’s website states: “According to its critics, Monsanto is able to ignore the human and environmental damage caused by its products and maintain its devastating activities through a strategy of systemic concealment: by lobbying regulatory agencies and governments, by resorting to lying and corruption, by financing fraudulent scientific studies, by pressuring independent scientists, by manipulating the press and media, etc.

“The history of Monsanto would thereby constitute a text-book case of impunity, benefiting transnational corporations and their executives, whose activities contribute to climate and biosphere crises and threaten the safety of the planet.”

How long do the EC and the EFSA think they can continue to play the European public for fools?

 


 

Colin Todhunter is an extensively published independent writer and former social policy researcher, based in the UK and India. More of his articles can be found on Colin’s website.

Support Colin’s work here.

Also on The Ecologist: No, the UN has not given glyphosate a ‘clean bill of health‘ by Georgina Downs.

More information: Rosemary Mason’s documents contain a great amount of detail on the glyphosate issue and can be consulted here:


To keep the lights on, pay people to use less electricity

We were told early last week that the government will pay existing power stations a fee for staying open over the winter of 2017/18.

A similar scheme is already in place for later years. Old power stations, which would probably otherwise close, will be paid about a billion pounds as a bribe to remain ready to generate power. This scheme is called the ‘capacity auction’.

The government is convinced this is the right way to ensure that we never – or virtually never – lose electricity supply. I want to suggest three schemes that would have represented much better value for money.

In fact, in the longer run they would all save householders substantial amounts of cash, rather than costing us money. These are:

  • Pay people to reduce their electricity demand at home, probably by providing a game with prizes.
  • Hand out LED light bulbs to reduce household electricity use by replacing the increasing numbers of inefficient halogen lamps in kitchens and living areas.
  • (I do realise that this next suggestion is deeply counter-cultural, but I make it nevertheless.) Tell people when an electricity blackout is likely and ask them voluntarily to reduce their power use at that time. I suspect the results would be far better than anybody thinks possible.

In this post, I’m going to look at the first of these options. (An article on the unassailable reasons for handing out free LED bulbs will follow. That second post will use analysis I have done for Greenpeace on the impact of switching to LEDs on peak power demand.)

First, let’s get our facts straight …

We need a few numbers to start the discussion on ‘games’ to reduce power demand.

  • Over the next couple of years, the government thinks that the owners of up to 8.5 gigawatts of fossil fuel generating capacity may decide to close their plants.
  • It believes that these closures can be expected to result in electricity demand exceeding supply for 38 hours a year. (Probably this means 1-2 hours on around 25 weekdays in December and January, when demand is highest).
  • During each of these hours, the forecast is that an average of about 2 gigawatts of demand is not met. This is about 4% of typical peak demand. (I suspect that this will usually mean that one area of the country representing about 4% of demand will be disconnected for the period of 1-2 hours). On average, each household will lose power for about 2.25 hours if the forecasts are correct. (1.5 hours of loss 1.5 times a year).
  • Now here’s a number that we should look twice at: the government says that the ‘cost’ to society of this power outage is £17,000 for each megawatt hour of electricity not supplied, or £17 a kilowatt hour. The average household is using about 1.1 kilowatts at the December peak, so the cost of not having electricity is put at about £19 an hour, or about £28 for the typical outage of 1.5 hours for the average home. That’s about 150 times what the lost electricity would have cost, by the way.

    I don’t believe the real figure is more than a tiny fraction of this but the important thing is that this number represents the assumed cost of TOTAL loss of power. We can agree that a power cut is potentially costly and inconvenient to householders. But, by contrast, having to cut usage in half, perhaps by turning off the washing machine, has a negligible impact on us.

  • By December 2017, I guess there will be 9 million smart meters in UK homes. That means 1 in 3 households will be able to change their rate of consumption of electricity and have this measured independently by a third party.
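The government’s ‘value of lost load’ figures quoted above can be reproduced with a few lines of arithmetic. This is purely illustrative: the numbers are those cited in the post, and the ~11p/kWh retail price is my assumption, chosen to reproduce the “about 150 times” comparison.

```python
# Reproducing the outage-cost arithmetic cited in the post.
# All figures come from the post; the retail price is an assumed ~11p/kWh.

COST_PER_MWH = 17_000                     # £ per MWh of electricity not supplied
cost_per_kwh = COST_PER_MWH / 1000        # £17 per kWh

PEAK_DEMAND_KW = 1.1                      # average household draw at December peak
OUTAGE_HOURS = 1.5                        # typical forecast outage length

cost_per_hour = cost_per_kwh * PEAK_DEMAND_KW    # ≈ £19 per hour without power
cost_per_outage = cost_per_hour * OUTAGE_HOURS   # ≈ £28 per typical outage

RETAIL_PRICE_PER_KWH = 0.11               # assumed domestic tariff, £/kWh
ratio = cost_per_kwh / RETAIL_PRICE_PER_KWH      # ≈ 150x the retail value

print(f"Assumed cost per hour off supply: £{cost_per_hour:.0f}")
print(f"Cost per 1.5-hour outage:         £{cost_per_outage:.0f}")
print(f"Multiple of retail price:         {ratio:.0f}x")
```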

Some of the implications of these numbers include:

  • If we could reduce peak demand by 2 GW below what it would otherwise have been, most outages would not occur in the winter of 2017/18, and the numbers affected by any remaining power cuts would be much reduced.
  • There are about 27 million households in the UK. If we could in some way reduce the average electricity demand in these homes by 100 watts at 5 o’clock on a December evening, we would save 2.7 gigawatts. That’s less than a 10% reduction in typical household power consumption.
  • Or if we cut power use in smart meter homes by 300 watts, we could make a similar saving. On average, that would mean a cut of less than 30% below the average usage level.
  • Either way, we substantially reduce the threat of power outages.
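The two routes to a 2 GW-plus saving sketched in the bullets above can be checked in a few lines. The figures are the post’s own estimates, not official data:

```python
# Back-of-envelope check of the demand-reduction arithmetic above.

HOUSEHOLDS = 27_000_000          # UK households (post's figure)
SMART_METER_HOMES = 9_000_000    # estimated smart-meter homes by Dec 2017
SHORTFALL_GW = 2.0               # forecast unmet demand at peak

def fleet_saving_gw(homes: int, watts_per_home: float) -> float:
    """Total saving in gigawatts if each home cuts `watts_per_home` watts."""
    return homes * watts_per_home / 1e9

# Route 1: every household cuts 100 W.
all_homes = fleet_saving_gw(HOUSEHOLDS, 100)          # 2.7 GW

# Route 2: smart-meter homes alone cut 300 W.
smart_only = fleet_saving_gw(SMART_METER_HOMES, 300)  # 2.7 GW

print(f"All homes at 100 W:   {all_homes:.1f} GW")
print(f"Smart homes at 300 W: {smart_only:.1f} GW")
# Either route comfortably covers the forecast 2 GW shortfall.
```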


Paying people to reduce their electricity demand.

Around the world utilities are introducing ‘time of use’ pricing for home users. Take power from the grid at times of peak demand and you pay a higher price. This is the market working in its conventional way, choking off usage at times when supplies are tight. It works because it punishes.

It may not be the best way of getting people to use less. Rewarding socially beneficial behaviour could be at least as effective. If a power supplier paid its customers for keeping their usage low, demand would also fall.

And there is lots of money available to offer as a reward. The government’s capacity market is expected to cost £38 a household across the 27 million homes in the UK. That means we have over £100 available for each of the 9 million homes that have smart meters.

One incentive scheme for smart meter homes might use the following format. On the 25 days a year that demand is expected to exceed supply in the early evening, a message is sent to the phones of people in the scheme. Pay people £4 for keeping their household electricity demand below an average of 250 watts over the critical 1-2 hour period. (That’s below a quarter of typical household use).

Someone who successfully plays the game 25 times would make £100. If the government wants to encourage smart meter takeup, I can’t think of a better incentive.

Then there’s the social aspect to this. If you are wealthy, £100 may not be worth the inconvenience of switching off the dishwasher, turning most of the lights out and avoiding using the cooker for an hour or so. But for those who are short of cash, this amount of money could make a difference. It’s potentially a highly progressive, rather than regressive, policy.
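The budget and reward figures behind the scheme sketched above also add up, as a quick sanity check shows. Again, these are the post’s illustrative numbers, not a real tariff design:

```python
# Sketch of the incentive budget described above (illustrative figures only).

CAPACITY_COST_PER_HOME = 38      # £ per household, capacity market cost
HOUSEHOLDS = 27_000_000
SMART_METER_HOMES = 9_000_000

EVENTS_PER_YEAR = 25             # evenings demand is expected to exceed supply
REWARD_PER_EVENT = 4             # £ for staying under a 250 W average

total_budget = CAPACITY_COST_PER_HOME * HOUSEHOLDS        # ~£1bn
budget_per_smart_home = total_budget / SMART_METER_HOMES  # £114
max_annual_reward = EVENTS_PER_YEAR * REWARD_PER_EVENT    # £100

print(f"Budget per smart-meter home: £{budget_per_smart_home:.0f}")
print(f"Maximum annual reward:       £{max_annual_reward}")
# The £100 maximum reward fits inside the ~£114/home capacity-market spend.
```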

20th century thinking won’t solve 21st century problems

This all sounds far-fetched, impossible even in today’s connected world. But the young Silicon Valley company Bidgely (‘electricity’ in Hindi) shows how it might work. Bidgely puts an app on your phone that informs you in real-time what your energy usage is. When the power emergency arrives, it tells you when you need to reduce your electricity draw. As importantly, it then gives you instant updates on how your home is performing against the target of 250 watts.

One of Bidgely’s strengths is that it recognises the power use signature of each major appliance in the house. (The heaters in electric dryers cycle on and off in short bursts, for example). So the app can send an alert that warns the householder which appliances are using a lot of power and threatening the attainment of the reward. Bidgely doesn’t need to put sensors on each appliance. At the end of the emergency period, a signal is sent to the smartphone saying what the average usage has been and whether or not the prize has been won.

Of course it’s also increasingly easy to imagine times when the National Grid has too much power. We have already seen several instances this year. Instead of rewarding power use reduction, Bidgely could give you cash for turning on appliances instead.

Bidgely’s investors and customers include the German giants E.ON and RWE, still two of the biggest private utilities in the world. Both companies say their business will move from operating giant fossil fuel power stations to providing a variety of services to electricity customers. It’s easy to see how Bidgely might provide a key part of this.

That’s the first option of the three listed above. Instead of rewarding fossil fuel generators for promising to stay open, pay individual householders a decent reward for cutting their demand when told to. It would be cheaper, and instead of going to elderly power stations the money would largely arrive in the bank accounts of the less well-off. (Although the other beneficiary group might be the young London professionals who are not home, and therefore not using much electricity, when the power shortage looms).

There’s one thing that continually strikes me about UK energy policy. It’s driven by a view that supply must be continually matched to an inflexible demand. That 20th century ideology needs radical updating. Today’s world offers almost unlimited opportunities to mould demand to the available supply.

If we don’t have the generating capacity to meet demand for a few hours each winter, the answer surely does not lie in spending a billion pounds on diesel generators and superannuated coal-fired power stations.

Instead we could pay some money, probably largely to less well-off households, to reduce demand until the emergency passes a couple of hours later.

Chris Goodall is an expert on energy, environment and climate change, and a frequent contributor to The Ecologist. He blogs at Carbon Commentary.

This article was first published on Carbon Commentary. Ideas expressed in this article are explored in far greater detail in The Switch, a book about the global transition to solar power, to be published in June 2016 by Profile Books.

 

World Yoga Festival

World Yoga Festival is bringing together the most revered collection of teachers in traditional yoga, meditation and wisdom gathered in one glorious 3-day open-air event on 29th-31st July 2016 in the grounds of Beale Park near Reading, UK.
 

The programme includes: Advaita Zen master Sri Mooji.

Yogacharyas: Bijou H. Mehta, Swami Jyothirmayah, Sri Louise, Sri Nanda Kumar (BKS Iyengar Yoga Shala), Nataraj, Nrithya Jagannathan (KYM), Sharmila Mahesh, S Sridharan (KYM), V Srinivasan (KYM).
 
Meditation & wisdom masters: Mike Sarson, Neema Majmudar, Swami Paramatmananda, Peter Russell, Ram Banerjee, Rupert Spira, Swami Santatmananda, Suriya Tahora.
 

You can also enjoy: Astrology, ayurveda, dance & sound: Ananya Chatterjee, Angela Hope-Murray, Sheila Whittaker, Swami Svatmananda.
 
Live music: Prem Joshua & Band, Shammi Pithia & Band.
 
The festival features a full children’s programme and some of the best vegetarian and vegan food available locally. A range of camping and glamping options are available.

Daily capacity: 1,500

Location: Beale Park, Lower Basildon, Reading, West Berkshire RG8 9NW

Tickets from £40 to £145 for a festival pass.

For more information and to buy tickets visit the Yogafestival website.
Follow Yogafestival on Twitter: @worldyogafest


 

With thanks to Cristina at the World Yoga Festival for this information.