ISN'T IT TIME WE REGULATED CHEMICALS?
By Tim Montague*
If you read almost any newspaper these days, you learn the following kinds of information:
** Many plastic toys contain chemicals that can interfere with the sexual development of laboratory animals and are now thought capable of doing the same in baby boys.
** Most of the rivers and streams in the U.S. are contaminated with low levels of chemicals that can change the sexual orientation of fish and can interfere with reproduction in animals that feed on fish.
** Dozens of toxic chemicals have recently been measured in household dust, indicating that common consumer products are contaminating our homes with toxicants.
You might ask yourself, isn't the government regulating dangerous chemicals? Unfortunately, the answer is No, not in any effective way.
About 1700 new chemicals are put into commercial use each year, almost entirely untested for their effects on humans and the natural world.
After a chemical causes enough harm for someone to take notice, the government conducts a numerical risk assessment (also called a quantitative risk assessment) on that individual chemical. The point of a numerical risk assessment is to learn how much of a chemical is "safe" to eat, drink, and breathe. Then the government may try to regulate releases of that chemical. But fewer than 1% of all chemicals are currently regulated.
A scientist at the University of Oregon has described why numerical risk assessment doesn't work, and has suggested other ways we could control chemical hazards. Dr. Joe Thornton -- a biologist -- explains that numerical risk assessment is a fundamentally inappropriate way to control persistent pollutants (such as heavy metals and chemicals containing chlorine) for two reasons:
1) It assumes that we can learn all the ways that every individual chemical can cause harm in humans and in the natural environment -- but there aren't enough scientists in the world to do this.
2) Many industrial chemicals tend to stick around for a long time and move from place to place in ways that are impossible to predict, so often we don't even know what we're looking for.
Thornton proposes we adopt four new ways of regulating chemicals -- zero discharge, clean production, reverse onus, and phasing out entire classes of persistent chemicals -- because the old way (regulating one chemical at a time at the end of the discharge pipe) simply doesn't work.
Risk assessment assumes that damage is local, short-lived, and predictable. But organisms and the environment are complex, interconnected, and only partly understood (to put it mildly). Therefore, we cannot predict cause-and-effect in any reliable way. In the face of these insurmountable difficulties, we can take a precautionary stance: when we have good reason to suspect harm, yet we have scientific uncertainty, we can err on the side of caution. Faced with choices, we can give the benefit of the doubt to public health and to nature.
Thornton's four principles begin to clarify how the precautionary principle can work in the real world. These principles are:
ZERO DISCHARGE -- Persistent and bioaccumulative toxicants are incompatible with ecological processes, and no amount of their release into the environment is acceptable.
CLEAN PRODUCTION -- We can consider alternative technologies up front and avoid the use of known toxicants in manufacturing. Finding alternatives rather than approving pollutants becomes the focus. For example, in dry cleaning, we can replace perchloroethylene (perc) with CO2 and water-based methods.
REVERSE ONUS -- Apply the same logic used in drug safety: give manufacturers the responsibility to show that a product is reasonably safe for use before it can be released into the environment. This shifts the burden of proof from society to the chemical companies to provide information about their products, to monitor for harmful effects and to come clean about their findings.
EMPHASIS ON LARGE CLASSES OF CHEMICALS -- Faced with the impossibilities of measuring the impacts of individual chemicals, simply phase out entire classes of compounds that are clearly problematic. PCBs, CFCs and lead compounds are all examples of classes of chemicals that have been phased out because of their hazards.
Thornton gives six reasons why the current risk paradigm is so flawed:
1. ACCUMULATION OF PERSISTENT POLLUTANTS
Risk-based approaches assume that nature and living things can absorb and assimilate synthetic chemicals, breaking them down and digesting them. This may be true for sewage, oil, and other naturally occurring substances. But persistent organic pollutants (POPs) like pesticides, solvents, refrigerants, etc. often resist natural breakdown and can persist for years, decades or centuries.
Many POPs and metallic pollutants are fat-soluble and thus bioaccumulate as they move up the food chain. Top predators like humans, bears, and big fish can accumulate chemical concentrations that are tens of millions of times greater than typical environmental levels.
Persistence and bioaccumulation mean that even very small discharges of synthetic chemicals can build up to dangerous levels in our bodies over time. The general public's average body burden for some of the best studied pollutants is already at or near the range at which health impacts have been found in laboratory animals. To avoid this problem, we can declare that there is no level of acceptable discharge for chemicals that persist or magnify in the food chain -- in other words, we can adopt a zero-discharge policy.
2. CUMULATIVE GLOBAL POLLUTION
Numerical risk assessment oversimplifies the real world and considers environmental risks to be local in time and space. Once a chemical disperses beyond some horizon, it is assumed to do no further harm. So industry is encouraged to dot the landscape with sources of pollution that collectively begin to overwhelm the biosphere but which are individually within acceptable limits. As a result the entire planet has become polluted.
3. TOXICOLOGICAL COMPLEXITY
The science of numerical risk assessment is based on the premise that we can calculate a chemical's impact on the health of living things. Risk assessors do this by measuring the toxicity of individual chemicals on individual species -- usually rats, mice and other small mammals. There are at least 70,000 synthetic chemicals being used in commerce today (up from 40,000 in 1991). Risk assessment considers the toxicity of an individual pollutant acting alone -- when in reality each chemical is acting in concert with a myriad of other chemicals in the environment.
This poses a huge problem -- studying multiple chemical exposures is very costly and time-consuming. It would require about 33 million experiments just to learn something about the effects of 25 different chemicals on a single species over a short period of time (13 weeks). A similar study of just 1% of the 70,000 chemicals in commerce would require roughly 5 x 10^210 experiments (a number with over 210 zeroes). Trillions upon trillions upon trillions of experiments -- science as we know it is simply not prepared to tackle this problem. The risk-assessment solution, on the other hand, is easy: just ignore multiple chemical exposures.
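The combinatorial explosion behind these figures is easy to check. If testing mixtures means running one experiment per possible combination (non-empty subset) of chemicals -- a plausible reading of Thornton's arithmetic, assumed here -- the counts grow as 2^n:

```python
import math

def mixture_experiments(n):
    """Experiments needed to test every possible non-empty combination of n chemicals."""
    return 2 ** n - 1

# 25 chemicals on one species: about 33 million experiments
print(mixture_experiments(25))           # 33554431

# 1% of 70,000 chemicals = 700 chemicals
digits = math.log10(mixture_experiments(700))
print(round(digits))                     # 211 -- a number with over 210 zeroes
```

Each additional chemical doubles the number of experiments required, which is why the figure leaps from millions to a 211-digit number between 25 and 700 chemicals.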
4. INADEQUATE DATA
Industry's capacity for inventing new chemicals has overwhelmed the regulatory system's ability to study their potential harms. The chemical industry is introducing at least 1700 new chemicals into commerce each year. The U.S. National Toxicology Program conducts assessments on just 10 to 20 substances per year. At this rate we are falling at least 90 years behind in our knowledge each year that passes. A study by the National Research Council in 1997 concluded that we lack even minimal toxicity information for 70% of the most worrisome chemicals -- those that are manufactured in high volume and are already suspected of harming the environment.
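The "falling at least 90 years behind" figure follows from simple division of the numbers above. A rough sketch, assuming the midpoint of the 10-to-20-assessments-per-year range:

```python
new_per_year = 1700       # new chemicals entering commerce annually
assessed_per_year = 18    # assumed midpoint of the 10-20 assessments per year

# Each calendar year adds this many years' worth of assessment work to the backlog:
backlog_years_added = new_per_year / assessed_per_year
print(round(backlog_years_added))   # 94 -- i.e., at least 90 years further behind per year
```

Even at the optimistic end of the range (20 assessments per year), the backlog still grows by about 85 years of work every year.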
The risk-assessment solution: If you don't have data on the toxicity of a substance, assume the risk is ZERO. Just ignore the problem. Here, reverse onus plays an important role in putting the burden of proof on industry to collect and reveal data on new chemicals prior to their general release or manufacture.
5. FORMATION OF CHEMICAL MIXTURES AND BYPRODUCTS
The nature of industrial chemistry is messy. When you mix chemicals under diverse industrial circumstances you inevitably produce new and unexpected byproducts. Joe Thornton gives three examples of how we are flying blind:
a) Paper manufacturing. The effluent from pulp mills contains over 300 organochlorine byproducts, including dioxins, furans, phenols, benzenes, thiophenes, methyl sulfones, methanes, ethanes, acids, and PCBs. We have identified only 3 to 10% of the organically bound chlorine in pulp effluent. In other words, we are 90 to 97% ignorant of what is coming out of the pipe.
b) Incineration. Incinerator emissions are estimated to contain over 1,000 products of incomplete combustion (complete combustion would reduce the fuel to carbon dioxide and water). Yet we have identified only 40-60% of these chemical effluents.
c) Pesticide manufacture. Byproducts account for almost 20% of DDT manufacture by weight. Many of these byproducts have never even been identified.
We don't know the names, structures, or toxicity of many of the chemical byproducts formed in industrial processes. Even though we phased out the purposeful manufacture of PCBs, they -- along with dioxins, an unwanted byproduct -- are still being introduced into the environment as side-effects of chlorine chemistry. To prevent global contamination with dioxin, we would need to phase out the whole class of organochlorines.
6. POLLUTION CONTROL AND DISPOSAL
End-of-pipe pollution control and disposal technologies do little to prevent global environmental contamination. If you manufacture a substance that breaks down slowly and tends to accumulate in living things, it will eventually spread throughout the living world. Scrubbers, filters, precipitators, incinerators, and landfills are all just ways of temporarily moving a substance from one location or form to another (a shell game). In the end, everything that persists will disperse into the air, water, land, and living things, and people will be affected. Landfills leak, incinerators generate toxic ash and gas, and even the best pollution controls are never 100% effective.
Thornton helps us realize that we are foolish to try to control chemicals with the end-of-the-pipe risk-assessment approach. Instead, we can use the precautionary principle and acknowledge that:
a) Some chemicals don't belong in the environment (zero discharge) and are best regulated away as entire classes of compounds;
b) With the right combination of carrots and sticks as motivation, industry can find clean technologies (clean production); and
c) The burden of proof (aka "reverse onus") can be placed on the industries that want to introduce new chemicals -- to show that they have done their best to understand the consequences of their actions -- thus motivating them to innovate and develop clean technologies. No data? No market.
* Tim Montague is Associate Director of Environmental Research Foundation. He holds an M.S. degree in ecology from University of Wisconsin-Madison and lives in Chicago.
A NEW WAY TO INHERIT ENVIRONMENTAL HARM
by Tim Montague*
New research shows that the environment is more important to health than anyone had imagined. Recent information indicates that toxic effects on health can be inherited by children and grandchildren, even when there are no genetic mutations involved. These inherited changes are caused by subtle chemical influences, and this new field of scientific inquiry is called "epigenetics."
Since the 1940s, scientists have known that genes carry information from one generation to the next, and that genes gone haywire can cause cancer, diabetes, and other diseases. But scientists have also known that genes aren't the whole story because identical twins -- whose genes are identical -- can have very different medical histories. One identical twin can be perfectly healthy while the other develops schizophrenia or cancer -- so the environment must play a significant role, not merely genes.
What's surprising is that scientists are now revealing that these environmental effects can be passed from one generation to the next by a process called "epigenetics," with far-reaching implications for human health. Epigenetics is showing that environmental influences can be inherited -- even without any mutations in the genes themselves -- and may continue to influence the onset of diseases like diabetes, obesity, mental illness and heart disease, from generation to generation.
In other words, the cancer you get today may have been caused by your grandmother's exposure to an industrial poison 50 years ago, even though your grandmother's genes were not changed by the exposure. Or the mercury you're eating today in fish may not harm you directly, but may harm your grandchildren.
This emerging field of epigenetics is causing a revolution in the understanding of environmental influences on health. The field is only about 20 years old, but is becoming well-established. In 2004, the National Institutes of Health granted $5 million to the Johns Hopkins Medical School in Baltimore to start the Center for Epigenetics of Common Human Disease.
The latest information appears in a new study by Michael Skinner and colleagues at Washington State University, published in the June 3, 2005 issue of Science magazine. Skinner found that mother rats exposed to hormone-mimicking chemicals during pregnancy gave birth to four successive generations of male offspring with significantly reduced fertility.[3] Only the first generation of mothers was exposed to a toxin, yet four generations later the toxic effect could still be detected.
Prior to this study, scientists had only been able to document epigenetic effects on the first generation of offspring. These new findings suggest that harm from toxins in the environment can be much longer lasting and pervasive than previously known because they can impact several generations.
And therefore a precautionary approach to toxics is even more important than previously believed.
Over the past sixty years doctors and scientists have pieced together a picture of the genetic basis for life and some of the genetic causes of human and animal disease. Genes regulate the production of proteins -- the essential building blocks of life. Genes are composed of a finite series of letters (a code made up of Cs, Ts, As, and Gs, each representing a nucleotide) embedded in long strands of DNA. DNA is the large molecule, composed of genes, that carries the genetic inheritance forward into the next generation.
There are approximately three billion 'letters' in the human genetic code. Science has long understood that when a gene mutates -- that is, when a typo is introduced -- it can have far-reaching effects for the cell, the tissue and the organism as a whole. For example, a genetic mutation caused by too much sun (ultraviolet radiation), could result in abnormal uncontrolled cell growth which could lead to skin cancer which could spread throughout your body. Stay in the shade and you reduce your risk.
But now scientists are seeing that disease can be passed from generation to generation without any genetic mutations. The DNA molecule itself gets another molecule attached to it, which changes the behavior of the genes without changing the genes themselves. The attachment of these additional molecules is caused by environmental influences -- but these influences can then be passed from one generation to the next, if they affect the germ cells, i.e., the sperm or the egg.
Scientists have, so far, discovered three different kinds of "epigenetic" changes that can affect the DNA molecule and thus cause inheritable changes. One of them involves the methyl group.
Scientists began to see direct connections between human diseases like cancer and these subtle epigenetic variations like methylation in 1983, when Andrew Feinberg and his colleagues at Johns Hopkins found that cancer cells had unusually low levels of DNA methylation.
A methyl group consists of one carbon atom and three hydrogen atoms. It attaches to a strand of DNA, altering the strand's three-dimensional structure and the behavior of specific genes within it. It turns out that methylation works like a volume control for the activity of individual genes. Whereas genetic mutations are typos and relatively easy to test for, epigenetic changes are analogous to the formatting of the text (e.g., font, size, and color) and are much less well understood. Over the past 20 years, Feinberg and many other cancer specialists have documented the widespread influence of epigenetics on the development of cancer in humans and laboratory animals.
So epigenetics is changing our traditional picture of common chemicals, like DDT. DDT is a powerful environmental toxin -- once it enters a living thing it mimics the behavior of natural hormones -- resulting in abnormal sexual and reproductive development. Widespread use of DDT in the 1940s and 1950s is associated with large-scale declines in some bird populations (like the peregrine falcon) because DDT causes birds' eggshells to thin, so the eggs crack before the embryo can develop into a chick.
When persistent environmental pollutants (like DDT) are phased out, we might be falsely lulled into believing that we have solved the problem. The thinking is logical -- remove the toxin from the environment and you get rid of the toxic effects. Not so, according to the findings of Skinner and his colleagues.
The Skinner study tells us that phasing out dangerous toxins doesn't end the problem -- because the damage done by exposures decades ago could still flow from generation to generation via epigenetic pathways.
Skinner and his colleagues treated groups of pregnant rats, some with methoxychlor and some with vinclozolin. Methoxychlor, a replacement for DDT, is a pesticide used on crops, on livestock, and in animal feed. Vinclozolin is a fungicide widely used in the wine industry. It is just one of a suite of widely used chemicals -- from flame retardants to ingredients in plastics -- that can cause reproductive abnormalities in laboratory animals.
Both methoxychlor and vinclozolin are known hormone disruptors. Male offspring of these pesticide-treated mothers had reduced fertility (lower sperm count, reduced sperm quality), which was not a surprising finding. The scientists then bred these offspring, and again the male offspring had reduced fertility. This came as a complete surprise. Over 90% of the male offspring in four generations of the test animals had reduced fertility.
Skinner's report concludes that genetic mutations are highly unlikely to produce such a strong signal in the treated animals and that DNA-methylation is the likely mechanism responsible for the observed decline in male fertility.
Treating the mother rats during pregnancy apparently re-programmed the genetic material in the male offspring so that all subsequent male offspring suffered lower fertility from this environmental factor.
Skinner believes that his findings in rats could help explain the dramatic rise in human breast and prostate cancers in recent decades, which may be partly due to the cumulative effects of multiple toxins acting over several generations.
Skinner acknowledges that the doses he gave his rats were high, compared to the doses humans might expect to receive from environmental exposures. He is continuing his rat experiments with lower doses now.
Of course all this new information makes the control of toxic chemicals even more important than previously thought. The health of future generations is at stake.
The development of epigenetics also greatly complicates toxicity testing and chemical risk assessment. Epigenetics tells us that much additional toxicity testing will be needed. So far, there are no standardized, government-approved protocols for conducting epigenetic tests. Until such protocols emerge (which could take years), and a great deal of expensive testing has been completed (requiring many more years), risk assessors will have to acknowledge that -- so far as epigenetics is concerned -- they are flying blind.