Wednesday, July 31, 2013

BPA exposure disrupts human egg maturation; may explain some fertility problems




IMAGE: Eggs examined in this study show a properly formed spindle structure with aligned chromosomes (image A) and spindles of various abnormal shapes with misaligned chromosomes after exposure to BPA.

As many as 20 percent of infertile couples in the United States have no identified cause for their infertility. Now, new research led by Catherine Racowsky, PhD, director of the Assisted Reproductive Technologies Laboratory at Brigham and Women's Hospital (BWH), shows that exposure to BPA (bisphenol A) could be one reason some of these couples have difficulty conceiving. The study will be published online on July 31, 2013 in the journal Human Reproduction.

"To our knowledge, this is the first study that has shown that BPA has a direct effect on egg maturation in humans," said Dr. Racowsky. "Because exposure to BPA is so ubiquitous, patients and medical professionals should be aware that BPA may cause a significant disruption to the fundamentals of the human reproductive process and may play a role in unexplained infertility."

The randomized trial examined 352 eggs from 121 consenting patients at a fertility clinic. The eggs, which would otherwise have been discarded, were exposed to one of three BPA concentrations (20 ng/ml, 200 ng/ml or 20 µg/ml) in the laboratory. One egg from each patient was left unexposed and served as a control. Researchers then examined the eggs and found that exposure to BPA caused:
  • A decrease in the percentage of eggs that matured.
  • An increase in the percentage of eggs that degenerated.
  • An increase in the percentage of eggs that underwent spontaneous activation, the abnormal process when an egg acts as though it has been fertilized, even though it has not been.
As the BPA dose increased, the likelihood of maturation decreased while the likelihood of degeneration and spontaneous activation increased. Additionally, among the mature eggs there was a significant trend toward fewer bipolar spindles and aligned chromosomes as the dose of BPA rose. The researchers note that these results are consistent with previous research on the impact of BPA exposure on animal eggs.
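
For readers curious how such a dose-response trend is summarized, here is a minimal Python sketch. The egg counts per dose group are hypothetical placeholders rather than the study's data; the code simply computes the rate of each outcome at each BPA concentration and the least-squares slope of that rate against log dose, which shows the direction of the trend.

    import math

    # Hypothetical egg counts per BPA dose group (ng/ml); not the study's data.
    groups = {
        20:    {"total": 60, "matured": 35, "degenerated": 8,  "activated": 4},
        200:   {"total": 60, "matured": 28, "degenerated": 12, "activated": 7},
        20000: {"total": 60, "matured": 18, "degenerated": 20, "activated": 12},
    }

    def rate(group, key):
        return group[key] / group["total"]

    def slope(xs, ys):
        # Ordinary least-squares slope of y on x.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = sum((x - mx) ** 2 for x in xs)
        return num / den

    log_doses = [math.log10(dose) for dose in groups]
    for outcome in ("matured", "degenerated", "activated"):
        rates = [rate(g, outcome) for g in groups.values()]
        s = slope(log_doses, rates)
        direction = "decreases" if s < 0 else "increases"
        print(f"{outcome:12s} rate {direction} with log dose (slope {s:+.3f})")

With counts like these, maturation falls while degeneration and spontaneous activation rise as the dose increases, mirroring the pattern the study reports; the actual analysis applied formal statistical trend tests to the real egg counts.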

Racowsky said, "Our data show that BPA exposure can dramatically inhibit egg maturation and add to a growing body of evidence about the impact of BPA on human health. I would encourage further research to gain a greater understanding of the role BPA plays in infertility."

------------------------------------------------------------------------
For more stories on health, pollution, chemical exposure and improving your indoor air quality visit www.allerair.com or call to speak to an air quality expert about improving the air in your home 1-888-852-8247.

Monday, July 29, 2013

Pesticides not contained on farmland; scientists find contaminated frogs far from fields

Photo: freedigitalphotos.net
Pesticides commonly used in California's Central Valley, one of the world's most productive agricultural regions, have been found in remote frog populations miles from farmland. Writing in Environmental Toxicology and Chemistry, researchers demonstrate the contamination of Pacific tree frogs in remote mountain areas, including national parks, supporting past research on the long-distance transport of pesticides by wind and rain.

California's Central Valley is one of the most intensely farmed regions in North America, producing 8% of U.S. agricultural output by value. While the use of pesticides such as triazines, endosulfan and organophosphates is common across the U.S., California uses more pesticides than any other state.

"Our results show that current-use pesticides, particularly fungicides, are accumulating in the bodies of Pacific chorus frogs in the Sierra Nevada," said Kelly Smalling a research hydrologist from the U.S. Geological Survey. "This is the first time we've detected many of these compounds, including fungicides, in these remote locations."

The Pacific chorus frog (Pseudacris regilla) can be found in abundance across the state's Sierra Nevada mountain range. As with other amphibians, agrochemicals potentially pose a threat to chorus frogs: exposure to pesticides can suppress their immune systems, increasing the risk of disease.

The team collected frogs, as well as water and sediment samples, from seven ponds ranging from Lassen Volcanic National Park, at the northernmost end of the Central Valley, to the Giant Sequoia National Monument at the valley's southern extent. All sites were downwind of agricultural areas.

"The samples were tested for 98 types of pesticides, traces of which were found in frog tissues from all sites," said Smalling. "We found that even frogs living in the most remote mountain locations were contaminated by agricultural pesticides, transported long distances in dust and by rain."

Two fungicides commonly used in agriculture, pyraclostrobin and tebuconazole, and one herbicide, simazine, were the most frequently detected compounds; this is the first time these compounds have been reported in wild frog tissue. Another commonly detected pesticide was DDE (dichlorodiphenyldichloroethylene), a breakdown product of DDT, which was banned in the United States in 1972. The continued presence of a DDT byproduct reveals how long this banned chemical can continue to affect wildlife.

A comparison of the frog tissue with water and sediment collected from the same sites shows that the frogs were the more reliable indicator of chemical exposure. This is partly due to the physical-chemical properties of the compounds and partly due to biological influences such as organism-specific metabolism and life history. Documenting the occurrence of these compounds is an important first step in determining the health consequences associated with these exposures.
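
As a rough illustration of that comparison, the short Python sketch below tallies how often compounds are detected in each sample type. The detections are invented for illustration only; the calculation (detections divided by analyses) is simply the kind of summary used to judge which matrix best reflects exposure.

    # Hypothetical detections by sample type; True = compound detected at a site.
    detections = {
        "frog tissue": {"pyraclostrobin": [True, True, True, False, True],
                        "tebuconazole":   [True, True, False, True, True],
                        "simazine":       [True, False, True, True, False]},
        "water":       {"pyraclostrobin": [False, False, True, False, False],
                        "tebuconazole":   [False, False, False, False, True],
                        "simazine":       [True, False, False, False, False]},
        "sediment":    {"pyraclostrobin": [False, True, False, False, False],
                        "tebuconazole":   [False, False, False, False, False],
                        "simazine":       [False, False, True, False, False]},
    }

    for matrix, compounds in detections.items():
        hits = sum(sum(sites) for sites in compounds.values())
        total = sum(len(sites) for sites in compounds.values())
        print(f"{matrix:12s} detection frequency: {hits}/{total} = {hits/total:.0%}")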

"Very few studies have considered the environmental occurrence of pesticides, particularly fungicides which can be transported beyond farmland," concluded Smalling. "Our evidence raises new challenges for resource managers; demonstrating the need to keep track of continual changes in pesticides use and to determine potential routes of exposure in the wild."

-------------------------------------------------------------------------
For more stories on chemical exposure and improving your indoor air quality visit www.allerair.com or call to speak to an air quality expert about improving the air in your home 1-888-852-8247.

Friday, July 26, 2013

Smoking during pregnancy associated with offspring behavior problems, study suggests

Smoking during pregnancy appears to be a prenatal risk factor associated with conduct problems in children, according to a study published by JAMA Psychiatry.
Conduct disorder represents an issue of significant social, clinical, and practice concern, with evidence highlighting increasing rates of child conduct problems internationally. Maternal smoking during pregnancy is known to be a risk factor for offspring psychological problems, including attention deficits and conduct problems, the authors write in the study background.

Professor Gordon Harold and Dr. Darya Gaysina, of the University of Leicester, with colleagues in the United States and New Zealand, examined the relationship between maternal smoking during pregnancy and offspring conduct problems among children raised by genetically related mothers and genetically unrelated mothers.

"Our findings suggest an association between pregnancy smoking and child conduct problems that is unlikely to be fully explained by postnatal environmental factors (i.e., parenting practices) even when the postnatal passive genotype-environment correlation has been removed." The authors conclude, "The causal explanation for the association between smoking in pregnancy and offspring conduct problems is not known but may include genetic factors and other prenatal environmental hazards, including smoking itself."

------------------------------------------------------------------------------
For more stories on second-hand smoke or our air purifiers for tobacco smoke, visit www.allerair.com or call to speak to an air quality expert about improving the air in your home 1-888-852-8247.

Thursday, July 25, 2013

Bad sleep around the full moon not a myth, say scientists

Photo: Imagerymajestic/freedigitalphotos.net

Many people complain of poor sleep around the full moon. Scientists at the University of Basel in Switzerland now report evidence that lunar cycles and human sleep behavior are in fact connected.

A research group at the Psychiatric Hospital of the University of Basel analyzed the sleep of more than 30 volunteers in two age groups in the lab. While the volunteers slept, the scientists monitored their brain patterns and eye movements and measured their hormone secretions. The findings suggest that even today, despite the comforts of modern life, humans still respond to the geophysical rhythms of the moon.

Short And Poor Sleep


The data show that both the subjective and the objective quality of sleep changed with the lunar cycle. Around the full moon, brain activity in the areas related to deep sleep dropped by 30 percent, people took five minutes longer to fall asleep, and they slept 20 minutes less overall. The volunteers felt that their sleep had been poorer around the full moon, and they showed lower levels of melatonin, a hormone that regulates sleep and wake cycles.

"This is the first reliable evidence that lunar rhythm can modulate sleep structure in humans", Cajochen says.

A Relic From The Past 

According to the researchers, this circalunar rhythm might be a relic from a past era, when the moon helped synchronize human behavior. Such synchronization is well documented in other animals, especially marine species, in which moonlight coordinates reproductive behavior. Today, other features of modern life, such as electric light, mask the moon's influence on us. However, the study shows that in the controlled environment of a laboratory with a strict study protocol, the moon's hold over us can be made visible and measurable again.

The results have been published in the journal Current Biology. 

--------------------------------------------------------------
For more stories on health, pollution, chemical exposure and improving your indoor air quality visit www.allerair.com or call to speak to an air quality expert about improving the air in your home 1-888-852-8247.

Wednesday, July 24, 2013

Researchers find enzyme that could hold the key for asthma sufferers

An enzyme known for its role in heart disease may well be a promising target to treat asthma. Researchers from the University of Iowa have found that the enzyme, called CaMKII, is linked to the harmful effects of oxidation in the respiratory tract, triggering asthmatic symptoms. The finding could lead to the development of a drug that would target the CaMKII enzyme, the researchers say.

Asthma affects hundreds of millions of people worldwide. In the United States, 8.5 percent of the population has asthma, which causes 3,000 deaths and more than $56 billion in medical and lost-work costs annually, according to the federal Centers for Disease Control and Prevention. Despite its toll on health and productivity, treatment options remain largely confined to steroids, which can have harmful, even life-threatening, side effects for those with severe cases.

Current treatments don’t work well, noted Mark Anderson, professor and chair in internal medicine at the UI and a co-corresponding author on the paper, published July 24 in the journal Science Translational Medicine.

“It’s a kind of an epidemic without a clear, therapeutic option," Anderson says. "The take-home message is that inhibiting CaMKII appears to be an effective anti-oxidant strategy for treating allergic asthma."

Anderson and co-corresponding author Isabella Grumbach knew from previous work that the CaMKII enzyme played a role in the oxidation of heart muscle cells, which can lead to heart disease and heart attacks. The scientists surmised the same enzyme may affect oxidation in the respiratory system as well.

The team first tested the enzyme in airway muscle cells, but to little effect. They then tried to block the enzyme in the airway lining (epithelial) cells. They noticed that mice with the blocked enzyme had less oxidized CaMKII, no airway muscle constriction and no asthma symptoms. Similarly, mice without the blocked enzyme showed high “oxidative stress,” meaning lots of oxidized enzymes in the epithelial cells, a constricted airway and asthma symptoms.

“[The study] suggests that these airway lining cells are really important for asthma, and they’re important because of the oxidative properties of CaMKII,” says Anderson, whose primary appointment is in the Carver College of Medicine. “This is completely new and could meet a hunger for new asthma treatments. Here may be a new pathway to treat asthma.”

"Ten years ago, not much was known about what CaMKII does outside of nerve cells and muscle cells in the heart," says Grumbach, associate professor in internal medicine at the UI. "My lab has worked on investigating its function mainly in blood vessels with the long-term goal to use blockers of CaMKII to treat common diseases. We are constantly finding that CaMKII is interesting and important."

The researchers also took tissue samples from the airways of patients with asthma. True to their hypothesis, they found more oxidized enzymes in those patients than in healthy individuals. Taking a step further, the team found that mild asthma patients who inhaled an allergen had a spike in oxidized CaMKII in the epithelial cells just a day later.

“We have this very compelling association,” Anderson says, adding that more studies in patients are needed to validate the approach.

----------------------------------------------------------------------------
 For more stories on asthma, pollution, chemical exposure and improving your indoor air quality visit www.allerair.com or call to speak to an air quality expert about improving the air in your home 1-888-852-8247.

Tuesday, July 23, 2013

Scientists find environmental toxins in the brain tissue of polar bears

Photo: Hal Brindley

Scientists from Denmark and Canada are worried by their new findings showing that several bioaccumulative perfluoroalkyl substances (PFASs) are crossing the blood-brain barrier of polar bears from Scoresby Sound, East Greenland.

(Source: Aarhus University)
Perfluoroalkyl substances (PFASs) have been used in a wide variety of commercial and industrial products over the past six decades. Applications include water- and oil-repellent coatings (e.g., for textiles, paper products, carpets and food packaging), pharmaceuticals, and surfactants in cleaning products and fire-fighting foams. PFASs are highly resistant to chemical, thermal and biological degradation.

PFASs and their precursor compounds have increased and dispersed dramatically around the world over the past four decades, and an increasing amount of information is becoming available on their toxicity. Studies have documented toxic effects of PFASs on wildlife and human health, including carcinogenesis, genotoxicity and epigenetic effects, as well as reproductive and developmental toxicity, neurotoxicity, endocrine disruption and immunotoxicity.

Bioaccumulative PFASs enter all parts of the brain

Despite the fact that the liver is considered the major repository in the body for most PFASs, some shorter chain compounds from this grouping have previously been reported in the brain of chicken embryos, suggesting that they are able to cross the blood–brain barrier.

Previous studies have shown dramatic biomagnification of several PFASs in polar bears, particularly perfluorooctane sulfonate (PFOS) and several compounds of the perfluorinated carboxylate (PFCA) group. PFOS has been found in polar bear livers at concentrations roughly 100-fold higher than in the ringed seals on which the bears prey. In a new study, Arctic researchers from Carleton University in Canada and Aarhus University in Denmark used the polar bear as a sentinel species for humans and other predators at the top of the food chain. The researchers demonstrated accumulation of PFOS and several PFCAs in eight brain regions of polar bears collected from Scoresby Sound, East Greenland. Dr. Robert Letcher, Carleton University, explains:

”We know that fat-soluble contaminants are able to cross the blood-brain barrier, but it is quite worrying that PFOS and the PFCAs, which are more associated with proteins in the body, were present in all the brain regions we analyzed.”

Professor Rune Dietz, Aarhus University, is also worried about the results:

“If PFOS and PFCAs can cross the blood-brain barrier in polar bears, it will also be the case in humans. The brain is one of the most essential parts of the body, where anthropogenic chemicals can have a severe impact. However, we are beginning to see the effect of the efforts to minimize the dispersal of this group of contaminants.”

Different functional parts of the Greenland polar bear brain were investigated for transfer of contaminants over the blood-brain barrier. The inner regions of the brain closer to incoming blood flow (pons/medulla, thalamus, and hypothalamus) contained consistently higher concentrations of perfluorooctane sulfonate (PFOS) and several perfluorinated carboxylates (PFCAs) compared to outer brain regions (cerebellum, striatum, and frontal, occipital, and temporal cortices). Photo: Rune Dietz, Aarhus University.
Select environmentally labeled products

The eight-carbon-chain compounds PFOS and perfluorooctane carboxylate (PFOA) are PFASs that have been phased out and are no longer produced in the Western world. However, production in China, today the only known production source of PFOS and PFOA, has increased roughly tenfold since the phase-out in the USA, and no emission inventory is yet available for the region. Furthermore, replacements for PFOS and PFOA, which generally have shorter or branched perfluorinated carbon chains, are now produced and marketed in countries such as the USA and China.

Another recent study from Aarhus University documents that PFOS concentrations in Greenlandic polar bears and ringed seals started to decline after 2006. Other wildlife populations closer to the sources in Europe and North America have shown a decline prior to the Greenlandic animals. Rune Dietz comments:

”It is promising to see that PFASs are on the decline. This development should be encouraged by the authorities globally. In the meantime, my best advice to consumers is to go for environmentally labeled products. But avoiding them is difficult, because PFASs are so widespread in many kinds of products and they are rarely declared.”


-----------------------------------------------------

For more stories on chemical exposure, health and improving your indoor air quality visit www.allerair.com or call to speak to an air quality expert about improving the air in your home 1-888-852-8247.




Friday, July 19, 2013

Genetic mutation linked to severe obesity

Photo: Michelle Meiklejohn/freedigitalphotos.net
Researchers at Boston Children’s Hospital have identified a genetic cause of severe obesity that, though rare, raises new questions about weight gain and energy use in the general obese population. The research, published in the journal Science on July 19, involved genetic surveys of several groups of obese humans and experiments in mice.

Mice with the genetic mutation gained weight even while eating the same amount of food as their normal counterparts; the affected gene, Mrap2, has a human counterpart (MRAP2) and appears to be involved in regulating metabolism and food consumption.

“These mice aren’t burning the fat, they’re somehow holding onto it,” says the study’s lead investigator Joseph Majzoub, MD, chief of endocrinology at Boston Children’s. “Mice with the genetic mutation gained more weight, and we found similar mutations in a cohort of obese humans.”

The protein created by the Mrap2 gene appears to facilitate signaling to a receptor in the brain called Mc4r, which helps increase metabolism and decrease appetite as part of a larger signaling chain involved in energy regulation. Fat cells produce the hormone leptin, prompting receptors in the brain to instigate production of a second hormone, αMSH. Mc4r detects this hormone with the aid of Mrap2, leading to a decrease in appetite and weight. Mutations in this signaling chain, including mutations in Mc4r, are known to increase the likelihood of obesity.

Majzoub, first author Masato Asai, MD, PhD, now at Nagoya University in Japan, and colleagues studied mice with the Mrap2 gene knocked out both overall and just in the brain. In both cases, the mice grew to about twice their normal size. Weight gain was greatest when both copies of Mrap2 were knocked out, but the mice still showed weight gain and appetite increase with one working copy of the gene. The weight gain was more pronounced in males than females. In addition, the mice without Mrap2 had more exaggerated weight gain when fed a high-fat diet than normal mice.

Surprisingly, while the mice without Mrap2 didn’t eat more at first, they still gained weight faster than the controls. Later, their appetites increased and they continued to gain more weight than the controls, even when held to the same diet and quantity of food. In the end, the mutant mice had to be underfed by 10 to 15 percent to show the same weight gain as their normal peers. As soon as they were let off the restricted diet, their weight gain increased.

To investigate the gene in humans, Majzoub collaborated with Sadaf Farooqi, MD, PhD, of the University of Cambridge, and others to examine groups of obese patients from around the world. Among the roughly 500 patients screened, the team found four mutations in the human equivalent of Mrap2, all in patients with severe, early-onset obesity; each of the four affected patients had only one copy of the mutation.

While the finding suggests that these rare mutations directly cause obesity in less than 1 percent of the obese population, the researchers suspect that other mutations in the gene might occur more commonly and might interact with other mutations and environmental factors to cause more common forms of obesity. “We found other mutations that weren’t as clearly damaging to the gene,” notes Majzoub. “It’s possible that some of these more common mutations actually are pathogenic, especially in combination with other genes in the same pathway.”

One intriguing theory, called the thrifty-gene hypothesis, holds that rare mutations in genes like Mrap2 exist because they gave humans an evolutionary advantage in times of severe famine. Further investigation into how these mutations work may lend insight into the body’s mechanisms for energy storage and use. In the present study, the lab did not observe anything to explain why the mutant mice were storing more food energy, such as a difference in activity level or heat output.

Majzoub and his colleagues look forward to expanding the scope of the research, studying additional populations of obese people, including measures of their activity and diet, as well as further exploring how the gene alters energy balance.
-------------------------------------------------------------------------------
For more stories on health, pollution, chemical exposure and improving your indoor air quality visit www.allerair.com or call to speak to an air quality expert about improving the air in your home 1-888-852-8247.

Thursday, July 18, 2013

Pathogen that caused the Irish potato famine even more virulent now

Photo: Evgeni Dinev
The plant pathogen that caused the Irish potato famine in the 1840s lives on today with a different genetic blueprint and an even larger arsenal of weaponry to harm and kill plants.

In a study published in the journal Nature Communications, North Carolina State University plant pathologist Jean Ristaino and colleagues Mike Martin and Tom Gilbert from the University of Copenhagen compared the genomes, or sets of all genes, of five 19th century strains of the Phytophthora infestans pathogen with modern strains of the pathogen, which still wreaks havoc on potatoes and tomatoes.

The researchers found that the genes in historical plant samples collected in Belgium in 1845, as well as other samples collected from varied European locales in the late 1870s and 1880s, were quite different from modern-day P. infestans genes, including some genes in modern strains that make the pathogen more virulent than the historical strains.

In one example, a certain gene variant, or allele, called AVR3a that was not virulent in the historical samples was shown to be virulent in the modern-day samples.

“The genetic blueprints, or genotypes, of the historical strains were distinct from modern strains, and genes related to infection were also quite different,” Ristaino says. “In the areas of the genome that today control virulence, we found little similarity with historical strains, suggesting that the pathogen has evolved in response to human actions like breeding more disease-resistant potatoes.”

Some of the differences between the European historical samples from the 1840s and those from the 1870s and 1880s suggest that the pathogen was brought to Europe more than once, debunking the theory that it was introduced once and then expanded its range. Ristaino believes it was introduced to Europe multiple times, probably on ships from South America.

P. infestans caused massive and debilitating late-blight disease outbreaks in Europe, leaving starvation and migration in its wake after ravaging Ireland in the mid-to-late 1840s. Ristaino’s previous work pointed the finger at the 1a strain of P. infestans as the Irish potato-famine pathogen and traced its probable origin to South America.

An estimated $6.2 billion is spent each year on crop damage and attempts to control the pathogen, Ristaino says.

“Late blight is still a major threat to global food security in the developing world,” she adds. “Knowing how the pathogen genome has changed over time will help modern-day farmers better manage the disease.”
------------------------------------------------------------------------------------------
 For more stories on health, pollution, chemical exposure and improving your indoor air quality visit www.allerair.com or call to speak to an air quality expert about improving the air in your home 1-888-852-8247.

Wednesday, July 17, 2013

Antibiotic-Resistant Bacteria Widespread in Hudson River

Photo: Renjith Krishnan
The risk of catching some nasty germ in the Hudson River just started looking nastier.  Disease-causing microbes have long been found swimming there, but now researchers have documented antibiotic-resistant strains in specific spots, from the Tappan Zee Bridge to lower Manhattan. The microbes identified are resistant to ampicillin and tetracycline, drugs commonly used to treat ear infections, pneumonia, salmonella and other ailments. The study is published in the current issue of the Journal of Water and Health.
“If you find antibiotic-resistant bacteria in an ecosystem, it’s hard to know where they’re coming from,” said study co-author Andrew Juhl, a microbiologist at Columbia University’s Lamont-Doherty Earth Observatory. “In the Hudson, we have a strong case to make that it’s coming from untreated sewage.”
On repeated visits to 10 locations on the Hudson, the researchers found microbes resistant to ampicillin 84 percent of the time, and resistant to tetracycline 38 percent of the time. The stretches harboring the most sewage-indicator bacteria also generally contained the most antibiotic-resistant ones. These were led by Flushing Bay, near LaGuardia Airport, followed by Newtown Creek, on the border of Brooklyn and Queens; sewage outfall pipes near Piermont Pier in Rockland County, N.Y.; West 125th Street in Manhattan; and Yonkers, in Westchester County, N.Y. The antibiotic-resistant bacteria found include potentially pathogenic strains of the genera Pseudomonas, Acinetobacter, Proteus and Escherichia.
“They could be difficult to treat in people with compromised immune systems,” said Dr. Stephen Morse, an infectious disease epidemiologist at Columbia’s Mailman School of Public Health, who was not involved in the study.  “If I were inclined to swim in the Hudson, quite truthfully I’d look to this paper for the places to stay away from.” 
Though people routinely catch infections while swimming, only severe illnesses are typically treated with antibiotics. And an antibiotic-resistant infection would be noted only if the illness failed to respond to treatment--a scenario that probably happens but is not well documented or reported, said Morse. One exception was an outbreak on the island of Borneo in 2000, when 32 athletes competing in a swimming event in the Segama River came down with leptospirosis. Transmitted by animal urine, the infection is marked by fever, chills and pink eye.
Previous studies in the Hudson have shown that microbe counts go up after heavy rains, when raw sewage is commonly diverted into the river. Some 27 billion gallons of raw sewage and rainwater are released into the Hudson each year by wastewater treatment plants. 
Antibiotic resistance has become a public health crisis. About 100,000 people die each year from hospital-acquired infections, most of which are due to antibiotic-resistant pathogens, according to the Infectious Diseases Society of America.  Superbugs resistant to methicillin kill about 19,000 people each year, more than HIV/AIDS. The development of resistance has been linked to overuse of antibiotics to treat minor infections in humans, and to industrial feedlots, where low levels of antibiotics are fed to chicken, cattle and pigs to promote growth and prevent infection. The Natural Resources Defense Council estimates that 80 percent of antibiotics in the U.S. are fed to livestock.

Tuesday, July 16, 2013

Deepwater Horizon Debris Likely Source of Gulf of Mexico Oil Sheens

Photo: freedigitalimages.net
A chemical analysis of oil sheens found floating recently at the ocean’s surface near the site of the Deepwater Horizon disaster indicates that the source is pockets of oil trapped within the wreckage of the sunken rig. Both the Macondo well and natural oil seeps common to the Gulf of Mexico were confidently ruled out.

Researchers from Woods Hole Oceanographic Institution (WHOI) and the University of California, Santa Barbara (UCSB) used a recently-patented method to fingerprint the chemical makeup of the sheens and to estimate the location of the source based on the extent to which gasoline-like compounds evaporated from the oil sheens. The study was published online this week in Environmental Science & Technology.

The oil sheens were first reported to the United States Coast Guard by BP in mid-September 2012, raising public concern that the Macondo well, which was capped in July 2010, might be leaking.

“It was important to determine where the oil was coming from because of the environmental and legal concerns around these sheens. First, the public needed to be certain the leak was not coming from the Macondo well, but beyond that we needed to know the source of these sheens and how much oil is supplying them so we could define the magnitude of the problem,” said WHOI chemist Chris Reddy.

When oil sheens appear on the ocean surface, how do researchers determine where the oil is coming from? Every oil sample contains chemical clues pointing to the reservoir it came from, allowing scientists to compare it to other samples to determine if they share a common source. The lead scientists Chris Reddy (WHOI) and Dave Valentine (UCSB) were aptly prepared to investigate these sheens. They have worked on the Deepwater Horizon for much of the last three years, investigating a wide range of problems from the composition of the oil, detection of subsurface plumes, the biodegradation of the oil, the fate of the dispersants, and the chemical transition from floating oil slicks to sunken tar balls.

“Because of our ongoing funding from the National Science Foundation, we were prepared to interrogate the source of mysterious oil sheens in the Gulf of Mexico,” said Valentine. “We’ve been exploring new ways to do this for several years in the context of natural seeps and this event provided us an opportunity to apply our fundamental advances to a real-world problem. This is a classic case where fundamental science finds a real world application.”

For this research, the team analyzed 14 sheen samples skimmed from the sea surface during two trips to the Gulf of Mexico. Using comprehensive two-dimensional gas chromatography (GCxGC), a technique developed in Reddy’s lab, the researchers first confirmed the sheens contained oil from the Macondo well. But the sheen samples also contained trace amounts of “olefins”—industrial chemicals that are used in drilling operations—compounds that were absent from the sample taken directly from the Macondo well. The presence of “olefins” provided a fingerprint for the sheens that the authors could compare to the library of samples they had analyzed over the past three years.

The “olefins” are not found in crude oil and their uniform distribution in the sheens indicated that the Macondo well was unlikely to be the source. The team surmised that the sheens must be coming from equipment exposed to olefins during drilling operations. Reddy’s lab holds a patent on measuring olefins in crude oils.

“The occurrence of these man-made olefins in all of our sheen samples points to a single main source which contains both Macondo oil and lesser amounts of the drilling fluids that harbor the olefins,” said Valentine. “This pointed us to the wreckage of the rig, which was known to have both, as the most likely source for the sheens.”

It also meant the olefins could be used by the team as a fingerprint in the search for the source of the leak.
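
The presence/absence reasoning involved can be sketched in a few lines of Python. The marker sets below are hypothetical stand-ins for the GCxGC results, not actual peak lists; the point is simply that a sheen carrying both Macondo-type hydrocarbons and drilling-fluid olefins is inconsistent with any candidate source lacking one of the two signatures.

    # Hypothetical marker signatures; real fingerprints come from GCxGC peak data.
    SHEEN_MARKERS = {"macondo_biomarkers", "drilling_olefins"}

    candidate_sources = {
        "Macondo well (fresh crude)": {"macondo_biomarkers"},
        "natural seep":               {"seep_biomarkers"},
        "cofferdam oil":              {"macondo_biomarkers"},
        "rig wreckage (oil + mud)":   {"macondo_biomarkers", "drilling_olefins"},
    }

    for source, markers in candidate_sources.items():
        verdict = "consistent" if SHEEN_MARKERS <= markers else "ruled out"
        missing = SHEEN_MARKERS - markers
        note = f" (missing: {', '.join(sorted(missing))})" if missing else ""
        print(f"{source:28s} -> {verdict}{note}")

Only the candidate carrying both signatures survives this kind of screen, which is essentially the conclusion the team reached after comparing the sheens with the cofferdam oil and the floating rig debris.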

The research team compared the sheen samples to other field samples, some of which they expected would contain olefins and some they thought would not. The reference samples included two pieces of debris from the Deepwater Horizon, found floating in early May 2010, as well as oil collected by BP in October 2012 during an inspection of the 80-ton cofferdam that had been abandoned at the sea floor after its use in a failed attempt to cover the Macondo well in 2010.

The team’s GCxGC analysis of BP’s cofferdam samples definitively showed the cofferdam was not the sole source of the leak – there were no olefins present. Prior to the analysis, the cofferdam had become the prime suspect as the source when BP found small amounts of oil leaking from the top of the cofferdam. BP acquired oil samples from this leak point before sealing the leak, thinking they had resolved the problem. However, the sheens on the sea surface persisted, and the lack of olefins pointed to another source entirely.

When Reddy and Valentine compared the chemical makeup of the sheens with Deepwater Horizon debris found floating in 2010, they found a match. That debris, which came from the rig itself, was coated with oil and contaminated with drilling mud olefins.

”The ability to fingerprint synthetic hydrocarbons allowed us to crack this case. We were able to exclude a number of suspects and match the olefin fingerprint in the new oil slicks to that of the wreckage from the sunken rig,” said Valentine.

The chemical analysis also told researchers which sheens had surfaced more recently than others, allowing them to reconstruct a trajectory for local ocean currents that pointed back to the source of the oil. By looking for sheens that showed the least amount of evaporation, they determined that oil surfaced closer to the Deepwater Horizon wreckage than the cofferdam site.

To explain how the oil might be trapped and released from the wreckage, the scientists point out that when the Deepwater Horizon rig sank, it was holding tanks containing hundreds of barrels of a mixture of drilling mud and oil. Over time, corrosive seawater can create small holes for oil to slowly escape to the surface. The researchers suspect that the containers on the Deepwater Horizon rig holding entrapped oil may be the source of the recent oil sheen.

“This study shows that drilling mud olefins are a powerful forensic tool to investigate the source of oil sheens,” the authors wrote.

In the process of their research, Reddy and Valentine operated with transparency, alerting the government and BP to their research plans. They believe this fostered a collegial relationship that ultimately improved their research. Throughout the effort to find the source of the oil, there was keen interest in the research both by BP and federal agencies; all stakeholders sought to determine the source of the oil, though each required a different level of certainty about the results, and each had different questions to answer. Reddy and Valentine required great certainty in their results before publishing them, but recognized that their preliminary data could be useful to others who could use it to take immediate action with less certainty.

“We had a fruitful exchange and developed a collegial relationship with both BP and the government. Both provided us with data, and in turn we gave them a preview of our findings with no strings attached,” said Valentine.

At the end of the day, Reddy said, “This is a good study, but the long-lasting impacts of this effort were highlighting that academia can play a useful role during a crisis. We can be unbiased and collaborative without losing our integrity. What is lost on many of our colleagues is that interacting with representatives of the government and BP provided advice and input that improved our research. This is a win-win.”

In addition to Reddy and Valentine, the research team consisted of WHOI postdoctoral investigator Christoph Aeppli, UCSB Postdoctoral Scholar Matthias Kellermann, and Robert K. Nelson, a member of the WHOI technical staff.

This research was funded by the National Science Foundation, the Gulf of Mexico Research Initiative DEEP-C Consortium, Woods Hole Oceanographic Institution, and the Swiss National Science Foundation Postdoctoral Fellowship.

Monday, July 15, 2013

Long-forgotten underwater seawall protected some New Jersey homes from Hurricane Sandy

Photo: Jennifer Irish
Picture two residential beach communities on the New Jersey shore: Bay Head and Mantoloking, which sit side-by-side in Ocean County on a narrow barrier island that separates the Atlantic Ocean and Barnegat Bay.

Before Hurricane Sandy landed on Oct. 29, 2012, a motorist traveling north on Ocean Avenue would seamlessly travel through Mantoloking into Bay Head, noticing few changes in residential development, dunes, beaches, and shoreline.

The difference was hidden under the sand.

A forgotten, 1,260-meter seawall buried beneath the beach helped Bay Head weather Sandy's record storm surges and large waves over multiple high tides, according to a team of engineers and geoscientists led by Jennifer L. Irish, an associate professor of civil and environmental engineering in the College of Engineering at Virginia Tech and an authority on storm surge, tsunami inundation, and erosion.

The stone structure dates back to 1882. Its reappearance surprised many area residents, underscoring the difficulties transient communities have in planning for future threats at their shores, the researchers said.

"It's amazing that a seawall built nearly 150 years ago, naturally hidden under beach sands, and forgotten, should have a major positive effect under the conditions in which it was originally designed to perform," said H. Richard Lane, program director in the National Science Foundation's (NSF) Division of Earth Sciences, which funded the research. "This finding should have major implications for planning, as sea level rises and storms increase in intensity in response to global warming."

Air purifiers for airborne mold, particles, chemicals and odors

The discovery illustrates the need for multi-levels of beach protection in oceanfront communities, the researchers said.

"Once we got there, we immediately saw the seawall," Irish said. "The beach and dunes did their job to a certain point, then, the seawall took over, providing significant dampening of the waves. It was the difference between houses that were flooded in Bay Head and houses that were reduced to piles of rubble in Mantoloking."

All oceanfront homes in the two boroughs were damaged, ranging from ground-floor flooding to complete destruction. As measured by water lines on the interior of homes, flooding was similar in both boroughs. The difference was the extent of the storm's impact.

In Mantoloking, the entire dune almost vanished. Water washed over the barrier spit and opened three breaches of 165 meters, 59 meters, and 35 meters, where the land was swept away. In Bay Head, only the portion of the dune located seaward of the seawall was eroded and the section of dune behind the seawall received only minor local scouring.

Later, using Google Earth to evaluate aerial images taken two years before and immediately after Hurricane Sandy, the research team assessed the houses, labeling a structure with a changed roofline as damaged, a structure that no longer sat on its foundation as destroyed, and the remaining houses as flooded.

The researchers classified 88 percent of the oceanfront homes in Bay Head as flooded, with just one oceanfront home destroyed. In Mantoloking, more than half of the oceanfront homes were classified as damaged or destroyed.
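
Those classification rules translate directly into code. The short Python sketch below applies the same three labels to made-up house records and tallies the share in each category; the records are illustrative, not the study's survey data.

    from collections import Counter

    # Each record: (borough, roofline_changed, off_foundation); values are invented.
    houses = [
        ("Bay Head", False, False), ("Bay Head", False, False),
        ("Bay Head", True,  False), ("Mantoloking", True, False),
        ("Mantoloking", False, True), ("Mantoloking", True, True),
    ]

    def classify(roofline_changed, off_foundation):
        if off_foundation:
            return "destroyed"
        if roofline_changed:
            return "damaged"
        return "flooded"   # still on its foundation, roofline intact

    tally = Counter((borough, classify(r, f)) for borough, r, f in houses)
    for borough in ("Bay Head", "Mantoloking"):
        total = sum(n for (b, _), n in tally.items() if b == borough)
        for label in ("flooded", "damaged", "destroyed"):
            n = tally[(borough, label)]
            print(f"{borough:12s} {label:10s} {n}/{total} = {n/total:.0%}")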

Despite the immense magnitude and duration of the storm, a relatively small coastal obstacle reduced potential wave loads by a factor of two and was the difference between widespread destruction and minor structural impacts, the researchers said.

"We have a great deal of compassion for the people who have had to endure the devastation of Hurricane Sandy in Bay Head and Mantoloking," Irish said. "It will have little solace, but we are left with a clear, unintentional example of the need for multiple levels of defense that include hard structures and beach nourishment to protect coastal communities."

Friday, July 12, 2013

One third of world's population benefits from effective tobacco control measure

At 2.3 billion, the number of people worldwide covered by at least one life-saving measure to limit tobacco use has more than doubled in the last five years, according to the WHO Report on the Global Tobacco Epidemic 2013. The number of people covered by bans on tobacco advertising, promotion and sponsorship, the focus of this year’s report, increased by almost 400 million people residing mainly in low- and middle-income countries.

Furthermore, the Report shows that 3 billion people are now covered by national anti-tobacco campaigns. As a result, hundreds of millions of nonsmokers are less likely to start.
However, the Report notes, to achieve the globally agreed target of a 30% reduction of tobacco use by 2025, more countries have to implement comprehensive tobacco control programmes.
Air Purifiers for Tobacco Smoke


Re-enforce bans on tobacco advertising, promotion and sponsorship worldwide

Bans on tobacco advertising, promotion and sponsorship are one of the most powerful measures to control tobacco use. As of today, 24 countries with 694 million people have introduced complete bans and 100 more countries are close to a complete ban. However, 67 countries currently do not ban any tobacco advertising, promotion and sponsorship activities or have a ban that excludes advertising in national broadcast and print media.
“If we do not close ranks and ban tobacco advertising, promotion and sponsorship, adolescents and young adults will continue to be lured into tobacco consumption by an ever-more aggressive tobacco industry,” says WHO Director-General Dr Margaret Chan. “Every country has the responsibility to protect its population from tobacco-related illness, disability and death.”

Tobacco is the leading global cause of preventable death and kills 6 million people every year. It can cause cancer, cardiovascular disease, diabetes and chronic respiratory diseases. If current trends continue, the number of deaths attributed to tobacco smoking is projected to rise to 8 million a year by 2030. In defiance of the deleterious effects of smoking, tobacco companies are spending tens of billions of dollars each year on advertising, promotion and sponsorship.


“We know that only complete bans on tobacco advertising, promotion and sponsorship are effective," stresses Dr Douglas Bettcher, the Director of WHO’s Prevention of Noncommunicable Diseases department. “Countries that introduced complete bans together with other tobacco control measures have been able to cut tobacco use significantly within only a few years.”
Other measures to cut tobacco use

Other key findings of the report include:
  • Effective health warning labels on tobacco packaging continue to be established by more countries. In the past five years, a total of 20 countries with 657 million people put strong warning label requirements in place, with 11 countries (with 265 million people) doing so since 2010.
  • More than half a billion people in nine countries have gained access to appropriate cessation services in the past five years. However, there has been little progress since 2010, as only four additional countries with a combined population of 85 million were newly provided access to cost-covered services including a toll-free national quit line.
  • Creation of smoke-free public places and workplaces continues to be the most commonly established measure at the highest level of achievement. Thirty-two countries passed complete smoking bans covering all workplaces, public places and public transportation between 2007 and 2012, protecting nearly 900 million additional people. Since 2010, 12 countries and one territory, with 350 million people, have passed strong smoke-free laws at the national level.
In 2008, WHO identified six evidence-based tobacco control measures that are the most effective in reducing tobacco use. Known as “MPOWER”, these measures correspond to one or more of the demand reduction provisions included in the WHO Framework Convention on Tobacco Control (WHO FCTC): Monitor tobacco use and prevention policies, Protect people from tobacco smoke, Offer help to quit tobacco use, Warn people about the dangers of tobacco, Enforce bans on tobacco advertising, promotion and sponsorship, and Raise taxes on tobacco.

This year’s report is the fourth in the series of WHO reports on the status of the MPOWER measures. These measures provide countries with practical assistance to reduce demand for tobacco in line with the WHO FCTC, thereby reducing related illness, disability and death.

The WHO FCTC entered into force in 2005 and, with 177 Parties today, is a powerful tool to combat the deadly tobacco epidemic. 

Thursday, July 11, 2013

Researchers estimate over 2 million deaths annually from air pollution

photo: freedigitalphotos.net
Over two million deaths occur each year as a direct result of human-caused air pollution, a new study has found.

In addition, while it has been suggested that a changing climate can exacerbate the effects of air pollution and increase death rates, the study shows that this has a minimal effect and only accounts for a small proportion of current deaths related to air pollution.

The study, which has been published today, 12 July, in IOP Publishing's journal Environmental Research Letters, estimates that around 470,000 people die each year because of human-caused increases in ozone.

It also estimates that around 2.1 million deaths are caused each year by human-caused increases in fine particulate matter (PM2.5) – tiny particles suspended in the air that can penetrate deep into the lungs, causing cancer and other respiratory disease.

Co-author of the study, Jason West, from the University of North Carolina, said: "Our estimates make outdoor air pollution among the most important environmental risk factors for health. Many of these deaths are estimated to occur in East Asia and South Asia, where population is high and air pollution is severe."

According to the study, the number of these deaths that can be attributed to changes in the climate since the industrial era is, however, relatively small. It estimates that a changing climate results in 1500 deaths due to ozone and 2200 deaths related to PM2.5 each year.

Climate change affects air pollution in many ways, possibly leading to local increases or decreases in air pollution. For instance, temperature and humidity can change the reaction rates that determine the formation or lifetime of a pollutant, and rainfall can affect how long pollutants remain and accumulate in the atmosphere.

Higher temperatures can also increase the emissions of organic compounds from trees, which can then react in the atmosphere to form ozone and particulate matter.

"Very few studies have attempted to estimate the effects of past climate change on air quality and health. We found that the effects of past climate change are likely to be a very small component of the overall effect of air pollution," continued West.

In their study, the researchers used an ensemble of climate models to simulate the concentrations of ozone and PM2.5 in the years 2000 and 1850. A total of 14 models simulated levels of ozone and six models simulated levels of PM2.5.

Previous epidemiological studies were then used to assess how the specific concentrations of air pollution from the climate models related to current global mortality rates.
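
Although the paper's own code is not reproduced here, studies of this kind generally combine modeled concentration changes with an epidemiological concentration-response function. The Python sketch below uses the widely used log-linear form, attributable deaths = population x baseline mortality rate x (1 - exp(-beta x delta concentration)), with placeholder numbers for the population, baseline rate and beta coefficient.

    import math

    def attributable_deaths(pop, baseline_rate, beta, delta_conc):
        """Log-linear health impact function commonly used in air-quality studies."""
        attributable_fraction = 1.0 - math.exp(-beta * delta_conc)
        return pop * baseline_rate * attributable_fraction

    # Placeholder inputs: 1 million adults, baseline cardiopulmonary mortality of
    # 0.005 per person-year, beta of 0.006 per ug/m3, PM2.5 increase of 10 ug/m3.
    deaths = attributable_deaths(pop=1_000_000, baseline_rate=0.005,
                                 beta=0.006, delta_conc=10.0)
    print(f"Estimated attributable deaths per year: {deaths:.0f}")

In the actual study, the concentration change is the modeled difference between 1850 and 2000 levels of ozone or PM2.5, evaluated across the globe and combined with local population and baseline mortality rather than the single set of placeholder values used here.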

The researchers' results were comparable to previous studies that have analysed air pollution and mortality; however, there was some variation depending on which climate model was used.

"We have also found that there is significant uncertainty based on the spread among different atmospheric models. This would caution against using a single model in the future, as some studies have done," continued West.
---------------------------------------------------------------------------------
Combat pollution at home with a high efficiency home air purifier designed for particulate matter and airborne chemicals and odors. Connect with us at allerair.com to learn more.


Wednesday, July 10, 2013

Wildfires may contribute more to global warming than previously predicted

Wildfires produce a witch’s brew of carbon-containing particles, as anyone downwind of a forest fire can attest. Fine carbonaceous particles (soot) rising high into the air significantly degrade air quality, damaging human and wildlife health and interacting with sunlight to affect climate. But measurements taken during the 2011 Las Conchas fire near Los Alamos National Laboratory show that the carbon-containing particles actually emitted by fires are very different from those used in current computer models, a potential source of inaccuracy in current climate-modeling results.

“We’ve found that substances resembling tar balls dominate, and even the soot is coated by organics that focus sunlight,” said senior laboratory scientist Manvendra Dubey. “Both components can potentially increase climate warming by increased light absorption.”

“The fact that we are experiencing more fires and that climate change may increase fire frequency underscores the need to include these specialized particles in the computer models, and our results show how this can be done,” Dubey said.

Aerosol samples revealed “tar balls” in the skies
Air Purifiers for Wildfire or Tobacco Smoke
Conventional wisdom holds that fire-driven particles contain black carbon, or soot, which absorbs sunlight to warm the climate, and organic carbon, or smoke, which reflects sunlight to cool the climate. But in a paper just published in Nature Communications, the scientists from Los Alamos and Michigan Technological University analyzed the morphology and composition of the specific aerosols emitted by the Las Conchas fire.

Las Conchas, which started June 26, 2011, was the largest fire in New Mexico history at the time, burning 245 square miles. Immediately after Los Alamos National Laboratory reopened to scientists and staff, the team set up an extensive aerosol sampling system to monitor the smoke from the smoldering fire for more than 10 days.

High-tech tools enable analysis of smoke samples
Dubey, along with postdoctoral fellow Allison Aiken and post-bachelor’s student Kyle Gorkowski, coordinated with Michigan Tech professor Claudio Mazzoleni (a former Los Alamos Director’s fellow) and graduate student Swarup China to perform this study.

The team used field-emission scanning electron microscopy and energy-dispersive X-ray spectroscopy to analyze the aerosol samples and determined that spherical carbonaceous particles called tar balls were 10 times more abundant than soot.
Furthermore, the bare soot particles, which are porous fractal composites made of tiny carbon spheres, are modified significantly by the organics emitted by the fire. About 96 percent of the soot from the fire is coated by other organic substances, with 50 percent being totally coated. The complexity of the soot can be categorized into four morphological types: “embedded,” “partly coated,” “with inclusions” and “bare.”

What was missing from the modeling and why it matters
Why is this important for climate? Dubey noted: “Most climate assessment models treat fire emissions as a mixture of pure soot and organic carbon aerosols that offset the respective warming and cooling effects of one another on climate. However, the Las Conchas results show that tar balls exceed soot by a factor of 10 and that the soot gets coated by organics in fire emissions, each resulting in more of a warming effect than is currently assumed.”

“Tar balls can absorb sunlight at shorter blue and ultraviolet wavelengths (also called brown carbon due to the color) and can cause substantial warming,” he said. “Furthermore, organic coatings on soot act like lenses that focus sunlight, amplifying the absorption and warming by soot by a factor of 2 or more. This has a huge impact on how they should be treated in computer models.”

This experimental research was funded by the U.S. Department of Energy’s Office of Science.

Tuesday, July 09, 2013

Treating oil spills with chemical dispersants: Is the cure worse than the ailment?

Photo: think4photop/freedigitalphotos.com
Treating oil spills with chemical dispersants may be better for the coastline, but worse for fish.

A new study, to be presented at the Society for Experimental Biology meeting, suggests that although chemical dispersants may reduce problems for surface animals, the increased contamination of the water column reduces the ability of fish and other organisms to cope with subsequent environmental challenges.

A team of researchers headed by Prof Guy Claireaux at the University of Brest in France looked for the first time at how chemically dispersed oil affects the ability of European seabass to cope with subsequent environmental challenges.

The researchers designed swimming challenge tests in an 'aquatic treadmill', similar to the tests used in human medicine for health diagnosis. They analysed the seabass' maximum swimming performance, hypoxia tolerance and thermal sensitivity as markers of their capacity to cope with natural challenges. They then exposed the fish to untreated oil, chemically dispersed oil or dispersant alone for 48 hours. Over the following six weeks they measured individual growth and then repeated the swimming challenge tests.

Oil exposure impaired the fishes' ability to cope with increased temperature and reduced oxygen availability and to swim against a current, and these effects were further aggravated by the addition of the dispersant. The dispersant alone had no effect on the fishes' performance in the challenge tests.

Prof Claireaux said "An oil slick reaching the shore is not good for tourism and organisms living on the coast line. Treating the slick at sea will avoid or reduce these problems affecting surface animals (birds and marine mammals). On the other hand, oil dispersion will increase the contamination of the water column and the organisms that occupy it."

Though applying dispersants at sea may reduce the environmental and economic impacts of an oil spill reaching the shoreline, these results show that the choice of response deployed to deal with a spill involves a trade-off between the effects at the surface and in the water column.

---------------------------------------------------------------
For more stories on health, pollution, chemical exposure and improving your indoor air quality visit www.allerair.com or call to speak to an air quality expert about improving the air in your home 1-888-852-8247.

Monday, July 08, 2013

Harvard researchers warn of legacy mercury in the environment

Photo: Dan/freedigitalphotos.net
Environmental researchers at Harvard University have published evidence that significant reductions in mercury emissions will be necessary just to stabilize current levels of the toxic element in the environment. So much mercury from past pollution, going back thousands of years, persists in surface reservoirs (soil, air, and water) that it will continue to cycle into the ocean and accumulate in fish for decades to centuries, they report.

"It's easier said than done, but we're advocating for aggressive reductions, and sooner rather than later," says Helen Amos, a Ph.D. candidate in Earth and Planetary Sciences at the Harvard Graduate School of Arts and Sciences and lead author of the study, published in the journal Global Biogeochemical Cycles.

Amos is a member of the Atmospheric Chemistry Modeling Group at the Harvard School of Engineering and Applied Sciences (SEAS), where researchers have been collecting historical data on mercury emissions as far back as 2000 BC and building new environmental models of mercury cycling that capture the interactions between the atmosphere, oceans, and land.

Their model reveals that most of the mercury emitted to the environment ends up in the ocean within a few decades and remains there for centuries to millennia. These days, emissions are mainly from coal-fired power plants and artisanal gold mining. Thrown into the air, rained down onto lakes, absorbed into the soil, or carried by rivers, mercury eventually finds its way to the sea. In aquatic ecosystems, microbes convert it to methylmercury, the organic compound that accumulates in fish, finds its way to our dinner plates, and has been associated with neurological and cardiovascular damage.
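
To illustrate the kind of bookkeeping such a multi-reservoir model does, here is a minimal three-box sketch in Python (atmosphere, surface soil/water, ocean) with first-order exchange between the boxes. Every rate constant is a made-up placeholder chosen only to reproduce the qualitative timescales described in the article; none of these are coefficients from the Harvard model.

    # Minimal three-box sketch of mercury cycling with placeholder rates
    # (per year); NOT the coefficients used in the Harvard model.
    k_dep, k_reemit, k_to_ocean, k_burial = 0.5, 0.15, 0.05, 0.002

    atm, surface, ocean = 1.0, 0.0, 0.0   # start with a unit pulse of emissions in the air

    for year in range(1, 501):            # simple one-year explicit Euler steps
        d_atm     = -k_dep * atm + k_reemit * surface
        d_surface =  k_dep * atm - (k_reemit + k_to_ocean) * surface
        d_ocean   =  k_to_ocean * surface - k_burial * ocean
        atm, surface, ocean = atm + d_atm, surface + d_surface, ocean + d_ocean
        if year in (10, 50, 100, 500):
            print(f"year {year:3d}: air {atm:.2f}  soil/water {surface:.2f}  ocean {ocean:.2f}")

    # With these placeholder rates the pulse leaves the atmosphere within years,
    # accumulates in the ocean over a few decades, and then drains out only over
    # centuries -- the qualitative pattern the study describes.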

"Today, more than half of mercury emissions come from Asia, but historically the U.S. and Europe were major emitters," says second senior author Daniel J. Jacob, Vasco McCoy Family Professor of Atmospheric Chemistry and Environmental Engineering at Harvard SEAS and Professor of Earth and Planetary Sciences. "We find that half of mercury pollution in the present surface ocean comes from emissions prior to 1950, and as a result the contribution from the U.S. and Europe is comparable to that from Asia."

Friday, July 05, 2013

Scientists: Early exposure to traffic pollution may be a cause of childhood #asthma

Photo: freedigitalphotos.net
A research team led by scientists at the University of California, San Francisco has found that exposure in infancy to nitrogen dioxide (NO2), a component of motor vehicle air pollution, is strongly linked with later development of childhood asthma among African Americans and Latinos.

The researchers said their findings indicate that air pollution might, in fact, be a cause of the disease, and they called for a tightening of U.S. government standards for annual exposure to NO2.

The study was reported online in the American Journal of Respiratory and Critical Care Medicine ahead of print publication.

In the study, the largest to date of air pollution exposure and asthma risk in minority children in the United States, the team found that for every five parts per billion increase in NO2 exposure during the first year of life, there was a 17 percent increase in the risk of developing asthma later in life.
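
To put that effect size in concrete terms, the short Python sketch below applies the reported 17-percent-per-5-ppb relative increase to a few hypothetical exposure gaps. The assumption that the increase compounds multiplicatively across successive 5 ppb increments is ours, made only for illustration; the study itself reports the per-5-ppb figure.

    # Reported effect size: ~17% higher asthma risk per 5 ppb of NO2 exposure
    # in the first year of life.  Compounding across increments is an
    # illustrative assumption, not something stated in the study.
    per_5ppb = 1.17

    for extra_ppb in (5, 10, 20):
        relative_risk = per_5ppb ** (extra_ppb / 5)
        print(f"+{extra_ppb:2d} ppb NO2 in infancy -> about "
              f"{(relative_risk - 1) * 100:.0f}% higher asthma risk")
    # +5 ppb -> ~17%, +10 ppb -> ~37%, +20 ppb -> ~87% under this assumption.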

The study involved 3,343 Latino and 977 African American participants.

"Many previous studies have shown an obvious link between traffic-related pollution and childhood asthma, but this has never been thoroughly looked at before in an all-minority population," said lead author Katherine K. Nishimura, MPH, a graduate student in the laboratory of senior author Esteban G. Burchard, MD, MPH, a UCSF professor of bioengineering and therapeutic sciences and medicine and director of the UCSF Center for Genes, Environment & Health.

Minorities tend to live in areas of higher air pollution and have a higher risk of developing asthma, the researchers said.

What made the current study different from previous research, said Nishimura, was that the scientists looked retrospectively at the study participants' exposure to air pollution in early childhood, before they developed asthma. Children who developed asthma before this exposure period were excluded.

"Any participant with asthma in this study was exposed to air pollution in infancy, before they developed the disease, which is a step in the right direction in inferring causality," said Nishimura.

"This work adds to the growing body of evidence that traffic-related pollutants may be causally related to childhood asthma," said Burchard.

Study co-author John R. Balmes, MD, of UC San Francisco and UC Berkeley, pointed out that a previous study of children living in southern California showed that living and attending school close to major roadways was associated with an increased risk of new-onset asthma. "Together with our findings, this makes for strong evidence that reducing children's exposure to traffic emissions can prevent some cases of asthma," said Balmes.

One immediate implication of the study, said Burchard, is that the national standard for NO2 set by the Environmental Protection Agency (EPA) is "too lax by far." He noted that the current EPA annual standard is 53 parts per billion (ppb), while the study subjects were exposed, on average, to 19 ppb during the first year of life.

"Children growing up in southern California have been shown to have reduced growth of lung function when annual NO2 levels exceed the current national annual standard," added Balmes.

The participants, who were 8 to 21 years old and had no other lung diseases or chronic illnesses, were recruited from study centers in Chicago, New York City, Houston, the San Francisco Bay Area, and Puerto Rico. To adjust for instances when study participants moved residences, air pollution exposure for all subjects was assessed by using the residential histories from birth through time of recruitment. The researchers based their air pollution exposure estimates on EPA annual measurements.

"The geographic diversity of the study population strengthens the results, because the effects we see are consistent across a wide range of urban environments and conditions," said Nishimura. While the study was not designed to investigate how air pollution might cause childhood asthma, said Nishimura, the investigators are looking at two possible causes.

The first is that NO2 can interact with a number of other pollutants to create reactive oxygen species — chemically reactive molecules containing oxygen — which can, in turn, damage developing lungs. Immune systems that develop under such conditions, she said, could be "trained" to respond to pollutants as triggers that could later induce asthma.

Another possible explanation, said Nishimura, is that pollutants can potentially cause a genetic predisposition to asthma by altering methylation patterns in DNA. Methylation is a chemical change that alters gene expression without affecting the underlying structure of the DNA itself.

"It has been shown that changes in methylation, which can be affected by pollution, tobacco smoke and even stress, can be inherited across multiple generations," said Burchard. "Our group is currently investigating methylation as the possible outcome of exposures to a number of pollutants."

Burchard cautioned that air pollution is "not the entire story," noting that despite its low air pollution levels, Puerto Rico has the highest asthma prevalence and morbidity in the United States. "This is intriguing, and leaves us more work to do," he said.