Archive for the ‘Hot Articles’ Category

Getting to the chiral centre of aquatic pollution

The chemical behaviour of pollutants in the aquatic environment requires careful monitoring in order for us to understand the toxicity different compounds will exert on natural ecosystems. Researchers from Israel describe a modelling approach that may offer an exciting new way to address one specific aspect of the chemical and environmental behaviour of pollutants.

The term ‘chiral’ describes molecules that share an identical composition but whose components are arranged in non-superimposable mirror-image forms, typically centred around an asymmetric carbon atom. The two ‘mirror images’ of a chiral molecule are termed enantiomers. The study of enantiomer-specific properties and how these vary between different molecules is of major interest within the broad fields of inorganic, organic, physical, biological and environmental chemistry.

Many anthropogenic chemicals of environmental concern, such as pesticides, are chiral molecules. These compounds can potentially be a major threat to aquatic ecosystems. For example, the molecular structures and environmental implications of many chiral pesticides have been discussed in a review by the USEPA. It is important, therefore, to have means of accurately tracing the alteration of these compounds in the environment, particularly with reference to their enantiomer-specific environmental toxicity.

Researchers have previously proposed an enantiomeric enrichment factor (EEF) to describe the enrichment–conversion relationship of chiral compounds. It is derived using the Rayleigh equation, which relates changes in isotopic composition to the contaminant concentration during the degradation process. The EEF can therefore be used as an identifying tool for a specific enzymatic reaction of different molecules.
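For readers unfamiliar with the Rayleigh formalism, the analogy can be sketched as follows; this is a simplified, generic form based on the classical isotope-fractionation equation, and the exact formulation used by the authors may differ. Here R is the isotope ratio, ER the enantiomer ratio, C_t/C_0 the fraction of contaminant remaining, ε the isotopic enrichment factor and the EEF its enantiomeric analogue:

```latex
% Classical Rayleigh relation for isotope fractionation (f = C_t / C_0 is the
% fraction of contaminant remaining and \varepsilon the enrichment factor):
\[
  \ln\!\left(\frac{R_t}{R_0}\right) \;=\; \varepsilon \,\ln\!\left(\frac{C_t}{C_0}\right)
\]
% Enantiomeric analogue (a sketch of the analogy only): the enantiomer ratio
% ER takes the place of the isotope ratio R, and the slope becomes the EEF:
\[
  \ln\!\left(\frac{ER_t}{ER_0}\right) \;=\; \mathrm{EEF}\,\ln\!\left(\frac{C_t}{C_0}\right)
\]
```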

Developing models to describe enantio-selective biodegradation can reduce the need for laborious practical work. To achieve this, there is a need to improve our understanding of the mechanisms of biodegradation, to classify chemicals according to their relative biodegradability, and to develop reliable biodegradation estimation methods for new chemicals. Quantitative structure–activity relationship (QSAR) models are typically derived from correlations between experimental data and physicochemical properties (e.g. lipophilicity, steric and electronic parameters) and can be used to estimate the bioavailability, toxicity and biological activity of compounds as dependent variables.

This study, conducted by researchers from The Institute of Chemistry at The Hebrew University of Jerusalem and The Geological Survey of Israel, develops a QSAR model to describe the dependence of the enantiomeric enrichment factor on molecular structure and uses this method to evaluate EEF values for unstudied chiral compounds. The authors used the multiple linear regression (MLR) method to build the QSAR, based on the Linear Hansch model. The enantioselective hydrolysis of 16 derivatives of 2-(phenoxy)propionate (PPMs), some of which are common herbicides, using three different lipase enzymes was analysed.
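As a rough illustration of what fitting such a model involves, the sketch below builds a Hansch-type multiple linear regression in Python using invented substituent descriptors (lipophilicity π, electronic constant σ, steric parameter Es) and invented EEF values; it is not the authors' data or code, only a minimal example of the technique.

```python
import numpy as np

# Hypothetical training set: one row per phenoxypropionate derivative.
# Columns are classical Hansch substituent descriptors:
# lipophilicity (pi), electronic constant (sigma), steric parameter (Es).
# All descriptor and EEF values below are invented for illustration.
X = np.array([
    [0.56, -0.17, -1.24],
    [0.71,  0.23, -0.97],
    [0.00,  0.00,  0.00],
    [-0.67, -0.27, -0.55],
    [0.86,  0.54, -0.46],
])
eef = np.array([1.8, 2.4, 1.2, 0.9, 2.9])   # hypothetical EEF values

# Linear Hansch model: EEF = a*pi + b*sigma + c*Es + const,
# fitted by ordinary least squares (multiple linear regression).
A = np.hstack([X, np.ones((X.shape[0], 1))])
coeffs, *_ = np.linalg.lstsq(A, eef, rcond=None)
a, b, c, const = coeffs
print(f"EEF = {a:.2f}*pi + {b:.2f}*sigma + {c:.2f}*Es + {const:.2f}")

# Predict the EEF of an unstudied derivative from its descriptors alone.
new = np.array([0.14, 0.06, -0.51, 1.0])    # [pi, sigma, Es, intercept]
print("Predicted EEF:", new @ coeffs)
```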

The study demonstrates the predictive power of QSAR and Hansch modelling for analysing the structural dependence of the EEF, with the model shown to effectively correlate biological activity with key physicochemical properties. Importantly, in some cases the QSAR of EEF values proved a much better predictive tool than QSARs of the underlying individual kinetic parameters, clearly indicating that this method could mark the way forward for research in this field.

The authors note that the use of the QSAR modelling technique used in this study may serve as a powerful tracer tool in environmental studies, assisting in source tracking the enantio-selective conversion of both known and unstudied chiral compounds in aquatic ecosystems.



To read more about this study, download a copy for free* by clicking the link below.
Quantitative structure–activity relationship correlation between molecular structure and the Rayleigh enantiomeric enrichment factor
S. Jammer, D. Rizkov, F. Gelman and O. Lev
Environ. Sci.: Processes Impacts, 2015, Advance Article
DOI: 10.1039/c5em00084j

—————-

About the webwriter

Ian Keyte is a Doctoral Researcher at the University of Birmingham. His research focuses on the sources, behavior and fate of polycyclic aromatic hydrocarbons (PAHs) in the atmosphere.

—————-

* Access is free until 30/08/2015 through a registered RSC account.


Detecting endocrine disrupting chemicals in wastewater

Some steroidal estrogens and certain polyphenols may be a threat to aquatic ecosystems. These compounds, usually known as endocrine disrupting chemicals (EDCs), pollute our water and pose a potential ecological risk. They may be artificial waste products from chemical industries, but some are natural hormones excreted by cattle, poultry, or even humans. Some of them (like estriol or 4-nonylphenol) have been linked with very serious problems such as decreased fertility and feminisation in fish.

Thus, having a validated method to estimate the concentration of EDCs in water comes in handy, especially for the detection of these contaminants in the effluent of wastewater treatment plants. Treatment plants act as the last barrier before EDCs are permanently released into the environment.

A group of researchers from the Chinese Academy of Sciences has developed a new analytical method to determine the concentration of up to 12 different EDC contaminants. They have optimised a simple pretreatment that avoids enzymatic digestion of the samples, allowing, for the first time, detection of the conjugated forms of the different EDCs (sulphate and glucuronide). As these EDC conjugates predominate in urine and faeces, a method to analyse them effectively helps to identify the origin of these pollutants.

This particular analytical method uses state-of-the-art ultra-high performance liquid chromatography (UPLC) coupled with tandem mass spectrometry (MS/MS) detection to push limits of quantification down to 0.04 nanograms per litre. The experiments determined that some species, like 17β-estradiol and all the glucuronide conjugates, are completely removed from wastewater after treatment.

Others, on the other hand, are only partially eliminated (removals vary between 34 and 95% of the amount of EDC in the influent). The risk posed by contaminants remaining in the effluent, which is ultimately released into the environment, still has to be studied in depth.
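For readers unfamiliar with how removal efficiencies of this kind are reported, the snippet below shows the usual calculation from influent and effluent concentrations; the numbers are hypothetical and chosen only to reproduce the ends of the 34-95% range quoted above.

```python
# Removal efficiency of an EDC across a treatment plant, computed from
# influent and effluent concentrations (values here are hypothetical,
# chosen only to illustrate the 34-95% range mentioned above).
def removal_efficiency(c_influent_ng_l: float, c_effluent_ng_l: float) -> float:
    """Percentage of the incoming load removed during treatment."""
    return 100.0 * (c_influent_ng_l - c_effluent_ng_l) / c_influent_ng_l

print(removal_efficiency(50.0, 33.0))   # ~34% removed
print(removal_efficiency(50.0, 2.5))    # 95% removed
```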


Click on the link below to read the full article for free*:
Simultaneous detection of endocrine disrupting chemicals including conjugates in municipal wastewater and sludge with enhanced sample pretreatment and UPLC-MS/MS
Bing Zhu, Weiwei Ben, Xiangjuan Yuan, Yu Zhang, Min Yang and Zhimin Qiang
Environ. Sci.: Processes Impacts, 2015, Advance Article
DOI: 10.1039/C5EM00139K


—————-

About the webwriter

Fernando Gomollon-Bel is a PhD Student at the ISQCH (CSIC-University of Zaragoza). His research focuses on asymmetric organic synthesis using sugars as chiral-pool starting materials towards the production of fungal transglycosidase inhibitors.

—————-

* Access is free until 24/08/2015 through a registered RSC account.


Developing new models to understand social problems in Bangladesh

Bangladesh is, sadly, at the centre of a wide variety of meteorological disasters (cyclones, floods, saline water intrusion…). Moreover, it is one of the countries most vulnerable to any sea-level rise that global climate change may cause. In addition, Bangladesh's population is growing quickly, and it is now one of the most densely populated nations on Earth. All these factors affect, of course, the country’s agricultural system and correlate directly with its ability to feed the population. A recently published paper proposes a new model framework to study both macro- and micro-scale environmental processes and link these to the prosperity of Bangladeshi households.

Thanks to the efforts of both the government and the people of Bangladesh, the poverty level has decreased from 70 to 43 percent in less than twenty years. However, poverty levels remain higher in coastal regions, such as the area of this study. Could climate change aggravate this situation? To estimate this, the authors developed a new model that takes a broad diversity of factors into consideration.

To begin with, they used mathematical models created by the Food and Agriculture Organization (FAO) to estimate crop productivity. These models were enhanced with climate, soil salinity, cropping pattern and market price data from diverse databases. The researchers also drew on demographic information from different sources.

Additionally, they added what is one of the major innovations of this particular model: financial data for individuals and families. The researchers were thus able to evaluate and analyse the complex relationship between nature, agriculture and the day-to-day life of farmers.
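To give a flavour of what such an integrated framework links together, here is a deliberately oversimplified sketch in which a toy crop-yield response to rainfall and soil salinity feeds into a household budget; every functional form and number is invented for illustration and bears no relation to the calibrated model in the paper.

```python
# Highly simplified sketch of linking a crop-production estimate to household
# finances, in the spirit of the integrated framework described above.
# Functional forms and numbers are invented for illustration only.

def crop_yield_t_per_ha(rainfall_mm: float, soil_salinity_ds_m: float) -> float:
    """Toy yield response: more rain helps, soil salinity penalises the crop."""
    potential_yield = 4.0                              # t/ha, hypothetical
    water_factor = min(rainfall_mm / 1200.0, 1.0)
    salinity_penalty = max(1.0 - 0.08 * soil_salinity_ds_m, 0.0)
    return potential_yield * water_factor * salinity_penalty

def household_income(area_ha, rainfall_mm, salinity, price_per_t,
                     input_costs, debt_service, off_farm_income):
    """Net household income: crop revenue minus costs and debt repayments,
    plus any income earned off the farm."""
    revenue = crop_yield_t_per_ha(rainfall_mm, salinity) * area_ha * price_per_t
    return revenue - input_costs - debt_service + off_farm_income

print(household_income(area_ha=0.8, rainfall_mm=1000, salinity=6.0,
                       price_per_t=250, input_costs=220,
                       debt_service=150, off_farm_income=300))
```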

Some conclusions were quite surprising: for example, even if farmers increased their productivity substantially, they would see only a very small increase in their income. This is mainly because widespread access to credit has led families to take on debt they cannot afford, even when working longer hours. The study also points out that crop diversification may not help overcome climate change. Nonetheless, these are the first results of a very new prediction model, and they open opportunities for further analysis, research and refinement of the models for a better understanding of the situation in Bangladesh.



To read more about this study, download a copy for free* by clicking the link below.

Agricultural livelihoods in coastal Bangladesh under climate and environmental change – a model framework
Attila N. Lázár, Derek Clarke, Helen Adams, Abdur Razzaque Akanda, Sylvia Szabo, Robert J. Nicholls, Zoe Matthews, Dilruba Begum, Abul Fazal M. Saleh, Md. Anwarul Abedin, Andres Payo, Peter Kim Streatfield, Craig Hutton, M. Shahjahan Mondal and Abu Zofar Md. Moslehuddin
Environ. Sci.: Processes Impacts, 2015,17, 1018-1031
DOI: 10.1039/C4EM00600C

—————-

About the webwriter

Fernando Gomollon-Bel is a PhD Student at the ISQCH (CSIC-University of Zaragoza). His research focuses on asymmetric organic synthesis using sugars as chiral-pool starting materials towards the production of fungical transglycosidase inhibitors.

—————-

* Access is free until 16/08/2015 through a registered RSC account.


Slick research: reviewing available technologies for tackling oil spills

Oil spills can cause widespread environmental damage. The production, refining, storage and distribution of oil are all potential sources of pollution of marine and terrestrial ecosystems. Recent high-profile accidents on offshore oil platforms such as the Deepwater Horizon incident in 2010 give the public clear insight into the effect large-scale oil spills can have. However, as dramatic as these pollution events are, these incidents represent less than 10% of total petroleum hydrocarbon discharges to the environment. The vast majority of pollution results from relatively low-level routine releases on a more local scale. The challenge is therefore to ensure safety of oil production as a whole rather than simply the prevention of large-scale incidents.

As existing oil reserves become increasingly depleted, exploration and drilling are spreading into deeper waters and more remote, fragile environments, such as the Arctic. Since oil production from newly explored or depleted reservoirs is more difficult, the risk of accidental oil spills is likely to increase in the future. Indeed, it is estimated that, for an average platform, each 30 metres of added depth increases the probability of an accident by 8.5%. There is clearly a need for coherent strategies to help prevent and/or clean up accidental oil spills, and to this end considerable worldwide effort has gone into strategies for minimising accidental spills and into the design of new remedial technologies.
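Taken at face value and applied multiplicatively (an assumption on our part), that 8.5%-per-30 m figure compounds quickly with depth, as this toy calculation with an arbitrary baseline probability shows:

```python
# Illustration of how the quoted scaling (+8.5% accident probability per extra
# 30 m of water depth) compounds, assuming it applies multiplicatively and
# taking an arbitrary baseline probability for a shallow-water platform.
p0 = 0.010                                   # hypothetical baseline probability
for added_depth_m in (0, 300, 900, 1500):
    steps = added_depth_m / 30
    print(f"{added_depth_m:>5} m deeper: p = {p0 * 1.085 ** steps:.4f}")
```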

Image: The Deepwater Horizon oil spill in the Gulf of Mexico caused a major ecological hazard. US Coast Guard photo by Petty Officer First Class John Masson (courtesy of Chemistry World).

This critical review is the result of collaboration between the Institute of Ecology and Genetics of Microorganisms in the Russian Academy of Sciences, Perm State University in Russia, The Scottish Environmental Technology Network at the University of Strathclyde in Glasgow, UK, The University of Louisville in Kentucky, USA and the OECD Directorate for Science, Technology and Industry in Paris, France. The paper provides a summary of new knowledge as well as research and technology gaps essential for developing appropriate decision-making tools in actual oil-spill scenarios. The review will therefore be of interest to a wide range of stakeholders, including the oil industry, the scientific community and the public.

The review provides a clear comparison between the behaviour and environmental effects of marine and terrestrial oil spills (e.g. the nature of the spread of oil and the size of the affected area). The differences in appropriate response strategy for marine and terrestrial spills are clearly defined. The importance of ‘window-of-opportunity’ technology in combating oil spills is highlighted, i.e. the integration of different types of scientific information to allow rapid decision making on the best available strategy to achieve optimal environmental and cost benefits. The authors note that effective response to oil spills will require a) adequate data on oil weathering over time; b) real-time remote sensing; and c) analysis of response strategy performance. The review discusses the technological advances and challenges involved in the multi-media modelling approaches used to generate and analyse this information.

An in-depth review of the different available technologies is provided, and the authors use specific case studies to illustrate their effectiveness. For the marine environment this includes a discussion of chemical treatments (e.g. dispersants, emulsion breakers), in-situ burning, mechanical recovery (e.g. booms, skimmers, adsorbents) and bioremediation. In relation to the terrestrial environment, the review discusses methods to prevent oil spills both on land and into ground/surface waters, as well as more advanced clean-up technologies such as thermal desorption, soil vapour extraction, pump-and-treat technologies, solidification/stabilisation and bioremediation. The review also discusses the details, limitations and environmental effects of on-land containment and control technologies such as diversion/containment measures, trenches, and sorbent or viscous liquid barriers.

Image: Technologies to prevent, control or tackle accidental oil spills.

The authors emphasise that, because every spill is unique, there is no ‘one size fits all’ technology that will be suitable. The environmental impact and sustainability of remedial technologies vary widely, so a suite of remedial technologies is required, and this should be part of a ‘risk-based remedial design’ strategy. The review highlights bioremediation methods as sustainable, cost-effective clean-up solutions, noting that greater market penetration of these techniques depends on the harmonisation of environmental legislation and the application of innovative laboratory techniques (e.g. ecogenomics) to improve the predictability of bioremediation. However, it is also stressed that prevention is far less expensive than cure, and oil spill prevention should continue to be the focus for the industry.

This paper is a comprehensive and timely review of oil spill prevention and remediation methods, providing an invaluable summary of available remediation technologies and raising awareness that a hierarchy of remedial technologies exists. Furthermore, it demonstrates that there is a need for further development of both “soft” technologies, such as contingency planning, and “hard” engineering solutions for spill prevention. It is stressed that ultimately an integrated approach to prevention and remediation is needed, and that greater international cooperation in contingency planning and spill response would probably lead to higher safety standards and fewer accidents.



To access the full article and read more about the technologies and strategies of tackling oil spills, download a copy for free* by clicking the link below.

Oil spill problems and sustainable response strategies through new technologies
Irena B. Ivshina, Maria S. Kuyukina, Anastasiya V. Krivoruchko, Andrey A. Elkin, Sergey O. Makarov, Colin J. Cunningham, Tatyana A. Peshkur, Ronald M. Atlas and James C. Philp
Environ. Sci.: Processes Impacts, 2015, Advance Article
DOI: 10.1039/C5EM00070J

—————-

About the webwriter

Ian Keyte is a Doctoral Researcher at the University of Birmingham. His research focuses on the sources, behavior and fate of polycyclic aromatic hydrocarbons (PAHs) in the atmosphere.

—————-

* Access is free until 18/08/2015 through a registered RSC account.


Seeing an unseen carbon sink

Wetlands may have a small global footprint, but they provide several important services, including improving water quality and providing flood protection and erosion control. In recent years, they have become increasingly recognized for their strong ability to sequester carbon. Despite covering only six to nine percent of the Earth’s land surface, wetlands are estimated to store upwards of 35% of terrestrial carbon.

Whether an ecosystem (such as a wetland) is a carbon source or a carbon sink is determined by its net ecosystem carbon exchange (NEE). Accurate estimates of NEE in turn require accurate estimates of Gross Primary Production (GPP), a measure of how much carbon is fixed by plants during photosynthesis, the main pathway for transport of carbon from the atmosphere to the land. GPP can be measured directly in the field. However, these direct measurements are both financially costly and time-consuming.
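As a rough guide to how these quantities fit together, one common bookkeeping convention (sign conventions vary between studies, and the paper may use another) is:

```latex
% NEE written in one common convention: negative NEE means the ecosystem is a
% net sink, i.e. photosynthetic uptake (GPP) exceeds ecosystem respiration.
\[
  \mathrm{NEE} \;=\; R_{\mathrm{eco}} \;-\; \mathrm{GPP}
\]
% GPP   : gross primary production (carbon fixed by photosynthesis)
% R_eco : total ecosystem respiration (carbon returned to the atmosphere)
```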

A promising approach to estimate GPP that overcomes these hurdles uses empirical modeling to correlate GPP with related biogeophysical parameters derived from remote sensing data. These biogeophysical data can be much easier to obtain than field measurements of GPP, especially if the data are available for free, as is the case with data from the Moderate Resolution Imaging Spectroradiometer (MODIS), an instrument onboard two NASA satellites.

Data derived from instruments such as MODIS that are useful for estimating GPP and NEE include land surface temperatures and vegetation indices. These indices describe the health or density of vegetation. Numerous studies have been conducted to estimate the GPP of ecosystems such as forests, croplands and grasslands using remote sensing data. A recent study by Wu and co-workers, from the Chinese Academy of Sciences, is one of the few studies to focus on the GPP of wetlands.

For an estuarine wetland in eastern China, the authors extracted various biogeophysical parameters from MODIS data, and correlated these parameters with field-measured GPP. When validating their model, the authors found that they were able to estimate the GPP of their study site with a high degree of accuracy. However, they caution that their model cannot simply be taken as-is and applied to different climates and ecosystems.

As the authors demonstrate, certain relationships in the model can vary from one ecosystem to another. For example, the relationship that uses land surface temperatures and vegetation indices to estimate light use efficiency, a measure of carbon fixation via photosynthesis, will be different for a boreal forest relative to an estuarine wetland. As yet, the model has only been validated for one particular wetland across several 8-day periods. Thus, the model still needs to be tested on other wetlands and other ecosystem types to evaluate its performance within and across ecosystem types.
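To make the light-use-efficiency idea concrete, here is a toy GPP calculation of the kind such models perform, with LUE scaled by a vegetation index and a temperature response; the coefficients and functional forms are placeholders, not the calibrated relationships from the paper.

```python
# Toy light-use-efficiency (LUE) style GPP estimate driven by remote-sensing
# inputs, in the spirit of the approach described above. The coefficients and
# the way LUE is scaled by a vegetation index (EVI) and land surface
# temperature (LST) are placeholders, not the paper's calibrated model.

def light_use_efficiency(evi: float, lst_c: float) -> float:
    """Toy LUE (g C per MJ of absorbed PAR): rises with greenness and
    peaks near an assumed optimum land surface temperature of 25 C."""
    temperature_factor = max(0.0, 1.0 - ((lst_c - 25.0) / 20.0) ** 2)
    return 1.8 * evi * temperature_factor

def gpp_g_c_per_m2_day(evi: float, lst_c: float,
                       par_mj_m2_day: float, fapar: float) -> float:
    """GPP = LUE * fAPAR * PAR, a widely used LUE formulation."""
    return light_use_efficiency(evi, lst_c) * fapar * par_mj_m2_day

print(gpp_g_c_per_m2_day(evi=0.55, lst_c=27.0, par_mj_m2_day=10.0, fapar=0.6))
```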

Despite the relative newness of this model, its performance so far combined with the free availability, high temporal resolution, and wide spatial coverage of the MODIS data it requires mean it shows great potential for simplifying the way in which we estimate an ecosystem’s ability to sequester carbon.


To access the full article and read more about using satellite data to “see” how well wetlands take up carbon, download a copy for free* by clicking the link below.

Combining remote sensing and eddy covariance data to monitor the gross primary production of an estuarine wetland ecosystem in East China
Mingquan Wu, Shakir Muhammad, Fang Chen, Zheng Niu and Changyao Wang
Environ. Sci.: Processes Impacts, 2015, 17, 753-762
DOI: 10.1039/C5EM00061K

—————-

About the webwriter

Abha Parajulee is a Ph.D. student at the University of Toronto Scarborough. She is interested in water resources and the behavior of organic contaminants in urban environments.

—————-

* Access is free until 24/07/2015 through a registered RSC account.


Breaking the mould: assessing microbial pollution in the indoor environment

Understanding the levels and behaviour of mould and fungal particles in the indoor environment is essential in order to carry out more comprehensive exposure assessments and to analyse their associated health effects. Researchers at Korea University and the National Institute of Environmental Research in Seoul present a study describing a valuable method to achieve this.

Microbial pollution (for example from bacteria and fungi) can present a serious public health risk. A particular concern has been raised regarding filamentous fungi (better known as mould). Indoor mould is primarily caused by excess moisture, for example due to leaking pipes, rising damp or rain seepage. The World Health Organisation warns that mould is a key element of indoor pollution, linked with triggering and exacerbating allergic symptoms and diseases such as asthma and other respiratory illnesses.

Children are particularly sensitive to allergen exposure and, because they spend considerable time indoors, they could be vulnerable to these serious health effects if the indoor environment is not well maintained. Elementary (primary) school children spend upwards of six hours a day in school, mostly in one classroom. This emphasises the importance of minimising the risk of developing or worsening allergic diseases due to pollutants in the school environment. While there has been considerable interest in assessing the problem of mould in school classrooms, as yet few comprehensive assessments have been performed.

Image: Household mould, a major source of indoor microbial pollution and associated health effects (Source: http://www.yourlocalguardian.co.uk).

Traditionally, studies of microbial pollution from mould involve making spore counts by culturing from collected samples, owing to the ease of sampling and analysis. However, this method does not allow adequate evaluation of exposure to mould because different types of mould grow at different rates. Assessment of spore concentration is therefore not adequate on its own to fully investigate human exposure to fungi.

More recently, the measurement of (1,3)-β-D-glucan has been proposed as a new tool to determine microbial pollution. This compound exists in fungal cell walls and is more common in airborne spores, so it is proposed that analysis of (1,3)-β-D-glucan in small-scale fungal fragments could be applied for an exposure assessment of mould and associated health effects in the school environment. This research by SungChul Seo and co-workers presents the first study of its kind to apply this approach to a relatively large-scale assessment of actual classrooms.

In the study, the levels of small-size (submicron) fungal fragments, as well as airborne moulds, bacteria and particulate matter (PM10), were evaluated in both indoor and outdoor areas of 70 classrooms in 8 elementary schools, and the variation in concentrations before and after the rainy season (May and July) was investigated. The concentrations of (1,3)-β-D-glucan in submicron fungal fragments, airborne mould and bacteria, and PM10 were measured simultaneously, and the association of these levels with physical factors (i.e. temperature and relative humidity) was also compared and analysed.

The results indicate that the indoor/outdoor ratios of (1,3)-β-D-glucan concentrations were greater than 1 in every school. It was also shown that in the sampling period after the rainy season, the (1,3)-β-D-glucan concentrations decreased by about 35%, and similar significant decreases were observed in the concentrations of airborne mould, bacteria and PM10. A negative association between the concentration of submicron fungal fragments and relative humidity was also observed.

This study therefore provides valuable insights into the concentrations, behaviour and variability of microbial pollution associated with mould in the school environment. This clearly outlines some of the key practical considerations required to carry out assessments of fungal exposure and could pave the way for similar studies in different locations. The authors note that more comprehensive exposure assessments for smaller-sized fungal particles should be performed for better understanding of their health impact, particularly with regard to seasonal and temporal changes.


To access the full article, download a copy for free* by clicking the link below.

Submicron fungal fragments as another indoor biocontaminant in elementary schools
SungChul Seo, Yeong Gyu Ji, Young Yoo, Myung Hee Kwon and Ji Tae Choung
Environ. Sci.: Processes Impacts, 2015, Advance Article
DOI: 10.1039/c4em00702f

—————-

About the webwriter

Ian Keyte is a Doctoral Researcher at the University of Birmingham. His research focuses on the sources, behaviour and fate of polycyclic aromatic hydrocarbons (PAHs) in the atmosphere.

—————-

* Access is free until 08/07/2015 through a registered RSC account.


A robot that identifies toxic bacteria in shallow water

Anabaena are a group of cyanobacteria species that populate shallow water areas. They are a bit Janus-faced, both good and bad for the ecosystem at the same time. On the one hand, as many bacteria do, they fix nitrogen. They are also rich in chlorophyll and thus able to photosynthesise. On the other hand, they produce a handful of neurotoxins. These toxins are a huge risk to any other living creature around, even human beings if they happen to drink contaminated water.

Hence, knowing the density and coverage area of cyanobacteria in a particular ecosystem is useful for assessing ecological risk. However, in very shallow water fieldwork becomes harder: divers cannot work well and boats can hardly approach the study zone. Moreover, Anabaena species often form filaments that attach to rocks, seagrass and algae. These filaments are very fragile, so the system can easily be perturbed if someone moves around to collect samples.

Image 1. The USV robot, equipped with a range of sensors and a video camera.

But J. Gutiérrez and colleagues came up with a pretty elegant solution. They designed a missile-shaped robot able to sail the surface of shallow lakes and ponds and measure the amount of Anabaena there without perturbing the ecosystem. This unmanned surface vehicle (USV) features a GPS that tracks its position at all times, as well as a variety of chemical sensors that simultaneously record the concentrations of different ions (nitrate, ammonium). It can measure conductivity and temperature, too. All of this happens while the vehicle is operated from the lab using a radio-frequency remote control.

The USV also carries a camera, equipped with an image stabilisation system that was improved by Gutiérrez’s team. The camera records video at all times. This footage is then analysed frame by frame with the help of a state-of-the-art computer programme that identifies Anabaena filaments or colonies in every picture.
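As a very rough sketch of what a frame-by-frame coverage estimate involves, the snippet below flags pixels whose green channel dominates and reports the fraction of the frame they cover; the authors' actual image-analysis pipeline is far more sophisticated, and the threshold here is an arbitrary placeholder.

```python
import numpy as np

# Toy frame-by-frame coverage estimate: classify pixels whose green channel
# dominates as "filament candidates" and report the fraction of the frame
# they occupy. Thresholds are arbitrary placeholders for illustration only.

def patch_coverage(frame_rgb: np.ndarray, green_margin: int = 30) -> float:
    """Fraction of pixels in an RGB frame (H x W x 3, uint8) flagged as
    likely filament/colony, using a naive greenness threshold."""
    r = frame_rgb[..., 0].astype(int)
    g = frame_rgb[..., 1].astype(int)
    b = frame_rgb[..., 2].astype(int)
    mask = (g > r + green_margin) & (g > b + green_margin)
    return float(mask.mean())

# Example with a synthetic 100x100 frame containing a green patch.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[20:40, 30:70, 1] = 200            # bright green rectangle
print(patch_coverage(frame))            # ~0.08 (8% of the frame)
```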

Combining these data with the different physicochemical measurements obtained by the sensors, the researchers are able to quantify and locate the toxic bacteria. The USV robot allows them to obtain very promising results at a fraction of the cost of other on-water monitoring systems.


To access the full article, download a copy for free* by clicking the link below.

On-water remote monitoring robotic system for estimating the patch coverage of Anabaena sp. filaments in shallow water
E. Romero-Vivas, F. D. Von Borstel, C. J. Pérez-Estrada, D. Torres-Ariño, J. F. Villa-Medina and J. Gutiérrez
Environ. Sci.: Processes Impacts, 2015, Advance Article
DOI: 10.1039/c5em00097a

—————-

About the webwriter

Fernando Gomollon-Bel is a PhD Student at the ISQCH (CSIC-University of Zaragoza). His research focuses on asymmetric organic synthesis using sugars as chiral-pool starting materials towards the production of fungal transglycosidase inhibitors.

—————-

* Access is free until 05/07/2015 through a registered RSC account.


Tracing the impact of the Fukushima nuclear accident

Monitoring the environmental effects of major nuclear incidents is an essential but complex undertaking. The use of the iodine-129 isotope as a tracer for detecting nuclear pollution in the environment has been widely established. The present study, by researchers from the Horia Hulubei National Institute for Physics and Engineering in Bucharest, Romania, applies this method to assess the impact of radioactive releases after the 2011 Fukushima accident in Japan.

On March 11th 2011, a magnitude 9.0 earthquake struck 130 km off the east coast of Japan, causing a 15 m tsunami which inundated about 560 sq km of land. In addition to causing a death toll of over 19,000 and destroying or damaging over a million coastal buildings, the tsunami disabled the power supply and cooling of three nuclear reactors at the Fukushima Daiichi plant, causing a major nuclear accident. Over 100,000 people were evacuated from the area, and an exclusion zone of 20 km around the plant was established. A comprehensive account of the causes and effects of the Fukushima incident is provided by the World Nuclear Association.

Image: The exclusion zone at the site of the Fukushima nuclear incident. Source: Thom Davies (http://thomdavies.com/tag/fukushima/).

In the wake of the disaster, much research and discussion has been undertaken to assess its environmental impact and the future safety of nuclear energy generation. Experts estimate that the Fukushima incident caused the largest ever direct release of radioactive material, such as iodine-129 and caesium-137 isotopes, into the Pacific Ocean. The nature and extent of how this nuclear material is transported and dispersed in the ocean require careful consideration in order to assess the full environmental impact of the accident.

There is considerable uncertainty regarding the eastward migration of this radioactive plume in the Pacific Ocean, and efforts to locate and characterise its position and movement have so far been unsuccessful. Iodine-129 has a long environmental residence time (its half-life is 15.7 million years) and relatively low bioavailability, and even in very small quantities it can act as a sensitive tracer of the radioactive plume in ocean waters. This study by C. Stan-Sion and co-workers therefore used iodine-129 to determine the impact of the nuclear plume on the West Coast of the USA, roughly two years after the Fukushima incident.

For this research, ocean water samples were collected in La Jolla, San Diego, California, on the West Coast of the USA (approx. 8770 km east of Fukushima) between April and July of 2013. Accelerator Mass Spectrometry (AMS) was used to determine the iodine-129/iodine-127 ratio in the collected samples. Its very high sensitivity for measuring such isotopes was the main reason why this analytical method was chosen.

The results showed two sudden increases in the iodine-129/iodine-127 isotopic ratio in the ocean water during late spring of 2013. The iodine-129/iodine-127 ratio was more than a factor of 2.5 higher in USA West Coast water samples than in those measured 40 km offshore of Fukushima immediately after the accident.

Map of Fukushima study area

Also, compared with pre-Fukushima background values, the results of this study show an isotopic ratio about two orders of magnitude higher. Based on these results, the authors calculated that the plume travelled with an average speed of approximately 12 cm s-1, which is consistent with the zonal current speed in the Pacific Ocean.
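That average speed can be sanity-checked with the distance and timing quoted above, taking roughly 2.2 years between the accident and the spring 2013 sampling window as an assumption:

```python
# Back-of-the-envelope check of the reported plume speed: distance from
# Fukushima to La Jolla (~8770 km, as quoted above) divided by the elapsed
# time between the accident (March 2011) and the sampling window (spring 2013).
distance_cm = 8770 * 1000 * 100          # km -> cm
elapsed_s = 2.2 * 365.25 * 24 * 3600     # ~2.2 years, in seconds
print(distance_cm / elapsed_s)           # roughly 12-13 cm per second
```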

This investigation therefore demonstrates how iodine-129/iodine-127 isotopic ratios can be used to assess the impact of a nuclear accident in locations far removed from the accident site. Finally, the authors also coupled their iodine-129 results with the results of the Ka’imikai-O-Kanaloa international expedition in June 2011 to assess the activity of other radioactive isotopes such as hydrogen-3 and caesium-137; the activity of these radio-isotopes was compliant with the international regulatory limits.


To access the full article, download a copy for free* by clicking the link below.

AMS analyses of I-129 from the Fukushima Daiichi nuclear accident in the Pacific Ocean waters of the Coast La Jolla – San Diego, USA
C. Stan-Sion, M. Enachescu and A. R. Petre
Environ. Sci.: Processes Impacts, 2015, 17, 932-938
DOI: 10.1039/c5em00124b

—————-

About the webwriter

Ian Keyte is a Doctoral Researcher at the University of Birmingham. His research focuses on the sources, behavior and fate of polycyclic aromatic hydrocarbons (PAHs) in the urban atmosphere.

—————-

* Access is free until 21/06/2015 through a registered RSC account.


Using fluorescence to measure the quality of water

We use fluorescence to identify counterfeit money. Could we use a very similar technique to assess the quality of fresh water? K. Khamis and colleagues suggest using tryptophan-like fluorometers to do just that. In their paper, they discuss a novel way of measuring pollutants with this technique and also propose new mathematical corrections to better conduct in situ experiments, thus avoiding the storage and transportation of samples.

When water is polluted with sewage or farm waste, the amount of dissolved organic matter (DOM) increases. Organic matter contains sugars, lipids, nucleic acids and proteins. The latter are biomolecules with a very interesting property: fluorescence. Folded proteins are fluorescent mainly because of their tryptophan residues, which absorb light at around 280 nm and emit at around 350 nm. As a consequence, fluorescence can be directly correlated with the quantity of organic waste dissolved in water.

However, very few fluorescence sensors have been developed to measure DOM in freshwater, mostly because freshwater systems are quite dynamic in space and time. Moreover, certain factors such as temperature or suspended inorganic particles often alter the measurements. Higher temperatures allow excited electrons to return to their ground state without emitting any fluorescence, and soil particles can scatter light and reduce the fluorescence signal by up to 80%.

But Khamis and his team did not see a problem in this. On the contrary, they saw it as an opportunity to develop new tryptophan-like sensors and state-of-the-art algorithms to minimise the effect of quenchers like temperature or soil particles. The researchers deployed in situ detectors near Birmingham to study urban streams and near Nottingham to study groundwater; they also took a wide set of samples that were analysed in the lab.

The data obtained from these analyses were then compared with the in situ measurements. Using these two different groups of data, the team developed mathematical models to compensate for the effect of quenchers. These algorithms were fundamental to ensuring the accuracy of the quantifications: once the corrections were applied, the in situ and laboratory results correlated much better.
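To illustrate the kind of corrections such an algorithm might apply, here is a minimal sketch with a linear temperature-compensation term and an exponential turbidity correction; the functional forms and coefficients are placeholders, not the relationships actually derived in the paper.

```python
import math

# Minimal sketch of the kind of corrections such an algorithm might apply:
# a linear temperature-compensation term referenced to 20 C and an empirical
# exponential correction for signal lost to turbidity. The coefficients are
# placeholders; the paper derives its own values from paired in situ and
# laboratory measurements.

def correct_fluorescence(f_measured: float, temp_c: float,
                         turbidity_ntu: float, temp_ref_c: float = 20.0,
                         rho_per_deg_c: float = -0.01,
                         turbidity_coeff: float = 0.004) -> float:
    """Return fluorescence referenced to temp_ref_c and compensated for
    attenuation by suspended particles."""
    f_temp = f_measured / (1.0 + rho_per_deg_c * (temp_c - temp_ref_c))
    return f_temp * math.exp(turbidity_coeff * turbidity_ntu)

print(correct_fluorescence(f_measured=80.0, temp_c=25.0, turbidity_ntu=150.0))
```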

Thanks to these promising results, scientists may soon be able to develop cheap, small, highly accurate tryptophan-like pollution sensors for freshwater. These detectors could easily be used in the field, completely eliminating the need to collect, preserve, store and carry around thousands of samples.


To access the full article, download a copy for free* by clicking the link below:

In situ tryptophan-like fluorometers: assessing turbidity and temperature effects for freshwater applications
K. Khamis, J. P. R. Sorensen, C. Bradley, D. M. Hannah, D. J. Lapworth and R. Stevens
Environ. Sci.: Processes Impacts, 2015, 17, 740-752
DOI: 10.1039/C5EM00030K

—————-

About the webwriter

Fernando Gomollon-Bel is a PhD Student at the ISQCH (CSIC-University of Zaragoza). His research focuses on asymmetric organic synthesis using sugars as chiral-pool starting materials towards the production of fungal transglycosidase inhibitors.

—————-

* Access is free until 14/06/2015 through a registered RSC account.


Passive sampling for improved monitoring of indoor air quality

Indoor air pollutants require careful and accurate monitoring in order to ensure people are not exposed to levels that could cause adverse health effects. This study, commissioned by the U.S. Department of Defense, demonstrates the usefulness of diffusive passive air samplers for this purpose.

People can potentially be exposed to dangerous levels of air pollutants indoors, either directly from indoor sources or from polluted air transported from outside. The WHO has acknowledged the threat of poor indoor air quality, identifying a number of specific pollutants of concern and providing guidance on the health effects and recommended guideline exposure limits associated with these contaminants. One class of pollutants causing concern is volatile organic compounds (VOCs). The USEPA warns that VOCs have been linked with a number of adverse health effects, including damage to the kidneys, liver and central nervous system, as well as eye, nose and skin irritation, headaches and nausea.

Domestic combustion sources, such as heating and cooking, are known to be important sources of VOCs; however, an emerging source of interest is the intrusion of vapour from contaminated soil or groundwater into a building. Careful monitoring of VOCs is clearly needed to carry out appropriate risk assessment and to identify the relative contribution of this intrusive source to total VOC concentrations in the indoor environment. Conventional air sampling methods typically do not allow monitoring for more than a 24-hour period. However, due to the temporal variability of some VOCs, longer sampling durations are needed to better represent long-term average concentrations.

Passive diffusive samplers are therefore well suited to this application, as they are able to detect relatively low concentrations of contaminants over relatively long sampling durations. Passive air samplers commonly use a sorbent material to trap VOCs, which reach the sampler via diffusive transfer from the surrounding atmosphere. A time-weighted average concentration can then be calculated in the lab. Passive sampling has been used for indoor air quality monitoring in occupational settings for decades, but its application to monitoring subsurface vapour intrusion to indoor air requires further work to assess capabilities and limitations at lower concentrations and longer exposure durations.
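The back-calculation of that time-weighted average is straightforward once the sampler's uptake rate is known; a minimal sketch, with hypothetical values, is shown below.

```python
# How a time-weighted average (TWA) concentration is typically back-calculated
# from a passive sampler: mass collected on the sorbent divided by the
# sampler's uptake rate and the exposure time. Values below are hypothetical.

def twa_concentration_ug_m3(mass_ng: float, uptake_rate_ml_min: float,
                            exposure_days: float) -> float:
    """TWA concentration in ug/m3 from collected mass (ng), uptake rate
    (mL/min) and exposure duration (days)."""
    sampled_volume_m3 = uptake_rate_ml_min * exposure_days * 24 * 60 / 1e6
    return (mass_ng / 1000.0) / sampled_volume_m3

# e.g. 250 ng of a VOC collected over 14 days at an uptake rate of 1 mL/min
print(twa_concentration_ug_m3(mass_ng=250, uptake_rate_ml_min=1.0,
                              exposure_days=14))
```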

Image: Passive diffusive air samplers, shown in this study to be an effective means of monitoring indoor air quality.

This article by Todd McAlary and co-workers, combining the work of research laboratories in Canada, the USA, the UK and Italy, describes laboratory testing of passive diffusive samplers for assessing indoor air concentrations of VOCs, in order to demonstrate and validate their potential usefulness.

Four different passive samplers, utilising different types of sorbent, were tested under a wide range of controlled laboratory conditions (temperature, humidity, VOC concentration and sampling duration), with review from leading experts on each sampler type. Ten different VOCs were measured, including aliphatic compounds such as alkanes and alkenes as well as aromatic compounds such as benzene and naphthalene, in order to cover a range of physicochemical properties and include some compounds that pose a challenge to sampling.

The results demonstrate that passive samplers can potentially provide data that are more representative of long-term average indoor air concentrations than conventional methods limited to shorter sample durations. The passive samplers provided data that met a set of success criteria for most of the compounds tested, although some compounds were identified as more problematic. The study provides a unique and valuable new body of data on indoor air quality monitoring. However, the authors also caution that passive sampling programmes will need to be supplemented by quality assurance measures. For example, outdoor air samples should be taken simultaneously with indoor sampling to help ascertain the relative contributions of different pollution sources.

To access the full article, download a copy for free* by clicking the link below:

Passive sampling for volatile organic compounds in indoor air – controlled laboratory comparison of four sampler types
Todd McAlary, Hester Groenevelt, Stephen Disher, Jason Arnold, Suresh Seethapathy, Paolo Sacco, Derrick Crump, Brian Schumacher, Heidi Hayes, Paul Johnson and Tadeusz Górecki
Environ. Sci.: Processes Impacts, 2015, Advance Article
DOI: 10.1039/c4em00560k

—————-

About the webwriter

Ian Keyte is a Doctoral Researcher at the University of Birmingham. His research focuses on the sources, behavior and fate of polycyclic aromatic hydrocarbons (PAHs) in the atmosphere.

—————-

* Access is free until 07/06/2015 through a registered RSC account.
