Wednesday, December 01, 2010

Biofuels Production Has Unintended Consequences on Water Quality and Quantity in Mississippi


Growing corn for biofuels production is having unintended effects on water quality and quantity in northwestern Mississippi.
More water is required to produce corn than to produce cotton in the Mississippi Delta, requiring increased withdrawals of groundwater from the Mississippi River Valley alluvial (MRVA) aquifer for irrigation. This is contributing to already-declining water levels in the aquifer. In addition, the heavier use of nitrogen fertilizer for corn compared with cotton could contribute to low dissolved-oxygen conditions in the Gulf of Mexico.
These are some of the key findings from a study conducted by the U.S. Geological Survey (USGS) to assess water quality and quantity in the Mississippi Delta, in relationship to biofuels production.
"Because corn uses 80 percent more water for irrigation than cotton, replacing cotton with corn will decrease water levels," according to Heather Welch, USGS hydrologist and author of the USGS report. Declining water levels in the MRVA aquifer are particularly significant in the Mississippi Delta, where the infiltration of rainfall to replenish the aquifer is low. "This is a low, flat area. When it does rain, much of the precipitation is lost through evapotranspiration and to streamflow, so the rainwater never reaches the aquifer," explains Welch.
In 2006, the U.S. Department of Energy Biomass Program implemented the Biofuels Initiative. The initiative calls for ethanol to replace 30 percent of gasoline consumption by 2030 and for the cost of ethanol to become competitive with gasoline by 2012. In the Mississippi Delta, implementation of this initiative resulted in a 47-percent decrease in cotton acreage and a corresponding 288-percent increase in corn acreage from 2006 to 2007.
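Those acreage shifts are simple percent changes. A quick sketch, with placeholder acreage figures chosen only to reproduce the reported percentages (the article does not give the underlying USGS acreage numbers):

```python
def percent_change(old, new):
    """Relative change from old to new, in percent."""
    return (new - old) / old * 100.0

# Hypothetical acreage figures, for illustration only
cotton_2006, cotton_2007 = 1_200_000, 636_000
corn_2006, corn_2007 = 90_000, 349_200

print(round(percent_change(cotton_2006, cotton_2007)))  # -47
print(round(percent_change(corn_2006, corn_2007)))      # 288
```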
Using the USGS SPARROW model (SPAtially Referenced Regressions On Watershed attributes), scientists found that the conversion of cotton to corn acreage (comparing 2007 to 2002) is estimated to have increased the nitrogen load for the Yazoo River by 7 percent. The Yazoo River Basin has been identified as a contributor of nitrogen to the Gulf of Mexico. Levels of nitrogen in the Gulf of Mexico have resulted in low dissolved-oxygen conditions that can harm fish and bottom-dwelling organisms.
Locally, water level declines and decreasing water quality contribute to the Delta's poor ecosystem health. "We are seeing a loss of habitat complexity, and lowered water levels have decreased baseflow to streams," says Jeannie Barlow, USGS hydrologist and co-author of the study. "Some streams have remained dry for months in the summer and fall during periods of low rainfall."
According to data provided by the Yazoo Mississippi Delta Joint Water Management District (YMD), the total amount of water stored in the aquifer has declined since 1980, and current withdrawals from the aquifer are greater than the amount of water entering the aquifer.
These USGS findings provide essential scientific information about the effects of corn-based ethanol on water resources that Delta producers can use when making their planting decisions.
This study was sponsored by scientists from the Energy Resources Program and the National Water Quality Assessment Program and conducted by scientists from the USGS Mississippi Water Science Center.

Quartz Crystal Microbalances Enable New Microscale Analytic Technique


A NIST researcher prepares quartz crystal microbalance disks with samples of carbon nanotubes for microscale thermogravimetric analysis. Typical sample sizes are about 2 microliters, or about 1 microgram.
A new chemical analysis technique developed by a research group at the National Institute of Standards and Technology (NIST) uses the shifting ultrasonic pitch of a small quartz crystal to test the purity of only a few micrograms of material. Since it works with samples close to a thousand times smaller than those required by comparable commercial instruments, the new technique should be an important addition to the growing arsenal of measurement tools for nanotechnology, according to the NIST team.
As the objects of scientific research have gotten smaller and smaller -- as in nanotechnology and gene therapy -- the people who worry about how to measure these things have been applying considerable ingenuity to develop comparable instrumentation. This new NIST technique is a riff on thermogravimetric analysis (TGA), an imposing name for a fairly straightforward concept. A sample of material is heated, very slowly and carefully, and changes in its mass are measured as the temperature increases. The technique measures the reaction energy needed to decompose, oxidize, dehydrate, or otherwise chemically change the sample with heat.
TGA can be used, for example, to characterize complex biofuel mixtures because the various components vaporize at different temperatures. The purity of an organic sample can be tested by the shape of a TGA plot because, again, different components will break down or vaporize at different temperatures. Conventional TGA, however, requires samples of several milligrams or more of material, which makes it hard to measure very small, laboratory-scale powder samples -- such as nanoparticles -- or very small surface chemistry features such as thin films.
What's needed is an extremely sensitive "microbalance" to measure the minute changes in mass. The NIST group found one in the quartz crystal microbalance, essentially a small piezoelectric disk of quartz sandwiched between two electrodes. An alternating current across the electrodes causes the crystal to vibrate at a stable and precise ultrasonic frequency -- the same principle as a quartz crystal watch. Added mass (a microsample) lowers the resonant frequency, which climbs back up as the microsample is heated and breaks down.
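The frequency-to-mass relationship behind the quartz crystal microbalance is the classical Sauerbrey equation. A minimal sketch of the conversion, using standard AT-cut quartz constants rather than the parameters of NIST's specific crystals:

```python
import math

RHO_Q = 2.648     # density of quartz, g/cm^3
MU_Q = 2.947e11   # shear modulus of AT-cut quartz, g/(cm*s^2)

def sauerbrey_mass(delta_f_hz, f0_hz, area_cm2):
    """Mass added to the crystal (grams) inferred from the frequency drop,
    via the Sauerbrey relation: delta_f = -2 f0^2 dm / (A sqrt(rho * mu))."""
    c = 2.0 * f0_hz ** 2 / math.sqrt(RHO_Q * MU_Q)  # sensitivity, Hz*cm^2/g
    return -delta_f_hz * area_cm2 / c

# A 5 MHz crystal whose pitch drops 57 Hz over 1 cm^2 has gained about 1 microgram
m = sauerbrey_mass(-57.0, 5e6, 1.0)
print(f"{m * 1e6:.2f} micrograms")  # 1.01 micrograms
```

The relation assumes a thin, rigid, evenly distributed film; the heating step then shows up as the frequency climbing back as mass is lost.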
In a new paper, the NIST materials science group demonstrates that their microbalance TGA produces essentially the same results as a conventional TGA instrument, but with samples about a thousand times smaller. They can detect not only the characteristic curves for carbon black, aluminum oxide and a sample organic fluid, but also the more complex curves of mixtures.
"We started this work because we wanted to analyze the purity of small carbon nanotube samples," explains analytical chemist Elisabeth Mansfield. More recently, she says, they've applied the technique to measuring the organic surface coatings biologists put on gold nanoparticles to modify them for particular applications. "Measuring how much material coats the particles' surface is very hard to do right now," she says. "It will be a really unique application for this technique."
The prototype apparatus requires that the frequency measurements be made in a separate step from the heating. Currently, the team is at work integrating the microbalance disks with a heating element to enable the process to be simultaneous.

Functional Amino Acids Regulate Key Metabolic Pathways


Functional amino acids play a critical role in the development of both animals and humans, according to Dr. Guoyao Wu, AgriLife Research animal nutritionist. 
Functional amino acids play a critical role in the development of both animals and humans, according to a Texas AgriLife Research scientist.
In an article in the American Society for Nutrition's journal Advances in Nutrition (1:31-37, 2010), Dr. Guoyao Wu, AgriLife Research animal nutritionist and senior faculty fellow in the department of animal science at Texas A&M University, calls for scientists to "think out of the box" and place more emphasis on this area of study.
"We need to move forward and capitalize on the potential of functional amino acids in improving health and animal production," he said.
A functional amino acid is an amino acid that can regulate key metabolic pathways to improve health, growth, development and reproduction of animals and humans, Wu said.
"This involves cell signaling through amino acids and their metabolites, and the metabolic pathways may include protein synthesis, antioxidative reactions and oxidation of energy substrates," he said. "A functional amino acid can be either a 'nonessential' or an 'essential' amino acid."
Past research has focused primarily on essential amino acids. However, Wu says both essential and nonessential amino acids should be taken into consideration.
"This is important when formulating balanced diets to maximize growth performance in livestock species, poultry and fish," he said. "It is also recommended that nonessential amino acids be provided to humans to prevent growth retardation and chronic diseases."
Wu's previous research found that arginine, an amino acid, provides many benefits for growth and embryo development in pigs, sheep and rats. Arginine also aids in fighting obesity. Wu has identified amino acids and health as an important area for expanded research.
"Currently in the U.S., more than 60 percent of adults are overweight or obese," he said. "Globally, more than 300 million adults are obese and more than 1 billion are overweight. Also, a large number of children in the U.S. and other countries are overweight or obese. The most urgent needs of new research in amino acids and health are the roles of functional amino acids in the treatment and prevention of obesity and its associated cardiovascular dysfunction."
Wu also said that dietary supplementation with arginine can help improve meat quality in pigs prior to slaughter.
The two top scientific discoveries in the field of amino acids and health over the past two decades, Wu notes, are nitric oxide synthesis from arginine and the role of amino acids in cell signaling.
"An important area of research in the next few years may be to study the molecular and cellular mechanisms whereby some amino acids (e.g., arginine) can regulate metabolic pathways in animals and humans," he said. "An example is how arginine reduces obesity and ameliorates the metabolic syndrome, and how elevated levels of leucine may contribute to mitochondrial dysfunction and insulin resistance (including vascular resistance) in obese subjects."
He said "unquestionably" recent advances in understanding functional amino acids are "expanding our basic knowledge of protein metabolism and transforming practices of nutrition worldwide."
Though nutritional studies conducted on animals have benefited human health, Wu cautions against extrapolating animal data directly to humans, as dietary requirements differ from one species to another.
Wu said that humans need diets with balanced portions of amino acids for cardiovascular and reproductive health.

Methane-Powered Laptops? Materials Scientists Unveil Tiny, Low-Temperature Methane Fuel Cells


Top view, cathode side, of a free-standing Pt/YSZ/Pt fuel cell showing characteristic buckling patterns. The cell width is 160 microns.
Making fuel cells practical and affordable will not happen overnight. It may, however, not take much longer.
With advances in nanostructured devices, lower operating temperatures, and the use of an abundant fuel source and cheaper materials, a group of researchers led by Shriram Ramanathan at the Harvard School of Engineering and Applied Sciences (SEAS) are increasingly optimistic about the commercial viability of the technology.
Ramanathan, an expert and innovator in the development of solid-oxide fuel cells (SOFCs), says they may, in fact, soon become the go-to technology for those on the go.
Electrochemical fuel cells have long been viewed as a potential eco-friendly alternative to fossil fuels -- especially as most SOFCs leave behind little more than water as waste.
The obstacles to using SOFCs to charge laptops and phones or drive the next generation of cars and trucks have remained reliability, temperature, and cost.
Fuel cells operate by converting chemical energy (from hydrogen or a hydrocarbon fuel such as methane) into an electric current. Oxygen ions travel from the cathode through the electrolyte toward the anode, where they oxidize the fuel to produce a current of electrons back toward the cathode.
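As a sanity check on the electrochemistry above, the maximum voltage a methane fuel cell can deliver follows from the Gibbs free energy of the overall reaction. This is a standard thermodynamic estimate, not a figure from the Harvard papers:

```python
FARADAY = 96485.0  # charge per mole of electrons, C/mol

def ideal_cell_voltage(delta_g_j_per_mol, n_electrons):
    """Maximum reversible cell voltage: E = -dG / (n * F)."""
    return -delta_g_j_per_mol / (n_electrons * FARADAY)

# CH4 + 2 O2 -> CO2 + 2 H2O transfers 8 electrons per methane molecule;
# the standard Gibbs free energy of the reaction is about -818 kJ/mol
e = ideal_cell_voltage(-818e3, 8)
print(f"{e:.2f} V")  # 1.06 V
```

Real SOFCs operate below this ceiling because of ohmic and electrode losses, which is part of why low-temperature operation without sacrificing performance is hard.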
That may seem simple enough in principle, but until now, SOFCs have been more suited for the laboratory rather than the office or garage. In two studies appearing in the Journal of Power Sources this month, Ramanathan's team reported several critical advances in SOFC technology that may quicken their pace to market.
In the first paper, Ramanathan's group demonstrated stable and functional all-ceramic thin-film SOFCs that do not contain any platinum.
In thin-film SOFCs, the electrolyte is reduced to a hundredth or even a thousandth of its usual scale, using densely packed layers of special ceramic films, each just nanometers in thickness. These micro-SOFCs usually incorporate platinum electrodes, but platinum electrodes can be expensive and unreliable.
"If you use porous metal electrodes," explains Ramanathan, "they tend to be inherently unstable over long periods of time. They start to agglomerate and create open circuits in the fuel cells."
Ramanathan's platinum-free micro-SOFC eliminates this problem, resulting in a win-win: lower cost and higher reliability.
In a second paper published this month, the team demonstrated a methane-fueled micro-SOFC operating at less than 500 °C, a feat that is relatively rare in the field.
Traditional SOFCs operate at about 800-1,000 °C, temperatures practical only for stationary power generation. In short, using them to power up a smartphone mid-commute is not feasible.
In recent years, materials scientists have been working to reduce the required operating temperature to about 300-500°C, a range Ramanathan calls the "sweet spot."
Moreover, when fuel cells operate at lower temperatures, material reliability is less critical -- allowing, for example, the use of less expensive ceramics and metallic interconnects -- and the start-up time can be shorter.
"Low temperature is a holy grail in this field," says Ramanathan. "If you can realize high-performance solid-oxide fuel cells that operate in the 300-500°C range, you can use them in transportation vehicles and portable electronics, and with different types of fuels."
The use of methane, an abundant and cheap fuel and the main component of natural gas, in the team's SOFC was also of note. Until recently, hydrogen has been the primary fuel for SOFCs. Pure hydrogen, however, requires significant processing.
"It's expensive to make pure hydrogen," says Ramanathan, "and that severely limits the range of applications."
As methane begins to take over as the fuel of choice, the advances in temperature, reliability, and affordability should continue to reinforce each other.
"Future research at SEAS will explore new types of catalysts for methane SOFCs, with the goal of identifying affordable, earth-abundant materials that can help lower the operating temperature even further," adds Ramanathan.
Fuel cell research at SEAS is funded by the same NSF grant that enabled the "Robobees" project led by Robert J. Wood, Assistant Professor of Electrical Engineering. Wood and Ramanathan hope that micro-SOFCs will provide the tiny power source necessary to get the flying robots off the ground.
Ramanathan's co-authors on the papers were Bo Kuai Lai, a Research Associate at SEAS, and Ph.D. candidate Kian Kerman '14.

Monday, November 29, 2010

RavenSkin insulation stores up daytime heat for release when temperatures drop

RavenSkin insulation delays heat transfer for when temperatures drop

RavenBrick, the company that brought us the smart tinting RavenWindow, has added to its portfolio of temperature-regulating building materials with RavenSkin. Unlike traditional insulation that blocks all heat equally, this innovative wall insulation material absorbs heat during the day to keep the interior cool and slowly releases the stored heat at night to warm the building when the sun goes down.
The core of the system is a phase change material (PCM) that delays the transfer of heat energy from the sun to the interior of the house. PCMs are materials which melt and solidify at a certain temperature, absorbing or releasing heat when the material changes from solid to liquid or vice versa.
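The heat such a layer can buffer is dominated by the PCM's latent heat of fusion. A rough sketch, using paraffin-like properties as stand-ins, since RavenBrick does not disclose its actual PCM:

```python
def stored_heat_kj(mass_kg, latent_kj_per_kg,
                   specific_heat_kj_per_kg_k=2.0, delta_t_k=0.0):
    """Heat absorbed while the PCM melts, plus any sensible heating, in kJ."""
    return mass_kg * (latent_kj_per_kg + specific_heat_kj_per_kg_k * delta_t_k)

# 10 kg of a paraffin-like PCM (~200 kJ/kg latent heat) melting,
# plus 5 K of warming before and after the phase change
q = stored_heat_kj(10, 200, delta_t_k=5)
print(f"{q:.0f} kJ")  # 2100 kJ
```

The latent-heat term is what lets a thin panel soak up an afternoon's worth of solar gain at a nearly constant temperature and hand it back after dark.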
RavenSkin consists of an exterior layer of the company’s RavenWindow technology which reflects solar energy when it is hot and lets it pass through when cold. An air gap separates this outer layer from a glass layer that reflects back infrared (IR) to create a greenhouse effect in the air gap. Next is a layer of downconverter material that converts the incoming sunlight to IR below a certain temperature, before letting it pass through to the PCM layer which stores the converted IR energy. This energy is then released as heat through the back painted layer to the interior of the house when the temperature drops to a certain level. This interior surface can also be painted to suit your interior décor.
The current size of the individual panels, which are rated at R11 or greater, is limited to 55 x 55 inches (140 x 140 cm) and RavenBrick says they could provide savings of up to 100 percent on heating and cooling – depending on the building design and location. The company says its patent-pending technology is perfect for warehouses, sheds and off-grid housing and provides a speedy return on investment for skyscrapers and other commercial, industrial or institutional buildings.
Boeing subsidiary Spectrolab has announced it will mass-produce a 39.2 percent efficiency solar cell

When it comes to solar cells, everyone is chasing the highest conversion efficiency. Although we’ve seen conversion efficiencies of over 40 percent achieved with multi-junction solar cells in lab environments, Boeing subsidiary Spectrolab is bringing this kind of efficiency to mass production with the announcement of its C3MJ+ solar cells which boast an average conversion efficiency of 39.2 percent.
As far back as 2006 Spectrolab was achieving conversion efficiencies of over 40 percent in the lab with its high-efficiency multi-junction concentrator solar cells, and it reached a peak of 41.6 percent with a test cell last year, setting a new world record. The company's newest terrestrial concentrator photovoltaic (CPV) cell, called the C3MJ+, uses essentially the same technology as its record-breaking test cell and follows on from the C3MJ solar cell, in production since mid-2009, which boasts a conversion efficiency of 38.5 percent.
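Conversion efficiency relates incident light power to electrical output in a straightforward way. A minimal sketch; the cell area and concentration level are illustrative assumptions, since CPV cells are rated under concentrated sunlight:

```python
def electrical_output_w(efficiency, irradiance_w_m2, area_m2, concentration=1.0):
    """Electrical power out = efficiency * concentrated irradiance * cell area."""
    return efficiency * irradiance_w_m2 * concentration * area_m2

# A hypothetical 1 cm^2 cell at 39.2% efficiency under 500-sun
# concentration (1 sun = 1000 W/m^2)
p = electrical_output_w(0.392, 1000.0, 1e-4, concentration=500)
print(f"{p:.1f} W")  # 19.6 W
```

Concentration is the whole point of CPV: a tiny, expensive, high-efficiency cell sits behind cheap optics that funnel hundreds of suns onto it.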
"Given the new cells' close similarity to our existing production cells, we believe that our current C3MJ customers will be able to easily upgrade for more efficiency," said Russ Jones, Spectrolab director of CPV Business Development.
Spectrolab claims the title of the world’s leading supplier of solar cells for satellites with its cells supplying power to around 60 percent of satellites currently in orbit, as well as the International Space Station. Boeing hopes to transfer that success to the terrestrial solar cell market with the new high-efficiency solar cells that are expected to be available from January. And it won’t be resting on its laurels. It expects Spectrolab will achieve a 40 percent average production efficiency for terrestrial solar cells in 2011.

'Plastisoil' could mean cleaner rivers and less plastic waste

Plastisoil is a concrete-like substance made from discarded plastic bottles that rainwater can pass through instead of running into storm sewers
A new cement-like material that could be used to form sidewalks, bike and jogging paths, driveways and parking lots may be able to lessen two environmental problems: plastic waste and polluted rainwater runoff. The substance is called Plastisoil, and it was developed by Naji Khoury, an assistant professor of civil and environmental engineering at Temple University in Philadelphia. In order to make Plastisoil, discarded polyethylene terephthalate (PET) plastic bottles are pulverized and mixed with soil, and then that mixture is blended with a coarse aggregate and heated. The result is a hard yet permeable substance, similar to pervious concrete or porous asphalt.
With traditional concrete and asphalt paving, rainwater stays on the surface and runs into the storm sewers, accumulating oil and other road filth along the way. With pervious surfaces such as Plastisoil, that water is able to go down through them and into the soil below. This certainly reduces the amount of pollutants entering the rivers, and Khoury and his team at Temple are currently trying to determine whether Plastisoil could even serve as a filter that removes pollutants as the water passes through.
Khoury said that producing one ton of Plastisoil takes less energy than producing one ton of cement or asphalt, and that it's less expensive to manufacture than similar products. It takes 30,000 PET bottles to make one ton of the material, although he is hoping to be able to use other types of plastic in the future.
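The stated ratio implies a simple mass balance worth making explicit (assuming a metric ton, which the article does not specify):

```python
def grams_of_material_per_bottle(bottles_per_ton, ton_kg=1000):
    """Finished Plastisoil produced per bottle, implied by the stated ratio."""
    return ton_kg * 1000 / bottles_per_ton

# 30,000 bottles per ton -> roughly 33 g of finished material per bottle
print(f"{grams_of_material_per_bottle(30_000):.1f} g")  # 33.3 g
```

Since each bottle is combined with soil and aggregate, the PET itself accounts for only part of those 33 grams.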

'Glue' producing bacteria used to fill gaps in cracking concrete

Looks like a job for BacillaFilla (Image: Shaire Productions via Flickr)
Earlier this year we took a look at the development of self-healing concrete that repairs its own cracks using a built-in healing agent. While this kind of technology holds promise for construction in the future, it’s not so useful for the vast amounts of existing concrete in need of replacement or repair. UK researchers have come up with a solution to this problem that uses bacteria to produce a special "glue" to knit together cracks in existing concrete structures.

'BacillaFilla'

A team of students from Newcastle University genetically modified bacteria, programming the microbes to start germinating only when triggered by the specific pH of concrete. The researchers have also built in a self-destruct gene to ensure the bacteria are unable to survive in the environment if they fail to germinate. Once the cells have germinated, they swarm down the fine cracks in the concrete and know they have reached the bottom when they start clumping.
This clumping activates the cells to differentiate into three types: cells that produce calcium carbonate, cells that become filamentous to act as reinforcing fibers, and cells that produce levan, a polysaccharide glue that acts as a binding agent. The calcium carbonate and bacterial glue combine with the filamentous cells, ultimately hardening to the same strength as the surrounding concrete to form what the researchers have dubbed "BacillaFilla," which knits the concrete structure together again.

Helping cut CO2 emissions

With concrete production a major contributor of global carbon dioxide emissions, the BacillaFilla could provide a way to repair instead of replace existing concrete structures.
Joint project instructor Dr Jennifer Hallinan explains: “Around five per cent of all man-made carbon dioxide emissions are from the production of concrete, making it a significant contributor to global warming. Finding a way of prolonging the lifespan of existing structures means we could reduce this environmental impact and work towards a more sustainable solution.”
“This could be particularly useful in earthquake zones where hundreds of buildings have to be flattened because there is currently no easy way of repairing the cracks and making them structurally sound,” she added.
The Newcastle University students designed BacillaFilla as their entry for this year's International Genetically Engineered Machine (iGEM) competition, run out of MIT. The team won a Gold medal for their research, which was up against more than 130 other entries.

Solar-powered air-conditioning for vehicles developed

The solar-powered AC system test vehicle
The solar-powered AC system test vehicle

The more environmentally conscious among us still driving gasoline-powered cars often feel a pang of guilt as we turn on the air-conditioning on a hot day, knowing that we've just significantly reduced the fuel efficiency of the vehicle and sent more greenhouse gases into the atmosphere. While solar-powered AC systems – even portable ones – are nothing new, there's been a problem getting their size down to a point that would allow them to cool a vehicle. While cars may have to wait a bit longer, truck drivers look set to be spoiled for choice, with another solar-powered AC system joining the i-Cool Solar system we looked at earlier this month.
If there’s one place that AC is a necessity for a vehicle, it is the humid climes of Hong Kong. So it’s not surprising to see the development of a solar-powered AC system for vehicles come out of the Hong Kong Polytechnic University (PolyU) through a collaboration with industry partners, Green Power Industrial Ltd. and Swire Coca-Cola Hong Kong.
The system features a photovoltaic panel attached to the roof of the truck's cab, which collects solar energy to charge a specially made battery system that powers an electric motor driving a variable-frequency-drive (VFD) compressor, which produces the cooled air. This allows it to operate on cloudy or rainy days and, because it is a stand-alone system, the AC can be switched on when the vehicle engine isn't running.
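A rough energy-balance sketch shows how panel, battery, and compressor sizes interact in a stand-alone system like this; every number here is a hypothetical illustration, not a PolyU specification:

```python
def daily_charge_wh(panel_w_peak, sun_hours, system_efficiency=0.8):
    """Energy harvested per day, derated for charging and conversion losses."""
    return panel_w_peak * sun_hours * system_efficiency

def runtime_hours(battery_wh, compressor_w):
    """How long the battery alone can run the compressor."""
    return battery_wh / compressor_w

# Hypothetical: 600 W panel, 4 equivalent sun-hours, 2 kWh battery, 400 W compressor
print(daily_charge_wh(600, 4))    # 1920.0 Wh harvested per day
print(runtime_hours(2000, 400))   # 5.0 hours of engine-off cooling
```

The battery buffer is what decouples cooling from both sunshine and the engine, which is the system's key selling point for delivery trucks.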
The solar AC was installed on a Coca-Cola delivery truck for a series of tests and proved to work on the road. PolyU and its partners plan to explore further use of the system in Hong Kong.
"We look forward to having more fruitful collaboration with Green Power Industrial Ltd and Swire Coca-Cola Hong Kong to build a low-carbon city. Together, we can jointly make a contribution for sustainable development of our community," said PolyU President Professor Timothy W. Tong.

A High-Yield Biomass Alternative to Petroleum for Industrial Chemicals


A team of University of Massachusetts Amherst chemical engineers report in the November 25 issue of Science that they have developed a way to produce high-volume chemical feedstocks including benzene, toluene, xylenes and olefins from pyrolytic bio-oils, the cheapest liquid fuels available today derived from biomass. The new process could reduce or eliminate industry's reliance on fossil fuels to make industrial chemicals worth an estimated $400 billion annually.
Instead of buying petroleum by the barrel, chemical manufacturers will now be able to use cheaper, widely available pyrolysis oils made from waste wood, agricultural waste and non-food energy crops to produce the same high-value materials for making everything from solvents and detergents to plastics and fibers.
As principal investigator George Huber, associate professor of chemical engineering at UMass Amherst, explains, "Thanks to this breakthrough, we can meet the need to make commodity chemical feedstocks entirely through processing pyrolysis oils. We are making the same molecules from biomass that are currently being produced from petroleum, with no infrastructure changes required."
He adds, "We think this technology will provide a big boost to the economy because pyrolysis oils are commercially available now. The major difference between our approach and the current method is the feedstock; our process uses a renewable feedstock, that is, plant biomass. Rather than purchasing petroleum to make these chemicals, we use pyrolysis oils made from non-food agricultural crops and woody biomass grown domestically. This will also provide United States farmers and landowners a large additional revenue stream."
In the past, these compounds were made in a low-yield process, the chemical engineer adds. "But here we show how to achieve three times higher yields of chemicals from pyrolysis oil than ever achieved before. We've essentially provided a roadmap for converting low-value pyrolysis oils into products with a higher value than transportation fuels."
In the paper, he and doctoral students Tushar Vispute, Aimaro Sanna and Huiyan Zhang show how to make olefins such as ethylene and propylene, the building blocks of many plastics and resins, plus aromatics such as benzene, toluene and xylenes found in dyes, plastics and polyurethane, from biomass-based pyrolysis oils. They use a two-step, integrated catalytic approach starting with a "tunable," variable-reaction hydrogenation stage followed by a second, zeolite catalytic step. The zeolite catalyst has the proper pore structure and active sites to convert biomass-based molecules into aromatic hydrocarbons and olefins.
Huber, Vispute and colleagues discuss how to choose among three options including low- and high-temperature hydrogenation steps as well as the zeolite conversion for optimal results. Their findings indicate that "the olefin-to-aromatic ratio and the types of olefins and aromatics produced can be adjusted according to market demand." That is, using the new techniques, chemical producers can manage the carbon content from biomass they need, as well as hydrogen amounts. Huber and colleagues provide economic calculations for determining the optimal mix of hydrogen and pyrolytic oils, depending on market prices, to yield the highest-grade product at the lowest cost.
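The "adjust according to market demand" idea amounts to picking the process route whose product slate earns the most at current prices. A toy sketch of that choice; the yields and prices below are entirely hypothetical and do not come from the paper:

```python
def best_option(options, prices):
    """Pick the process option with the highest product revenue per tonne of feed.
    options: {name: {product: yield_fraction}}; prices: {product: $/tonne}."""
    def revenue(yields):
        return sum(frac * prices[p] for p, frac in yields.items())
    return max(options, key=lambda name: revenue(options[name]))

# Hypothetical product slates for two hydrogenation settings
options = {
    "low-T hydrogenation":  {"olefins": 0.30, "aromatics": 0.10},
    "high-T hydrogenation": {"olefins": 0.15, "aromatics": 0.30},
}
prices = {"olefins": 1300, "aromatics": 900}  # hypothetical $/tonne
print(best_option(options, prices))  # low-T hydrogenation
```

A real version of this calculation would also price the hydrogen consumed in the first stage, which is the trade-off Huber's economic analysis explores.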
A pilot plant on the UMass Amherst campus is now producing these chemicals on a liter-quantity scale using this new method. The technology has been licensed to Anellotech Corp., co-founded by Huber and David Sudolsky of New York City. Anellotech is also developing UMass Amherst technology invented by the Huber research team to convert solid biomass directly into chemicals. Thus, pyrolysis oil represents a second renewable feedstock for Anellotech.
Sudolsky, Anellotech's CEO, says, "There are several companies developing technology to produce pyrolysis oil from biomass. The problem has been that pyrolysis oils must be upgraded to be useable. But with the new UMass Amherst process, Anellotech can now convert these pyrolysis oils into valuable chemicals at higher efficiency and with very attractive economics. This is very exciting."

Scientists Crack Materials Mystery in Vanadium Dioxide


Theoretical research at Oak Ridge National Laboratory can help explain experimental results in vanadium dioxide, such as the formation of thin conductive channels (seen in white) that can appear under strain in a nanoscale vanadium dioxide sample. 
A systematic study of phase changes in vanadium dioxide has solved a mystery that has puzzled scientists for decades, according to researchers at the Department of Energy's Oak Ridge National Laboratory.
Scientists have known that vanadium dioxide exhibits several competing phases when it acts as an insulator at lower temperatures. However, the exact nature of the phase behavior has not been understood since research began on vanadium dioxide in the early 1960s.
Alexander Tselev, a research associate from the University of Tennessee-Knoxville working with ORNL's Center for Nanophase Materials Sciences, in collaboration with Igor Luk'yanchuk from the University of Picardy in France, used condensed matter physics theory to explain the observed phase behaviors of vanadium dioxide, a material of significant technological interest for optics and electronics.
"We discovered that the competition between several phases is purely driven by the lattice symmetry," Tselev said. "We figured out that the metallic phase lattice of vanadium oxide can 'fold' in different ways while cooling, so what people observed was different types of its folding."
Vanadium dioxide is best known in the materials world for its speedy and abrupt phase transition that essentially transforms the material from a metal to an insulator. The phase change takes place at about 68 degrees Celsius.
"These features of electrical conductivity make vanadium dioxide an excellent candidate for numerous applications in optical, electronic and optoelectronic devices," Tselev said.
Devices that might take advantage of the unusual properties of VO2 include lasers, motion detectors and pressure detectors, which could benefit from the increased sensitivity provided by the property changes of vanadium dioxide. The material is already used in technologies such as infrared sensors.
Researchers said their theoretical work could help guide future experimental research in vanadium dioxide and ultimately aid the development of new technologies based on VO2.
"In physics, you always want to understand how the material ticks," said Sergei Kalinin, a senior scientist at the CNMS. "The thermodynamic theory will allow you to predict how the material will behave in different external conditions."
The results were published in the American Chemical Society's Nano Letters. The research team also included Ilia Ivanov, John Budai and Jonathan Tischler at ORNL and Evgheni Strelcov and Andrei Kolmakov at Southern Illinois University.
The team's theoretical research expands upon previous experimental ORNL studies with microwave imaging that demonstrated how strain and changes of crystal lattice symmetry can produce thin conductive wires in nanoscale vanadium dioxide samples.
This research was supported in part by the Department of Energy's Office of Science and by the National Science Foundation.

Optimizing Large Wind Farms


Instantaneous streamwise velocity magnitudes on three perpendicular planes across a wind turbine array boundary layer, obtained from computer simulation. The dark semicircles denote the positions of the wind turbines and the blue regions behind them denote the meandering wakes. Such simulations have been used to develop a model for wind farm roughness length, from which optimal wind turbine spacings can be deduced. Optimal spacing is found to be about 15 diameters.
Wind farms around the world are large and getting larger. Arranging thousands of wind turbines across many miles of land requires new tools that can balance cost and efficiency to provide the most energy for the buck.
Charles Meneveau, who studies fluid dynamics at Johns Hopkins University, and his collaborator Johan Meyers from Leuven University in Belgium have developed a model to calculate the optimal spacing of turbines for the very large wind farms of the future. They presented their work November 23 at the American Physical Society Division of Fluid Dynamics (DFD) meeting in Long Beach, CA.
"The optimal spacing between individual wind turbines is actually a little farther apart than what people use these days," said Meneveau.
The blades of a turbine distort wind, creating eddies of turbulence that can affect other wind turbines farther downwind. Most previous studies have used computer models to calculate the wake effect of one individual turbine on another.
Starting with large-scale computer simulations and small-scale experiments in a wind tunnel, Meneveau's model considers the cumulative effects of hundreds or thousands of turbines interacting with the atmosphere.
"There's relatively little knowledge about what happens when you put lots of these together," said Meneveau.
The energy a large wind farm can produce, he and his coworkers discovered, depends less on horizontal winds and more on entraining strong winds from higher in the atmosphere. A 100-meter turbine in a large wind farm must harness energy drawn from the atmospheric boundary layer thousands of feet up.
In the right configuration, lots of turbines essentially change the roughness of the land -- much in the same way that trees do -- and create turbulence. Turbulence, in this case, isn't a bad thing. It mixes the air and helps to pull down kinetic energy from above.
Using 5-megawatt machines as an example, along with some reasonable economic figures, Meneveau calculates that the optimal spacing between turbines should be about 15 rotor diameters instead of the currently prevalent figure of 7 rotor diameters.
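The trade-off behind that figure can be sketched with a toy calculation. Everything below -- the saturating wake-recovery curve, the quadratic land-cost term, and all the constants -- is a hypothetical illustration chosen only to reproduce the qualitative behavior, not the actual Meneveau-Meyers model:

```python
import math

def net_value(s, p_rated=5.0, s_wake=7.0, land_cost=0.003):
    """Illustrative net value per turbine at spacing s (in rotor diameters).

    Power recovery saturates as turbines move out of each other's wakes,
    while land-related cost grows with the area each turbine occupies.
    All functional forms and constants here are hypothetical.
    """
    power = p_rated * (1.0 - math.exp(-s / s_wake))  # MW recovered from the wind
    cost = land_cost * s ** 2                        # land cost per turbine
    return power - cost

# Scan candidate spacings from 5 to 25 rotor diameters and pick the best.
spacings = [5.0 + 0.1 * i for i in range(201)]
best = max(spacings, key=net_value)
# With these toy constants the optimum lands near 15 diameters,
# well beyond the traditional 7.
```

The point of the sketch is only that wake losses push the optimum outward until land costs push back; the real model derives the wake term from wind farm roughness-length physics rather than an assumed curve.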

Carbon Emission Reduction Strategies May Undermine Tropical Biodiversity Conservation, Conservationists Warn

Conservationists have warned that carbon emission reduction strategies such as REDD may undermine, not enhance, long-term prospects for biodiversity conservation in the tropics.
Their warning comes only days ahead of the Cancun COP 16 climate change talks (Nov. 29 to Dec. 10, 2010).
REDD is a United Nations designed mechanism for carbon emission trading that provides financial compensation to developing countries for improved management and protection of their forest resources. If it works, REDD could strengthen the global fight against climate change, and create an opportunity for carbon-rich tropical countries to protect threatened biodiversity as a co-benefit of maintaining forests and the carbon they store.
Writing in the journal Carbon Balance and Management, a network of conservation scientists, including University of Kent's Dr Matthew Struebig, use data for Indonesia, a species-rich tropical country and the world's third largest source of carbon emissions, to highlight ways in which emission reduction strategies could turn sour for wildlife.
Lead author Dr Gary Paoli of Daemeter Consulting in Indonesia explained: 'Biodiversity and forest carbon are correlated at a global scale but we show that this is not the case at sub-national levels in Indonesia. This creates a trade-off between the emission reduction potential and biodiversity value of different ecosystems. In short, the highest carbon savings are not necessarily located in places with the highest levels of species diversity.'
The authors, from Southeast Asia, Europe and the USA, compiled studies of wildlife, plants, land-cover and carbon emissions to show that carbon-dense peat swamp forests, focal ecosystems for REDD in Indonesia, do not coincide with areas supporting the highest concentrations of threatened biodiversity.
Dr Struebig, who works between Kent's Durrell Institute of Conservation and Ecology (DICE) and Queen Mary, University of London, said: 'Peat swamp forests attract the bulk of REDD funds -- they hold around 8 times more carbon than other lowland forests, and provide habitat for high profile species such as orang-utan, tigers and Asian elephants. However, when we look at overall numbers of plants, mammals and birds, especially species of greatest conservation concern, we find that peat forests typically support lower densities and fewer species than other lowland forest types.'
The paper points out that preferential targeting of peatland under REDD could intensify pressures to establish oil palm and paper/pulp plantations in forests that are more important for biodiversity conservation. This problem is not unique to Indonesia, but is a concern throughout the tropics. The authors argue that a regulatory framework is urgently needed to guide implementation of REDD, and recommend three ways to ensure that effective carbon emissions reduction strategies also deliver substantial long-term biodiversity co-benefits in tropical countries -- home to 51% of the world's 48,170 threatened species.
The authors urge developing countries to prepare their own explicit national targets for ecosystem and species protection across all native ecosystem types. Using these targets, priority ecosystems and threatened species under-represented in the protected area network should be identified. Co-financing from REDD can then be mobilised to redefine acceptable land-use practices within priority areas needed to fill biodiversity conservation gaps. In this way, REDD will offset opportunity costs of foregone development, and ensure that carbon emission reductions deliver biodiversity gains where they are most needed.
Gary Paoli added: 'If such a national planning process was made a pre-requisite for REDD funding, and payments linked to delivery of biodiversity co-benefits, then net positive impacts on biodiversity would be ensured.'
Co-author Dr Erik Meijaard of People and Nature Consulting International commented: 'A target-based approach also respects the sovereignty of countries to prepare their own targets, and fulfils objectives of the Convention on Biological Diversity, both for recipient (tropical) countries and donor (developed) nations who are signatories to the convention.'
The authors note that much of the groundwork for their recommendations has already been set, but support from national governments and the United Nations will prove critical to the success of REDD and its biodiversity outcomes.

Physicists Create New Source of Light: Bose-Einstein Condensate 'Super-Photons'


The creators of the "super-photon" are Julian Schmitt (left), Jan Klaers, Dr. Frank Vewinger and professor Dr. Martin Weitz (right).
Physicists from the University of Bonn have developed a completely new source of light: a so-called Bose-Einstein condensate consisting of photons. Until recently, experts had thought this impossible. The method may make it possible to design novel light sources resembling lasers that work in the X-ray range. Among other applications, these might allow building more powerful computer chips.
The scientists are reporting on their discovery in the upcoming issue of the journal Nature.
When rubidium atoms are cooled deeply and a sufficient number of them are concentrated in a compact space, they suddenly become indistinguishable: they behave like a single huge "super particle." Physicists call this a Bose-Einstein condensate.
For "light particles," or photons, this should also work. Unfortunately, this idea faces a fundamental problem. When photons are "cooled down," they disappear. Until a few months ago, it seemed impossible to cool light while concentrating it at the same time. The Bonn physicists Jan Klärs, Julian Schmitt, Dr. Frank Vewinger, and Professor Dr. Martin Weitz have, however, succeeded in doing this -- a minor sensation.
How warm is light?

When the tungsten filament of a light bulb is heated, it starts glowing -- first red, then yellow, and finally bluish. Thus, each color of light can be assigned a "formation temperature." Blue light is warmer than red light, but tungsten glows differently than iron, for example. This is why physicists calibrate color temperature against a theoretical model object, a so-called black body. If this body were heated to 5,500 degrees Celsius, it would have about the same color as sunlight at noon. In other words, noon light has a temperature of 5,500 degrees Celsius, or not quite 5,800 kelvin. (The Kelvin scale has no negative values; it starts at absolute zero, -273 degrees Celsius, so Kelvin values are always 273 degrees higher than the corresponding Celsius values.)
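The arithmetic in that passage is easy to check, and Wien's displacement law connects the temperature back to a color. A small sketch (the Wien constant is the standard value; the gloss that the peak sits in the green part of the spectrum is ours):

```python
# Check the article's Celsius-to-Kelvin arithmetic and find the peak
# emission wavelength of a black body at that temperature using
# Wien's displacement law: lambda_max = b / T.

WIEN_B = 2.898e-3           # Wien's displacement constant, metre-kelvin

celsius = 5500.0
kelvin = celsius + 273.15   # the Kelvin scale starts at absolute zero (-273.15 C)
peak_wavelength_nm = WIEN_B / kelvin * 1e9

print(round(kelvin))              # 5773 -- "not quite 5,800 kelvin"
print(round(peak_wavelength_nm))  # 502 -- peak emission in the green, visible range
```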
When a black body is cooled down, it will at some point no longer radiate in the visible range; instead, it will only give off invisible infrared photons. At the same time, its radiation intensity will decrease. The number of photons becomes smaller as the temperature falls. This is what makes it so difficult to get the quantity of cool photons that is required for Bose-Einstein condensation to occur.
And yet, the Bonn researchers succeeded by using two highly reflective mirrors between which they kept bouncing a light beam back and forth. Between the reflective surfaces there were dissolved pigment molecules with which the photons collided periodically. In these collisions, the molecules 'swallowed' the photons and then 'spit' them out again. "During this process, the photons assumed the temperature of the fluid," explained Professor Weitz. "They cooled each other off to room temperature this way, and they did it without getting lost in the process."
A condensate made of light

The Bonn physicists then increased the quantity of photons between the mirrors by exciting the pigment solution using a laser. This allowed them to concentrate the cooled-off light particles so strongly that they condensed into a "super-photon."
This photonic Bose-Einstein condensate is a completely new source of light with characteristics resembling those of lasers. But compared to lasers, it has a decisive advantage: "We are currently not capable of producing lasers that generate very short-wave light -- i.e. in the UV or X-ray range," explained Jan Klärs. "With a photonic Bose-Einstein condensate this should, however, be possible."
This prospect should primarily please chip designers. They use laser light for etching logic circuits into their semiconductor materials. How fine these structures can be is limited by the wavelength of the light, among other factors. Long-wavelength lasers are less well suited to precision work than short-wavelength ones -- it is as if you tried to sign a letter with a paintbrush.
X-ray radiation has a much shorter wavelength than visible light. In principle, X-ray lasers should thus make it possible to put far more complex circuits on the same silicon surface. This would enable a new generation of high-performance chips -- and consequently, more powerful computers for end users. The process could also be useful in other applications such as spectroscopy or photovoltaics.

Sunday, November 28, 2010

Taming Thermonuclear Plasma With a Snowflake


This is a "snowflake" divertor -- a novel plasma-material interface is realized in the National Spherical Torus Experiment.
Physicists working on the National Spherical Torus Experiment (NSTX) at the Princeton Plasma Physics Laboratory are now one step closer to solving one of the grand challenges of magnetic fusion research -- how to reduce the effect that the hot plasma has on fusion machine walls (or how to tame the plasma-material interface).
Some heat from the hot plasma core of a fusion energy device escapes the plasma and can interact with reactor vessel walls. This not only erodes the walls and other components, but also contaminates the plasma -- all challenges for practical fusion. One method to protect machine walls involves divertors, chambers outside the plasma into which the plasma heat exhaust (and impurities) flow. A new divertor concept, called the "snowflake," has been shown to significantly reduce the interaction between hot plasma and the cold walls surrounding it.
Strong magnetic fields shape the hot plasma into the form of a donut in a magnetic fusion reactor called a tokamak. As confined plasma particles move along magnetic field lines inside the tokamak, some particles and heat escape because of instabilities in the plasma. Surrounding the hot plasma is a colder plasma layer, the scrape-off layer, which forms the plasma-material interface. In this layer, escaped particles and heat flow along "open" magnetic field lines to a separate part of the vessel and enter a "divertor chamber." If the plasma striking the divertor surface is too hot, melting of the plasma-facing components and loss of coolant can occur. Under such undesirable conditions, the lifetime of the plasma-facing components would also be an issue, as they would tend to wear out too quickly.
While the conventional magnetic X-point divertor concept has existed for three decades, a very recent theoretical idea and supporting calculations by Dr. D.D. Ryutov from Lawrence Livermore National Laboratory have indicated that a novel magnetic divertor -- the "snowflake divertor" -- would have much improved heat handling characteristics for the plasma-material interface. The name is derived from the appearance of magnetic field lines forming this novel magnetic interface.
This magnetic configuration was recently realized in NSTX and fully confirmed the theoretical predictions. The snowflake divertor configuration was created by using only two or three existing magnetic coils. This achievement is an important result for future tokamak reactors that will operate with few magnetic coils. Because the snowflake divertor configuration flares the scrape-off layer at the divertor surface, the peak heat load is considerably reduced, as was confirmed by the divertor heat flux on NSTX. The plasma in the snowflake divertor, instead of heating the divertor surface on impact, radiated the heat away, cooled down and did not erode the plasma-facing components as much, thus extending their lifetime. Plasma TV images show more divertor radiation in the snowflake divertor plasmas in comparison with the standard plasmas. Importantly, the snowflake divertor did not have an impact on the high performance and confinement of the high-temperature core plasma, and even reduced the impurity contamination level of the main plasma.
These highly encouraging results provide further support for the snowflake divertor as a viable plasma-material interface for future tokamak devices and for fusion development applications.
Researchers are presenting their work at the 52nd annual meeting of the American Physical Society's Division of Plasma Physics, being held in Chicago Nov. 8-12.

Graphene Gets a Teflon Makeover


Graphane crystal. This novel two-dimensional material is obtained from graphene (a monolayer of carbon atoms) by attaching hydrogen atoms (red) to each carbon atom (blue) in the crystal.
University of Manchester scientists are among the first to make a new material which could replace or compete with Teflon in thousands of everyday applications.
Professor Andre Geim, who along with his colleague Professor Kostya Novoselov won the 2010 Nobel Prize for graphene -- the world's thinnest material -- has now modified it to make fluorographene, a one-molecule-thick material chemically similar to Teflon.
Fluorographene is fully-fluorinated graphene and is basically a two-dimensional version of Teflon, showing similar properties including chemical inertness and thermal stability.
The results are reported in the advanced online issue of the journal Small. The work is a large international effort and involved research groups from China, the Netherlands, Poland and Russia.
The team hopes that fluorographene -- a flat, crystalline version of Teflon that is mechanically as strong as graphene -- could be used as a thinner, lighter replacement for Teflon, and could also find applications in electronics, such as in new types of LED devices.
Graphene, a one-atom-thick material that demonstrates a huge range of unusual and unique properties, has been at the centre of attention since groundbreaking research carried out at The University of Manchester six years ago.
Its potential is almost endless -- from ultrafast transistors just one atom thick, to sensors that can detect a single molecule of a toxic gas, to replacing the carbon fibres in high-performance materials used to build aircraft.
Professor Geim and his team have exploited a new perspective on graphene by considering it as a gigantic molecule that, like any other molecule, can be modified in chemical reactions.
Teflon is a fully-fluorinated chain of carbon atoms. These long molecules bound together make the polymer material that is used in a variety of applications including non-sticky cooking pans.
The Manchester team managed to attach fluorine to each carbon atom of graphene.
To get fluorographene, the Manchester researchers first obtained graphene as individual crystals and then fluorinated it by using atomic fluorine.
To demonstrate that it is possible to obtain fluorographene in industrial quantities, the researchers also fluorinated graphene powder and obtained fluorographene paper.
Fluorographene turned out to be a high-quality insulator which does not react with other chemicals and can sustain high temperatures even in air.
One of the most intense directions in graphene research has been to open a gap in graphene's electronic spectrum, that is, to make a semiconductor out of metallic graphene. This should allow many applications in electronics. Fluorographene is found to be a wide gap semiconductor and is optically transparent for visible light, unlike graphene that is a semimetal.
Professor Geim said: "Electronic quality of fluorographene has to be improved before speaking about applications in electronics but other applications are there up for grabs."
Rahul Nair, who led this research for the last two years and is a PhD student working with Professor Geim, added: "Properties of fluorographene are remarkably similar to those of Teflon but this is not a plastic.
"It is essentially a perfect one-molecule-thick crystal and, similar to its parent, fluorographene is also mechanically strong. This makes a big difference for possible applications.
"We plan to use fluorographene an ultra-thin tunnel barrier for development of light-emitting devices and diodes.
"More mundane uses can be everywhere Teflon is currently used, as an ultra-thin protective coating, or as a filler for composite materials if one needs to retain the mechanical strength of graphene but avoid any electrical conductivity or optical opacity of a composite."
Industrial scale production of fluorographene is not seen as a problem as it would involve following the same steps as mass production of graphene.
The Manchester researchers believe that the next important step is to make proof-of-concept devices and demonstrate various applications of fluorographene.
Professor Geim added: "There is no point in using it just as a substitute for Teflon. The mix of the incredible properties of graphene and Teflon is so inviting that you do not need to stretch your imagination to think of applications for the two-dimensional Teflon. The challenge is to exploit this uniqueness."

Easy Fabrication of Non-Reflecting and Self-Cleaning Silicon and Plastic Surfaces


Scanning electron microscopy images of the nanostructured silicon surface made by the maskless plasma etching method; the elastomeric stamp replicated from the silicon surface and the nanostructure replicated to two different polymers using the elastomeric stamp. Both the original silicon surface and the replicated polymer surfaces are non-reflecting and self-cleaning.
The Microfabrication group of Aalto University, which specializes in microfabrication and microfluidics, has developed a new and rapid method for fabricating non-reflecting and self-cleaning surfaces. The surface properties are based on nanostructuring. The research results were just published in the journal Advanced Materials.
The most laborious part of the fabrication process was eliminated when the Aalto University Microfabrication group developed a novel maskless method for fabricating pyramid-shaped nanostructures on a silicon surface using deep reactive ion etching. The nanostructured silicon wafer can then be used as a template to create an elastomeric stamp, which in turn can replicate the original non-reflective and self-cleaning nanostructure in different polymers.
Smooth silicon surfaces are mirror-like and reflect more than 50 percent of incoming light, while the nanostructured silicon and polymeric surfaces are almost completely non-reflecting. "The reflectance is reduced over a broad wavelength range due to the smooth refractive-index transition from air to substrate created by the nanostructures," says Lauri Sainiemi from the Microfabrication group.
Non-reflecting surfaces and their fabrication methods are hot research topics because they are needed to realize more efficient solar cells. Similar nanostructured silicon and polymeric surfaces can also be used in chemical analysis, where low reflectance is required. The second beneficial property of the surfaces, self-cleaning, is achieved by coating the nanostructures with a thin, low-surface-energy film.
The applications of the developed nanofabrication methods for silicon and polymers range from sensors to solar cells. The biggest strength of the fabrication methods is their scalability and suitability for large-scale industrial manufacturing. "I believe that there is interest because our fabrication methods enable simple and low-cost manufacturing of nanostructures over large areas, and the methods are compatible with single-crystalline, polycrystalline and amorphous silicon as well as a wide variety of different polymers," concludes Sainiemi.
The group has already developed surfaces for the chemical analysis of drugs in collaboration with other research groups, and that research will continue in the future. An interesting new field is the development of more effective self-cleaning and dirt-repellent surfaces, which would especially benefit solar cell research. The fabrication of water-repellent surfaces is fairly straightforward, but liquids with low surface tension can still contaminate the surface. "At the moment we are developing novel surfaces that also repel oily liquids," says Sainiemi.

Dangerous Chemicals in Food Wrappers Likely Migrating to Humans


Popcorn popped in a microwave. PAPs are applied as greaseproofing agents to paper food contact packaging such as fast food wrappers and microwave popcorn bags.
University of Toronto scientists have found that chemicals used to line junk food wrappers and microwave popcorn bags are migrating into food and being ingested by people, contributing to the chemical contamination observed in human blood.
Perfluorinated carboxylic acids or PFCAs are the breakdown products of chemicals used to make non-stick and water- and stain-repellent products ranging from kitchen pans to clothing to food packaging. PFCAs, the best known of which is perfluorooctanoic acid (PFOA), are found in humans all around the world.
"We suspected that a major source of human PFCA exposure may be the consumption and metabolism of polyfluoroalkyl phosphate esters or PAPs," says Jessica D'eon, a graduate student in the University of Toronto's Department of Chemistry. "PAPs are applied as greaseproofing agents to paper food contact packaging such as fast food wrappers and microwave popcorn bags."
In the U of T study, rats were exposed to PAPs either orally or by injection and monitored for a three-week period to track the concentrations of the PAPs and PFCA metabolites, including PFOA, in their blood. Human exposure to PAPs had already been established by the scientists in a previous study. Researchers used the PAP concentrations previously observed in human blood together with the PAP and PFCA concentrations observed in the rats to calculate human PFOA exposure from PAP metabolism.
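The logic of inferring human exposure from the rat data can be sketched as a simple proportional scaling. The numbers and the linear carry-over assumption below are hypothetical placeholders, not the study's actual measurements or pharmacokinetic model:

```python
def estimate_pfoa_exposure(pap_blood_human, pap_blood_rat, pfoa_blood_rat):
    """Infer human PFOA from PAP metabolism by simple proportional scaling.

    Assumes the PFOA-to-PAP ratio observed in rat blood after dosing
    carries over linearly to humans -- a hypothetical simplification of
    the study's actual pharmacokinetic calculation. Units: ng/mL.
    """
    yield_ratio = pfoa_blood_rat / pap_blood_rat   # PFOA formed per unit PAP
    return pap_blood_human * yield_ratio           # inferred human PFOA, ng/mL

# Purely illustrative numbers, not from the study:
estimate = estimate_pfoa_exposure(pap_blood_human=0.5,
                                  pap_blood_rat=100.0,
                                  pfoa_blood_rat=20.0)
print(estimate)  # 0.1 ng/mL of PFOA attributable to PAP metabolism
```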
"We found the concentrations of PFOA from PAP metabolism to be significant and concluded that the metabolism of PAPs could be a major source of human exposure to PFOA, as well as other PFCAs," says Scott Mabury, the lead researcher and a professor in the Department of Chemistry at the University of Toronto.
"This discovery is important because we would like to control human chemical exposure, but this is only possible if we understand the source of this exposure. In addition, some try to locate the blame for human exposure on environmental contamination that resulted from past chemical use rather than the chemicals that are currently in production.
"In this study we clearly demonstrate that the current use of PAPs in food contact applications does result in human exposure to PFCAs, including PFOA. We cannot tell whether PAPs are the sole source of human PFOA exposure or even the most important, but we can say unequivocally that PAPs are a source and the evidence from this study suggests this could be significant."
Regulatory interest in human exposure to PAPs has been growing. Governments in Canada, the United States and Europe have signaled their intentions to begin extensive and longer-term monitoring programs for these chemicals. The results of this investigation provide valuable additional information to such regulatory bodies to inform policy regarding the use of PAPs in food contact applications.
The study was conducted by Jessica D'eon and Scott Mabury of the University of Toronto's Department of Chemistry and is published November 8 in Environmental Health Perspectives. Research was funded by the Natural Sciences and Engineering Research Council of Canada.

Water Purification: Is Colloidal Silver Necessary for Bacteria Removal?

Nicole Heinley, a graduate student at Missouri University of Science and Technology, traveled to Guatemala twice in the past year to conduct research on ceramic pot filters that are used locally to remove bacteria from water. Now, Heinley's findings are about to be published in the Journal of Water Science and Technology.
Ceramic pot filters, which are made out of sawdust and clay, have been around in poor countries for hundreds of years. The focus of Heinley's research is on the colloidal silver -- or lack of it -- that is typically used to line the filters. The silver mixture is thought to have disinfection properties -- but the actual disinfection mechanism of the silver is poorly understood.
Heinley wanted to find out if the colloidal silver, which is the most expensive part of the filters, is necessary at all. "It's the only material that has to be imported to manufacture the filters," she says. "The remaining materials -- sawdust and clay -- are available locally."
In the journal article, Heinley and Dr. Curt Elmore, associate professor of geological engineering at Missouri S&T, conclude that the silver may not be necessary to effectively remove bacteria from source water. In their study, filters not lined with silver still removed E. coli at a high rate.
"Additional, long-term studies of filters without silver should be undertaken in order to further investigate the issue," Heinley says.
Heinley and Elmore traveled to Guatemala with students from a geological engineering class during winter break and spring break earlier this year. Heinley collected contaminated water samples from a small river in the city of Antigua and studied the structure of the ceramic pot filters available locally. Back at Missouri S&T, she continued the research.

On the Way to CO2-Free Power Plants


In the experimental plant, scientists at TU Darmstadt will explore two novel processes for CO2 capture. 
The Technische Universität Darmstadt has dedicated a pilot plant for capturing carbon dioxide contained in flue gases of power plants. Its Institute for Energy Systems and Technology plans to utilize the plant for investigating two innovative methods for CO2 capture that require less energy and lower operating costs than earlier approaches.
Combustion of fossil fuels, such as coal, fuel oil, or natural gas, liberates large quantities of carbon dioxide, a gas that significantly affects the global climate. A key technology for reducing emissions and making power plants more environmentally friendly is the capture and storage of carbon dioxide from power plant flue gases, known as carbon capture and storage (CCS). CCS could reduce the CO2 emissions from fossil-fuel power generation and other industrial uses to near zero, and thereby contribute to reducing greenhouse-gas emissions. Earlier approaches to CO2 capture, however, expend significantly more energy and entail greatly increased operating costs, which raises questions about their efficiency and acceptance. The TU Darmstadt Institute for Energy Systems and Technology's new pilot plant will be used to investigate two new methods for CO2 capture that promise nearly total elimination of CO2 emissions with virtually no additional energy input and only slight increases in operating costs.
Over the next two years, the institute's director, Prof. Dr.-Ing. Bernd Epple, and his 26 coworkers will be investigating the "carbonate looping" and "chemical looping" methods for CO2 capture. Both methods employ natural substances and reduce the energy presently required for CO2 capture by more than half. As Epple put it, "These methods represent milestones on the way to CO2-free power plants. They might allow coal-fired, oil-fired, and natural-gas-fired power plants to reliably and cost-effectively generate power without polluting the environment."
The carbonate looping method uses naturally occurring limestone to bind CO2 from the power plant's flue-gas stream in a first reactor. The pure CO2 is then released again in a second reactor, where it can be captured for storage. A key advantage of the method is that even existing power plants can be retrofitted with it.
On new power plants, the chemical looping method will even allow capturing CO2 with hardly any loss of energy efficiency. In this method, a two-stage, flameless combustion yields an exhaust stream containing only CO2 and water vapor, from which the CO2 can be captured and stored.
The investigations of these new methods are being supported with grants totaling seven million euros from the European Union, the German Federal Ministry for Economic Affairs, and various industrial partners. Because of the pilot plant's height, TU Darmstadt has built a new, twenty-meter-high experimentation hall on its "Lichtwiese" campus to house it. Construction of the hall and pilot plant took twenty months, and the plant has already demonstrated its ability to bind CO2 in initial trial runs.

Rice Hulls a Sustainable Drainage Option for Greenhouse Growers


Roberto Lopez, from left, and Chris Currey have shown that rice hulls (at right) can be used as a substitute for perlite in growing media without affecting plant growth regulators.
Greenhouse plant growers can substitute rice hulls for perlite in their media without the need for an increase in growth regulators, according to a Purdue University study.
Growing media for ornamental plants often consist of a soilless mix of peat and perlite, a processed mineral used to increase drainage. Growers also regularly apply plant-growth regulators to ensure consistent, market-desired plant characteristics such as height. Organic substitutes for perlite, such as tree bark, have proven problematic because they absorb the plant-growth regulators and keep them from reaching the plants; using bark requires a 25 percent increase in the volume of growth regulators applied.
"We were not sure whether rice hulls, as an organic component, would hold up the growth regulator," said Roberto Lopez, a Purdue assistant professor of horticulture and co-author of a HortTechnology paper that outlined the findings. "Testing showed that there were no differences in plants grown with rice hulls or perlite."
Pansies and calibrachoa were planted in two 80-20 mixes, one of peat and perlite and the other of peat and rice hulls, and then treated with several different growth regulators. Whether treated with growth regulators or not, plants grown in the two mixes had similar heights and stem lengths.
Finding a waste product to replace perlite could reduce the price of growing media since perlite must be mined and heat processed.
"It's a really energy-intensive process and, because it's a mineral, it's non-renewable," said Chris Currey, a horticulture graduate student and co-author of the HortTechnology paper.
Rice hulls are an attractive option, Lopez said, because they can be easily transported on barges and rice growers in the South could increase profits by selling a traditional waste product.
"Often these rice hulls were being burnt because there's not a lot of other use for them," Lopez said.
Syngenta and Fine Americas funded the research. Lopez and Currey collaborated with Purdue research technician Diane Camberato and graduate student Ariana Torres.

'Super-Hero' Material Stretched Into a Possible Electronics Revolution


Cornell researchers made a thin film of europium titanate ferromagnetic and ferroelectric by "stretching" it. They did it by depositing the material on an underlying substrate with a larger spacing between its atoms.
It's the Clark Kent of oxide compounds, and - on its own - it is pretty boring. But slice europium titanate nanometers thin and physically stretch it, and it takes on superhero-like properties that could revolutionize electronics, according to new Cornell research.
Researchers report that thin films of europium titanate become both ferroelectric (electrically polarized) and ferromagnetic (exhibiting a permanent magnetic field) when stretched across a substrate of dysprosium scandate, another type of oxide. The best simultaneously ferroelectric, ferromagnetic material to date pales in comparison by a factor of 1,000.
Simultaneous ferroelectricity and ferromagnetism is rare in nature and coveted by electronics visionaries. A material with this magical combination could form the basis for low-power, highly sensitive magnetic memory, magnetic sensors or highly tunable microwave devices.
The search for ferromagnetic ferroelectrics dates back to 1966, when the first such compound - a nickel boracite - was discovered. Since then, scientists have found a few additional ferromagnetic ferroelectrics, but none stronger than the nickel compound - that is, until now.
"Previous researchers were searching directly for a ferromagnetic ferroelectric - an extremely rare form of matter," said Darrell Schlom, Cornell professor of materials science and engineering, and an author on the paper.
"Our strategy is to use first-principles theory to look among materials that are neither ferromagnetic nor ferroelectric, of which there are many, and to identify candidates that, when squeezed or stretched, will take on these properties," said Craig Fennie, assistant professor of applied and engineering physics, and another author on the paper.
This fresh strategy, demonstrated using the europium titanate, opens the door to other ferromagnetic ferroelectrics that may work at even higher temperatures using the same materials-by-design strategy, the researchers said.
Other authors include David A. Muller, Cornell professor of applied and engineering physics; and first author June Hyuk Lee, a graduate student in Schlom's lab.
The researchers took an ultra-thin layer of the oxide and "stretched" it by placing it on top of the dysprosium compound. The crystal structure of the europium titanate became strained because of its tendency to align itself with the underlying arrangement of atoms in the substrate.
Fennie's previous theoretical work had indicated that a different kind of material strain - more akin to squishing by compression - would also produce ferromagnetism and ferroelectricity. But the team discovered that the stretched europium compound displayed electrical properties 1,000 times better than the best-known ferroelectric/ferromagnetic material thus far, translating to thicker, higher-quality films.
This new approach to ferromagnetic ferroelectrics could prove a key step toward the development of next-generation memory storage, superb magnetic field sensors and many other applications long dreamed about. But commercial devices are a long way off; no devices have yet been made using this material. The Cornell experiment was conducted at an extremely cold temperature - about 4 kelvin (roughly -452 degrees Fahrenheit). The team is already working on materials that are predicted to show such properties at much higher temperatures.
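The kelvin-to-Fahrenheit conversion quoted above is easy to verify; a minimal sketch in Python:

```python
def kelvin_to_fahrenheit(kelvin):
    """Convert a temperature from kelvin to degrees Fahrenheit."""
    return (kelvin - 273.15) * 9.0 / 5.0 + 32.0

# The Cornell experiment ran at about 4 K:
print(round(kelvin_to_fahrenheit(4.0)))  # -> -452
```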
The team includes researchers from Penn State University, Ohio State University and Argonne National Laboratory.
The research was supported by the Cornell Center for Materials Research, a National Science Foundation-funded Materials Research and Engineering Center (MRSEC), and corresponding MRSECs at Penn State and Ohio State.

Sunday, November 14, 2010

How Lead Gets Into Urban Vegetable Gardens


If you're a vegetable gardener in a lot of older cities, there's a fair chance you have a significant amount of lead in your soil. One common mitigation approach is to build a raised bed and fill it with freshly composted, low-lead soil from elsewhere, right? Maybe not, according to researchers studying the mysterious case of the lead contamination found within raised beds in community gardens in the Boston communities of Roxbury and Dorchester.
"Raised beds are surrounded by a sea of contaminated soil," said Daniel Brabander of Wellesley College. Brabander, his students and colleagues have been studying the lead in 144 backyard gardens in coordination with The Food Project, an organization committed to food security, nutrition and sustainable urban agriculture. Eighty-one percent of the gardens they studied were found to have lead levels above the U.S. EPA limits of 400 micrograms of lead per gram (µg/g) of soil.
To address that problem, The Food Project installed raised wooden beds with freshly composted soil in backyard and community gardens. But the researchers have found that raised-bed soil that started with as little as 110 micrograms of lead per gram rose to an average of 336 µg/g in just four years.
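Those figures imply a striking average accumulation rate. The sketch below uses the article's 110, 336, and 400 µg/g numbers; the assumption that lead builds up roughly linearly is purely illustrative, not a claim from the researchers:

```python
# Soil lead figures from the study (micrograms of lead per gram of soil)
initial = 110.0          # lead level in fresh raised-bed soil
after_four_years = 336.0
epa_limit = 400.0        # U.S. EPA limit for soil lead

# Average annual accumulation, assuming roughly linear build-up
# (an illustrative simplification, not part of the study)
rate = (after_four_years - initial) / 4.0
years_to_limit = (epa_limit - initial) / rate

print(f"average accumulation: {rate:.1f} ug/g per year")  # -> 56.5
print(f"years until EPA limit: {years_to_limit:.1f}")     # -> 5.1
```

On that simple reading, a fresh raised bed could cross the EPA limit in about five years, which is why the yearly soil-scooping fix described below matters.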
Just how this is happening is the focus of a Nov. 1 presentation by Emily Estes at the meeting of the Geological Society of America in Denver.
"We're trying to get a better handle on the mode of transport and the source," said Estes. That means some pretty detailed monitoring and chemical analyses of the minerals in the soils as well as the kind of lead that's in the soil.
Lead contamination in most cities comes primarily from two sources: leaded gasoline and lead paint. Although both have been banned, plenty of that lead remains in urban soils all around the raised-bed gardens. Roxbury and Dorchester soils have a lot of lead, but they are not unique.
"It's more elevated than similar neighborhoods, but not unlike other cities," said Estes.
"On the East Coast, where cities are a bit older, it's more of a problem," said Brabander. And even within a city the lead contamination can vary significantly, he said, depending on historical traffic patterns and even such very local effects like a house containing lead paint burning down on a lot that is later used for gardening.
The main suspects in transporting lead into raised beds are wind and perhaps rain, which splatters the ground and can potentially throw fine particles of contaminated soil into the raised beds.
The good news in Roxbury and Dorchester is that the kinds of lead being found are not particularly good at being absorbed by the human body, said Estes. There's also a relatively simple and inexpensive way to keep the lead out of raised beds: just scoop away the top inch or two of soil every year from a raised bed and properly dispose of it, according to local regulations.
This research is funded by a Brachman-Hoffman Fellowship that supports new scientific research directions among faculty at Wellesley College.

Unique Duality: 'Exotic' Superconductor With Metallic Surface Discovered

A new material with a split personality -- part superconductor, part metal -- has been observed by a Princeton University-led research team. The discovery may have implications for the development of next-generation electronics that could transform the way information is stored and processed.
The new material -- a crystal called a topological superconductor -- has two electronic identities at once. At very low temperatures, the interior of the crystal behaves like a normal superconductor, able to conduct electricity with zero resistance. At the same time, the surface is metallic, able to carry a current, albeit with some resistance.
This is in direct contrast to most existing materials that are classified as electronic states of matter, including metals, insulators and conventional superconductors, which are consistent in how they do, or don't, conduct electricity. For example, every single atom of every single copper wire is able to carry a current, which dissipates a bit as it travels. Similarly, all the molecules in normal superconductors conduct electricity without resistance when the material is placed at the appropriate temperature.
"The known states of electronic matter are insulators, metals, magnets, semiconductors and superconductors, and each of them has brought us new technology," said M. Zahid Hasan, an associate professor of physics at Princeton who led the research team. "Topological superconductors are superconducting everywhere but on the surface, where they are metallic; this leads to many possibilities for applications."
Hasan and his colleagues published their findings Nov. 1 in the journal Nature Physics.
According to Hasan, one of the most exciting potential uses for the material would be in energy-efficient quantum computers that would have the ability to identify errors in calculation as they occur and resist them during processing. The successful development of such machines is thought to hinge on catching and manipulating elusive particles called Majorana fermions, which were first predicted more than 70 years ago but never before observed, Hasan explained. The split electronic personality of the new superconductors with unusual surface properties, when placed in contact with a special kind of insulator, may enable scientists to coax the electrons whizzing about on the surface to become Majorana fermions, he added.
"These highly unusual superconductors are the most ideal nurseries to create and manipulate Majorana fermions, which could be used to do quantum computing in a fault-resistant way " said L. Andrew Wray, the first author of the paper, who received his doctoral degree from Princeton in 2010. "And because the particles would exist on a superconductor, it could be possible to manipulate them in low power-consumption devices that are not only 'green,' but also immune to the overheating problems that befall current silicon-based electronics."
The significant caveat is that any potential application could be several decades in development.
"Of course, it takes time to go from new physics to new technology -- usually 20 to 30 years, as was the case with semiconductors," Hasan said.
Initial find of insulators begins path to discovery

In 2007, a Hasan-led research team reported the discovery of three-dimensional topological insulators -- a strange breed of insulator with a metallic surface. While three-dimensional topological insulators may have potential for use in next-generation electronics, their properties alone are not ideal for use in quantum computers, Hasan said.
Quantum computers store and process information using the "quantum" behavior of subatomic particles -- phenomena that occur on the ultrasmall scale and are completely at odds with the world that can be seen by the naked eye, such as the ability of electrons to be in two different places at the same time. Quantum computers could one day enable the manipulation of data at speeds that far exceed today's conventional machines, which are rapidly approaching the fundamental limits of their computing capabilities.
However, efforts to create higher-performing quantum computers have been hampered by the notoriously fickle and unpredictable behavior of particles on the quantum scale.
For the past two years, Hasan and his collaborators have been tweaking the properties of a topological insulator called bismuth selenide to create a material with a metallic surface and a superconducting interior, which would have properties well suited to exploitation in the electronics of the future.
To make a superconductor with topological behavior, or unusual surface properties, Princeton chemistry professor Robert Cava and his research group invented a new kind of crystal by inserting atoms of copper into the atomic lattice structure of a semiconductor made out of the compound bismuth selenide. This process, called intercalation doping, is a method used to change the number of electrons in a material and tweak its electrical properties.
The scientists discovered that, with the right amount of doping, they were able to turn the crystal into a superconductor at very low temperatures -- below 4 kelvin, or around -452 degrees Fahrenheit. However, initial laboratory-based results suggested that the superconductor cannot retain topological properties indefinitely, though they do persist for months if the material is kept in a vacuum.
To assess the topological characteristics of the material, the researchers used a technique known as X-ray spectroscopy to bombard the crystal with X-rays and "pop" individual electrons out of the material. These electrons were then analyzed, providing a series of clues that allowed the team to determine the true nature of the crystal.
These X-ray tests discovered that the scientists had, indeed, created a topological superconductor. Furthermore, they found that the electrons on the crystal's metallic surface were not normal electrons. Rather, the surface featured rare electrons that act like mass-less, light particles. The scientists recognized the particles because the first direct observation of such electrons, called helical Dirac fermions, in three-dimensional materials was reported last year by a separate Hasan-led research team.
Theoretical work by physicist Charles Kane of the University of Pennsylvania predicts that if a topological superconductor were placed in contact with a topological insulator, some of the electrons at the interface could become long-sought Majorana fermions if the composite material were placed in a very strong magnetic field.
The particles are desirable in electronic devices because, while normal electrons have a negative charge, Majorana fermions are neutral. This charge-less nature means that they wouldn't interact with each other, nor would they be affected by the other charges on the surrounding atoms that make up the crystal.
Because the fermions would not be attracted or repelled by nearby particles and atoms, they would travel in very predictable, predetermined paths -- and this is where their true potential lies.
If the motion of multiple Majorana fermions could be predicted, then topological quantum computers that stored information in these particles could be fault-tolerant, or resistant to errors, he explained. This could be further extended to design methods that would enable the computer to "know" that it had performed a calculation wrong and correct for the error.
"There are many different types of topological superconductors and the exact identification of the current superconductors will require further experiments," Hasan added.
In addition to Hasan, Cava and Wray, who is now a postdoctoral fellow at the Advanced Light Source facility at the Lawrence Berkeley National Laboratory, Princeton scientists on the team included: graduate student Su-Yang Xu; former postdoctoral researchers Yew San Hor and Dong Qian; and Yuqi Xia, who received his doctoral degree from Princeton in 2010. Additional researchers on the team included Alexei Fedorov of the Advanced Light Source at Berkeley Lab and Hsin Lin and Arun Bansil, both of Northeastern University.
"This is an exciting result by Zahid Hasan and coworkers that builds on his previous experimental discovery of the first three-dimensional topological state of matter, the topological insulator," said Joel Moore, an associate professor of physics at University of California-Berkeley and a member of Princeton's class of 1995.
"Theorists believe that if a topological insulator can be made superconducting, the resulting state would have several remarkable properties," Moore said. "The most exotic might be the existence of a new kind of emergent particle, the Majorana fermion … . We have known for some time that solids made up of ordinary nuclei and electrons can host 'emergent' particles with stranger properties, such as fractional charge, but the Majorana fermion, which has zero mass and zero charge, might be the strangest of all. While no single measurement can confirm the existence of topological superconductivity, the work by Hasan is a considerable step in the right direction."
In future projects, Hasan and his collaborators hope to detect Majorana fermions and invent ways to control their properties. Additionally, the research group will aim to identify other types of topological superconductors and topological insulators. Two important goals will be to find topological materials that exhibit superconductivity at higher temperatures and topological insulators whose interior is highly insulating.
The research was funded by the U.S. Department of Energy, a National Science Foundation American Competitiveness and Innovation Fellowship and the Alfred P. Sloan Foundation.

Predictive Power of Dairy Cattle Methane Models Insufficient to Provide Sound Environmental Advice, Study Finds


Canadian and Dutch researchers have shown that current equations to predict methane production of cows are inaccurate.
Canadian and Dutch researchers have shown that current equations for predicting the methane production of cows are inaccurate. Sound mitigation options to reduce the greenhouse gas emissions of dairy farms will require significant improvement of current methane equations, according to a study by the Dutch-Canadian team in the journal Global Change Biology.
The researchers, from University of Guelph and University of Manitoba (Canada) and Wageningen University & Research centre (the Netherlands), compared the observed methane production of cows with that predicted by nine different methane equations that are applied in whole-farm greenhouse gas models. "The prediction accuracy of these equations is small, and the equations are not suitable to quantify methane production of cows," says Dr Jan Dijkstra, senior researcher at Wageningen University and adjunct professor at University of Guelph. "The predictive power of methane equations will have to be markedly improved if such whole farm models are used for sound decisions by governments to reduce environmental impact of dairying."
On a global basis, according to the FAO, livestock is responsible for some 18% of all greenhouse gases emitted. Methane is the most important greenhouse gas on a dairy farm. The FAO estimates that about 52% of all greenhouse gases from the dairy sector are emitted in the form of methane. Several whole-farm models are available that predict the total amount of greenhouse gases (the sum of CO2, CH4 and N2O) emitted by dairy farms. Such models are applied to make inventories of total on-farm greenhouse gas emissions, and to estimate the effect of management changes (changes in breeding, nutrition, etc.) on those emissions. Methane is the single most important element in such estimates: as a greenhouse gas, it is 25 times more potent than CO2. Hence, the accuracy of a whole-farm model's estimate of total greenhouse gas emissions largely depends on the accuracy of its prediction of methane emitted per cow.
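That 25-fold potency figure is a global warming potential (GWP), and it is how whole-farm models fold methane into a single CO2-equivalent total. A minimal illustration; the per-cow emission figure here is made up for the example:

```python
GWP_CH4 = 25.0  # 100-year global warming potential of methane relative to CO2

def co2_equivalent(methane_kg):
    """CO2-equivalent mass (kg) of a given mass of methane (kg)."""
    return methane_kg * GWP_CH4

# Hypothetical example: a cow emitting 120 kg of methane per year
print(co2_equivalent(120.0))  # -> 3000.0 kg CO2-eq per year
```

Any error in the per-cow methane estimate is thus multiplied 25-fold in the CO2-equivalent inventory.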
The research team compiled a large dataset of actual observations on methane emissions of dairy cattle. The observations were largely derived from respiration chamber experiments, in which methane produced in the gut of the cow is accurately determined. These observations were used to evaluate the predictive power of equations to predict methane production.
The prediction accuracy of all equations was low. The equations hardly account for the effect of dietary composition on enteric methane production; most use no dietary information at all, estimating methane production from feed intake or milk production alone. For example, the widely used IPCC (Intergovernmental Panel on Climate Change) equation, which predicts methane production from the cow's energy intake, cannot distinguish a higher energy intake caused by a rise in feed intake from one caused by a rise in dietary fat content at the same feed intake level. Yet a higher feed intake will increase methane production, whereas a rise in dietary fat content will decrease it.
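The IPCC equation in its commonly cited Tier 2 form has roughly the shape sketched below: methane is taken to be a fixed fraction Ym of gross energy intake, divided by the energy content of methane. The constants and the example intake are assumptions for illustration, not values from the study:

```python
ENERGY_PER_KG_CH4 = 55.65  # MJ per kg of methane (commonly cited IPCC constant)

def tier2_enteric_methane(gross_energy_mj_per_day, ym_percent=6.5, days=365):
    """Annual enteric methane (kg per head), IPCC Tier 2-style sketch.

    A fixed fraction Ym of gross energy intake is assumed lost as methane,
    regardless of diet composition -- precisely the limitation the study
    criticizes: extra energy from more feed and extra energy from more
    dietary fat are treated identically.
    """
    return gross_energy_mj_per_day * (ym_percent / 100.0) * days / ENERGY_PER_KG_CH4

# Hypothetical dairy cow with a gross energy intake of 300 MJ/day:
print(round(tier2_enteric_methane(300.0), 1))  # -> 127.9 kg CH4/year
```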
From the analysis, it also appears that the variation in predicted methane production is far smaller than the variation in actually observed methane production. Consequently, the methane equations do not fully represent the range of effects that dietary changes have on the enteric methane production of cows.
The research team concluded that the low prediction accuracy and poor prediction of variation in observed values may introduce substantial error into inventories of GHG emissions and lead to incorrect mitigation recommendations. For sound inventories and mitigation recommendations, much better methane predictions are required. At present, the researchers are actively developing more detailed and accurate models that predict methane production, based on the fermentation processes in the gastro-intestinal tract of cows.
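Evaluating such equations against chamber data, as the team did, comes down to comparing predicted with observed values. The sketch below shows two standard checks, root mean square prediction error and the relative spread of predictions versus observations, on made-up numbers, purely to illustrate the kind of shortfall the study describes:

```python
import statistics

def rmspe(observed, predicted):
    """Root mean square prediction error between paired values."""
    n = len(observed)
    return (sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n) ** 0.5

# Hypothetical chamber observations and model predictions (g CH4/day)
observed = [310.0, 455.0, 280.0, 520.0, 390.0]
predicted = [370.0, 400.0, 360.0, 410.0, 385.0]

print(f"RMSPE: {rmspe(observed, predicted):.1f} g/day")
# The study's complaint in miniature: the predictions vary far less
# than the observations they are meant to reproduce.
print(statistics.stdev(predicted) < statistics.stdev(observed))  # -> True
```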

Electrons Get Confused: Researchers May Have Observed the Fastest Melting of All Time


This is the K1-XV line spectrum of beryllium oxide.
Scientists from Helmholtz-Zentrum Berlin (HZB) have observed exotic behaviour from beryllium oxide (BeO) when they bombarded it with high-speed heavy ions: After being shot in this way, the electrons in the BeO appeared "confused," and seemed to completely forget the material properties of their environment. The researchers' measurements show changes in the electronic structure that can be explained by extremely rapid melting around the firing line of the heavy ions. If this interpretation is correct, then this would have to be the fastest melting ever observed.
The researchers are publishing their results in Physical Review Letters.
In his experiments, Prof. Dr. Gregor Schiwietz and his team irradiated a beryllium oxide film with high-speed heavy ions of such strong charge that they possessed maximum smashing power. Unlike most other methods, the energy of the heavy ions was chosen so that they would interact chiefly with the material's outer valence electrons. As heavy ions penetrate into a material, two effects typically occur immediately around the fired ions: the electrons in the immediate surroundings heat up and the atoms become strongly charged. At this point, Auger electrons are emitted, whose energy levels are measurable and show up in a so-called line spectrum. The line spectrum is characteristic of each material, and normally changes only slightly upon bombardment with heavy ions.
In a world first, the HZB researchers have now bombarded an ionic crystal (BeO), which has insulator properties, with very high-speed heavy ions (xenon ions), and in doing so demonstrated a hitherto unknown effect: the line spectrum of the Auger electrons changed drastically -- it became "washed out," stretching into higher energies. Together with a team of physicists from Poland, Serbia and Brazil, the researchers observed distinctly metallic signatures from the Auger electrons emitted by the heated BeO material. The Auger electrons appeared to have completely "forgotten" their insulator properties. The researchers see this as clear evidence that the band structure breaks down extremely rapidly when the BeO is bombarded with heavy ions -- in less than about 100 femtoseconds (one femtosecond is a millionth of a millionth of a millisecond). This breakdown is triggered by electron temperatures of up to 100,000 kelvin. In the long term, however, the otherwise cold solid remains intact overall.
The HZB researchers' results deliver strong evidence of ultra-fast melting processes around the firing line of the heavy ions. This melting is followed by annealing that deletes all permanent signs of the melting process. Prof. Schiwietz hopes to find other ionic crystals that exhibit the same rapid melting process, but in which the annealing process is suppressed. If any are found, then a conceivable application would be programming at femtosecond speeds.

Transparent Conductive Material Could Lead to Power-Generating Windows


Top: Scanning electron microscopy image and zoom of conjugated polymer (PPV) honeycomb. Bottom (left-to-right): Confocal fluorescence lifetime images of conjugated honeycomb, of polymer/fullerene honeycomb double layer and of polymer/fullerene honeycomb blend. Efficient charge transfer within the whole framework is observed in the case of polymer/fullerene honeycomb blend as a dramatic reduction in the fluorescence lifetime.
Scientists at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory and Los Alamos National Laboratory have fabricated transparent thin films capable of absorbing light and generating electric charge over a relatively large area. The material, described in the journal Chemistry of Materials, could be used to develop transparent solar panels or even windows that absorb solar energy to generate electricity.
The material consists of a semiconducting polymer doped with carbon-rich fullerenes. Under carefully controlled conditions, the material self-assembles to form a reproducible pattern of micron-size hexagon-shaped cells over a relatively large area (up to several millimeters).
"Though such honeycomb-patterned thin films have previously been made using conventional polymers like polystyrene, this is the first report of such a material that blends semiconductors and fullerenes to absorb light and efficiently generate charge and charge separation," said lead scientist Mircea Cotlet, a physical chemist at Brookhaven's Center for Functional Nanomaterials (CFN).
Furthermore, the material remains largely transparent because the polymer chains pack densely only at the edges of the hexagons, while remaining loosely packed and spread very thin across the centers. "The densely packed edges strongly absorb light and may also facilitate conducting electricity," Cotlet explained, "while the centers do not absorb much light and are relatively transparent."
"Combining these traits and achieving large-scale patterning could enable a wide range of practical applications, such as energy-generating solar windows, transparent solar panels, and new kinds of optical displays," said co-author Zhihua Xu, a materials scientist at the CFN.
"Imagine a house with windows made of this kind of material, which, combined with a solar roof, would cut its electricity costs significantly. This is pretty exciting," Cotlet said.
The scientists fabricated the honeycomb thin films by creating a flow of micrometer-size water droplets across a thin layer of the polymer/fullerene blend solution. These water droplets self-assembled into large arrays within the polymer solution. As the solvent completely evaporated, the polymer formed a hexagonal honeycomb pattern over a large area.
"This is a cost-effective method, with potential to be scaled up from the laboratory to industrial-scale production," Xu said.
The scientists verified the uniformity of the honeycomb structure with various scanning probe and electron microscopy techniques, and tested the optical properties and charge generation at various parts of the honeycomb structure (edges, centers, and nodes where individual cells connect) using time-resolved confocal fluorescence microscopy.
The scientists also found that the degree of polymer packing was determined by the rate of solvent evaporation, which in turn determined the rate of charge transport through the material.
"The slower the solvent evaporates, the more tightly packed the polymer, and the better the charge transport," Cotlet said.
"Our work provides a deeper understanding of the optical properties of the honeycomb structure. The next step will be to use these honeycomb thin films to fabricate transparent and flexible organic solar cells and other devices," he said.
The research was supported at Los Alamos by the DOE Office of Science. The work was also carried out in part at the CFN and the Center for Integrated Nanotechnologies Gateway to Los Alamos facility. The Brookhaven team included Mircea Cotlet, Zhihua Xu, and Ranjith Krishna Pai. Collaborators from Los Alamos include Hsing-Lin Wang and Hsinhan Tsai, who are both users of the CFN facilities at Brookhaven, Andrew Dattelbaum from the Center for Integrated Nanotechnologies Gateway to Los Alamos facility, and project leader Andrew Shreve of the Materials Physics and Applications Division.