Wednesday, August 25, 2010

2009 Snapshot of U.S. Energy Use by Lawrence Livermore National Laboratory

Look at How Much Energy is Wasted!
The image above is a snapshot of energy use in the United States in 2009. On the left are the different sources (solar, nuclear, hydro, wind, etc.) and the number of quads of energy each contributes; following the lines shows how that energy is used, and how much of it is wasted. Read on for more details.
U.S. electricity generation by source. Image: Wikipedia, CC
The estimated U.S. energy use in 2009 equaled 94.6 quadrillion BTUs ("quads"), down from 99.2 quadrillion BTUs in 2008. The average American household uses about 95 million BTU per year. Energy use follows economic activity, so, as in Europe, the economic downturn partly explains this decline.
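As a rough sense of scale for those figures, a quad is 10^15 BTU, so the national totals can be sanity-checked against the household number in a few lines (the arithmetic below is mine, using only the figures quoted above):

```python
QUAD = 1e15  # BTU per quad

use_2009 = 94.6 * QUAD  # estimated U.S. energy use, 2009
use_2008 = 99.2 * QUAD  # estimated U.S. energy use, 2008
household = 95e6        # average household use, BTU per year

# Year-over-year decline, as a fraction of 2008 use
drop = (use_2008 - use_2009) / use_2008

# How many average-household-years the 2009 total represents
household_equivalents = use_2009 / household

print(f"Decline from 2008: {drop:.1%}")
print(f"2009 total ~= {household_equivalents/1e9:.2f} billion household-years")
```

The total works out to roughly a billion household-years of energy, which puts residential use in perspective: with on the order of 115 million U.S. households, direct household use accounts for only a bit over a tenth of national consumption.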
But the LLNL also points out that "higher efficiency appliances and vehicles" further reduced energy use, and that there's a switch to more renewable sources and natural gas (which is bad, but not as bad as coal).
"Wind power increased dramatically in 2009, to 0.70 quads of primary energy compared with 0.51 in 2008. Most of that energy is tied directly to electricity generation and thus helps decrease the use of coal for electricity production." (source)
CO2 Emissions Data to Come
Carbon emission data isn't out yet, but emissions are predicted to follow a similar downward curve, given the reduced use of coal, natural gas, and petroleum.
One thing to note on the graph: Look at how inefficient our petroleum use is. Most of it ends up in the "rejected" box. That's because most of it is used for transportation, and the internal combustion engine is notably inefficient. A good reason to electrify transportation.

Japan's Ice Aquarium Displays Fish Frozen In Mid-Swim

What a way to explore the ocean...
Japan's Kori no Suizokukan (Ice Aquarium) in Kesennuma, in northeastern Japan, has about 450 specimens of some 80 species on display for anyone to examine or ponder. But they aren't floating or zipping through aquariums. Nope... they're frozen in blocks of ice.

Crave writes, "Turn on the TV in Japan and you're bound to see someone slicing up a tuna on a cooking show while commentators ooh and aah. It's no wonder, then, that during the current heat wave frying Tokyo, people are heading north for chills and eye candy in the form of giant fish popsicles."
Indeed, the museum is a chilly -5 degrees F.
The museum opened in 2002 in the Uminoichi seafood market -- a likely place for such a display of everything from crabs and octopuses to ice from the Antarctic.
Could this be the only way we'll have access to our ocean's species in the future? It's a bleak thought. But for now, it's a novel way to display the marine life of our seas.

Mosquitoes From Climate Change Hell Chasing Midwesterners Back Inside


Fear of disease is a strong motivator, and most people hate insects. Hence you will occasionally see references to the risk of climate-led outbreaks of mosquito-borne diseases like encephalitis or West Nile virus as a way to get people to pay attention to climate change. That fear factor didn't materialize in the political game, nor should it. (The mosquito species that spread those particular diseases do not grow more numerous with passing flood waters.)
Forty days and nights of rain in the upper Midwest this summer, however, have unleashed the aggressively biting, but otherwise benign, plains floodwater mosquito (Aedes trivittatus) in record densities. Kids can't play outside. Golf games and camping trips are canceled. Outdoor Viking gas ranges lie unlit, alone in the shadows of darkened "chimineas." Farmers and landscape workers have to be miserable.
There are upsides. Makers of repellents and flying-insect sprays are doing a land-office business. Corporate picnics and family reunions are on hold. We can expect a strong return to the screen porch - architectural innovation spurred by climate change - that also provides a nice alternative to air conditioning. But will Michele Bachmann's voting "base" in her Minnesota district connect the dots?
Probably not. Michele will just say that all we need is to make DDT legal again...
Meanwhile, here's what it looks like around Obama's home turf.
In the far western suburbs of Chicago, WGN TV reports:
Wheaton Park officials said the mosquito population has risen 45% since last summer.
Being chewed upon tends to make people more subjective. In other coverage by WGN:
Abatement officials in the North Shore communities of Evanston, Wilmette and Glencoe are calling the mosquito populations unprecedented. Residents in Schaumburg, Palatine and Wheeling say they're being "eaten alive" even after applying insect repellent.
And on the far south side, mosquitoes like foreclosures.
"You have so many abandoned homes where pools aren't being cared for and storm gutters are not being attended to,"
Pretty much the same story in Minnesota, Iowa, Wisconsin, Indiana, and so on. Update: In anticipation of the Think Tankers who will attempt to spin this, I'd like to point out that this species drops its eggs outside the water - up on the banks and levees and beyond - in anticipation of periodic flooding. The eggs can go 2 or 3 or more years without floods. However, and this is the first key point, when floods become more intense and long lasting, as expected with climate change effects regionally, water inundates a much greater land area, creating exponentially greater suitable habitat for the floodwater mosquito to lay eggs in. This land may be on the perimeter of human development or amongst it. That may well be the case already (my speculation).
Second, underscoring the point that the floodwater mosquito in the U.S. is not currently a disease vector, the adverse impacts of concern are nuisance, decreased property values, lowered quality of life in the 'burbs, and unpleasant working conditions in forestry, agriculture, etc.

Can Aquaponics Pay for Itself?

Aquaponics usually stirs up a good deal of interest and debate here. From the awesome urban aquaponics of Growing Power to industrial-scale aquaponics operations, plenty of people believe in the idea of recycling fish poop into plant food in an efficient semi-closed-loop system. And yet questions remain—I've asked before whether aquaponics is cruel, whether soil-less farming can be organic, and whether aquaponics is an efficient way to feed ourselves. Now one study is claiming to answer another important question—can aquaponics pay for itself?
In theory, the idea of a semi-automated backyard food growing system that can produce both high-quality animal protein, and fresh, organic produce would seem like a no-brainer from an economic standpoint. But there is a significant cost outlay involved in setting up an aquaponics system—not to mention a learning curve on keeping it running.
Now Backyard Aquaponics—the Australian makers of aquaponics kits, and publishers of an online aquaponics magazine—are claiming that an independent cost-benefit analysis has proven that one of their kits can pay for itself in as little as 2 to 2.5 years. Furthermore, they are claiming these numbers are based on conservative estimates.
The Cost/benefit Analysis of Aquaponics Systems (PDF download) was created by a Richard Chiang who, it should be noted, appears to be an advocate for aquaponics in the Canberra region of Australia. It was also based on figures for energy usage, labor needs and estimated yields that were provided by the kit suppliers themselves—so this is by no means a definitive study of the economics of aquaponics.
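For readers who want to run their own numbers, the core of any such cost-benefit analysis is a simple payback calculation. Here's a minimal sketch; the dollar figures are placeholders I made up for illustration, not numbers from the Chiang report:

```python
def payback_years(setup_cost, annual_produce_value, annual_running_cost):
    """Years of net returns needed to recover the up-front setup cost."""
    net_annual = annual_produce_value - annual_running_cost
    if net_annual <= 0:
        return float("inf")  # system never pays for itself
    return setup_cost / net_annual

# Hypothetical backyard-kit figures (placeholders):
years = payback_years(setup_cost=4000.0,           # kit, tank, plumbing
                      annual_produce_value=2200.0,  # fish + vegetables
                      annual_running_cost=400.0)    # power, feed, water
print(f"Payback: {years:.1f} years")
```

With those made-up inputs the payback is 2.2 years, in the same ballpark as the 2-to-2.5-year claim; the real sensitivity, of course, lies in the yield and running-cost estimates, which in this study came from the kit suppliers themselves.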
Nevertheless, it's good to see folks putting some numbers together on the viability, or not, of this potentially promising field. I'd love to hear from other aquaponics practitioners about how aquaponics shakes out from an economic perspective.

Tuesday, August 24, 2010

Geoengineering won't undo sea level rise

Almost all the technologies for geoengineering our way out of climate change fail a key test: they can't stop the sea from rising and swamping low-lying countries.
"You can't slap the brakes on sea levels now," says John Moore of Beijing Normal University in China. "There's too much inertia in the system."
Moore and colleagues modelled the effects of deploying five different geoengineering techniques during the 21st century, and combined each one with three scenarios for greenhouse gas emissions: continuing to grow at current rates, cutting back dramatically, or cutting only slightly.
Injecting sulphate aerosols into the stratosphere – which reduces the amount of sunlight reaching the surface of the Earth – had little effect. If emissions are allowed to grow at current rates, the model showed sea levels rising by 1.1 metres by 2100. Aerosols could reduce that to 0.8 metres by 2100, but with the rate of rise showing no sign of slowing down at the end of the century, this would simply delay greater rises, not prevent them.
Blocking sunlight with space mirrors did make sea levels start to drop by the end of the century, but only when coupled with stringent emissions cuts. Results were marginally better for a world in which biofuels were rapidly developed and the resulting carbon dioxide was locked underground. This also reversed the rise in sea level by 2100, assuming strict emissions cuts elsewhere – though even in this scenario, the oceans still rose by 30 centimetres.
Tim Lenton of the University of East Anglia, UK, says the study underlines the fact that what matters most is how much CO2 is in the atmosphere. As a result, he says, the priority should be to reduce emissions and create carbon sinks. Michael Marshall

Mystery of the Atlantic's missing plastic flotsam

The amount of floating plastic trapped in a north Atlantic current system hasn't got any bigger in 22 years, despite more and more plastic being thrown away.
Since 1986, students taking samples of plankton in the Atlantic Ocean and Caribbean Sea have also noted when their nets caught plastic debris. Kara Lavender Law and colleagues at the Sea Education Association in Woods Hole, Massachusetts, analysed the data and found that of 6136 samples recorded, more than 60 per cent included pieces of plastic, typically just millimetres across. The areas of highest plastic concentration are within the north Atlantic subtropical gyre, where currents gather the debris.
Law and her team were surprised to find that the amount of floating plastic had not increased in the gyre. Although it has been illegal since the 1970s for ships to throw plastic overboard, Law thinks that the overall rate of plastic rubbish reaching the ocean will have increased, given the fivefold increase in global production of plastic since 1976.

Slipping through the net?

"Where the extra plastic is going is the big mystery," she says. Plastic resists biodegradation and can last decades or more in the ocean. Eventually sunlight and wave motion break it into smaller pieces, which can be harmful to marine life – clogging the stomachs of fish and seabirds, for example.
Law suggests that the plastic might be degrading into pieces small enough to pass through the 0.3-millimetre-mesh nets used in the study, or becoming coated in biofilms and sinking out of range of the nets. However it is unclear why the rate of degradation during the study period should have increased to offset the extra plastic going into the ocean.
She says it is unlikely that ocean currents are pushing plastic out of the gyre, although Simon Boxall of the National Oceanography Centre in Southampton, UK, who wasn't involved in the study, disagrees. He says the Atlantic gyre has an exit strategy in the form of the Gulf Stream. "We've seen high levels of plastic in the Arctic," he says.

Eternal pollution

Wherever it is going at the moment, the plastic in our oceans will eventually be broken down into microscopic pieces and individual molecules whose environmental effects are unknown. "The million-dollar question is, is it causing any damage?" says Boxall.
"When plastic particles get so small, are they just like roughage going through the system?" Some studies suggest that persistent chemicals in newer plastics function as endocrine disruptors, mimicking hormones such as oestrogen.
And this fine-grained plastic is very long-lived. "The depressing thing is it's likely to remain in the oceans essentially forever."

Frog cells give artificial nose the power of super smell

How do you give a robot a sharper sense of smell? By using genetically modified frog cells, according to Shoji Takeuchi, a bioengineer at the University of Tokyo in Japan.
Today's electronic noses are not up to the job, he says. Although e-noses have been around for a while – and are used to sniff out rotten food in production lines – they lack accuracy.
That's because e-noses use quartz rods designed to vibrate at a different frequency when they bind to a target substance. But this is not a foolproof system, as subtly different substances with similar molecular weights may bind to the rod, producing a false positive.
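The false-positive problem follows directly from the physics: a quartz resonator senses adsorbed mass per unit area, not chemical identity. A quick sketch using the standard Sauerbrey relation makes the point (the 5 MHz crystal frequency is my assumption; the quartz constants are standard values):

```python
import math

F0 = 5e6         # crystal resonant frequency, Hz (assumed typical value)
RHO_Q = 2.648    # density of quartz, g/cm^3
MU_Q = 2.947e11  # shear modulus of quartz, g/(cm*s^2)

def sauerbrey_shift(mass_ug_per_cm2):
    """Frequency shift (Hz) from adsorbed mass, via the Sauerbrey equation."""
    sensitivity = 2 * F0**2 / math.sqrt(RHO_Q * MU_Q)  # Hz * cm^2 / g
    return -sensitivity * mass_ug_per_cm2 * 1e-6       # ug -> g

# The target substance and a different molecule of similar weight,
# adsorbing the same mass, give identical readings:
print(sauerbrey_shift(1.0))  # target
print(sauerbrey_shift(1.0))  # impostor -- indistinguishable
```

Both readings come out near -57 Hz: mass alone cannot tell the two molecules apart, which is exactly the gap Takeuchi's receptor-based sensor is meant to close.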
Instead, Takeuchi believes there is nothing quite as good as biology for distinguishing between different biomolecules, such as disease markers in our breath. So he and his team have developed a living smell sensor.

Frog power

First, immature eggs, or oocytes, from the African clawed frog Xenopus laevis were genetically modified to express the proteins known to act as smell receptors. Takeuchi chose X. laevis cells because they are widely studied and their protein expression mechanism is well understood.
The team then placed the modified cells between electrodes and measured the telltale currents generated when different molecules bound to the receptors. They found this method can distinguish between many nearly identical biomolecules.
As a proof of concept, Takeuchi has built a robot that shakes its head when moth pheromones are sensed by the nose.
He now wants to extend the frog-based technology. "The X. laevis oocyte has high versatility for the development of chemical sensors for various odorants," he says. "We believe that a shared ability to smell might open a new relationship between man and robot."

How collapsing bubbles could shoot cancer cells dead

JETS of fluid propelled by the collapse of microscopic bubbles could deliver drugs directly into cancer cells, if an idea from a team of engineers pays off. They have made the bubbles project a fine jet that is powerful enough to puncture the cell membrane and enter the cell.
Applying a pulse of heat or ultrasound to a fluid can produce bubbles that initially expand rapidly, before collapsing suddenly when the pulse ends. Pei Zhong and his team at Duke University in Durham, North Carolina, knew that the collapsing bubbles send a pressure wave through the surrounding fluid, and that oscillations at the surface of the bubble can generate a needle-like jet. The problem is predicting where the jet will go, and how powerful it will be.
"Previously, there has been little control in jetting direction, and it has been hard to control the strength of the jet," Zhong says. Now the team has shown that when pairs of bubbles collapse in close proximity, they interact in a predictable way.
Using successive pulses from two lasers, one with a wavelength of 1064 nanometres and the other radiating at 532 nanometres, the team rapidly heated a sample of fluid containing the dye trypan blue. The first pulse produced a bubble of vapour 50 micrometres across, and the second produced another bubble close to the first. As the bubbles cooled and contracted, their surfaces began to oscillate, creating vortices in the surrounding fluid. The interaction between neighbouring bubbles caused them to collapse, creating a pair of jets shooting out in opposite directions. This should provide the degree of control necessary for a targeted drug delivery system, Zhong says.
The size of the bubble is crucial, as it dictates the size of the jet, Zhong says. "We want to produce a tiny jet that can penetrate a cell without killing it," he adds.
Zhong and his team tested their bubble needle on cells obtained from a rat tumour. High-speed photography showed that the microbubble pair could be made to collapse in such a way that the jet of blue dye created a hole between 0.2 and 2 micrometres across - allowing the jet of liquid to enter without instantly destroying the cell (Physical Review Letters, DOI: 10.1103/PhysRevLett.105.078101). This shows the jets are suitable for targeting drugs at cells within the body, Zhong says.
He says that it should be possible to use microbubble pairs generated by ultrasound rather than lasers as a clinical drug-delivery system.
Not everyone is convinced that the system will work in a clinical setting, however. In the presence of biological tissues, the oscillating bubbles may be less stable than in the test solution, making it difficult to deliver drugs with any accuracy, says Constantin Coussios, a biomedical engineer at the University of Oxford.

Gulf Oil Plume Tracked

Oil Spill: First journal report to characterize deep-sea oil plume

 

Images taken by Sentry during its descent through an oil plume near the Deepwater Horizon well show a highly turbid oil-emulsion layer at a depth of 1,065 to 1,300 meters. Richard Camilli/WHOI
The first peer-reviewed report in a major journal to focus on the catastrophic BP oil spill in the Gulf of Mexico describes an enormous, persistent plume of hydrocarbons more than 3,000 feet deep moving southwest from the site of the spill to sites up to 22 miles away. In addition, measurements of oxygen levels in the plume show that oil-feeding microbes are not degrading oil as quickly as experts thought.
Ocean scientist Richard Camilli, marine chemist Christopher M. Reddy, and colleagues at Woods Hole Oceanographic Institution (WHOI), in Massachusetts, and in Australia collected 57,000 data points during their investigation of the plume, one of several such plumes in the Gulf. The researchers used instruments on a ship and on the submersible automated vehicle Sentry, including a mass spectrometer that tagged a number of easily identifiable components of oil: benzene, toluene, ethylbenzene, and xylenes (Science, DOI: 10.1126/science.1195223).
The oil spill, which is now recognized as the largest in history, began on April 20 and dumped an estimated 200 million gal of oil into the ocean until BP engineers stopped the flow with a temporary cap on July 15. Scientists and government officials have issued conflicting reports on the scope of the disaster, most notably regarding the existence and size of underwater plumes.
Scientists at the National Oceanic & Atmospheric Administration have characterized several oil plumes in the Gulf, and recently researchers at the University of South Florida announced that samples from the plumes match those from BP’s oil (C&EN, Aug. 2, page 12). The published WHOI study adds credibility to those reports.
During its 10-day mission in June, WHOI’s Sentry zigzagged through the plume, providing definitive boundary information and measuring the rate at which the plume was migrating from the well site. Measurements showed no significant drops in dissolved oxygen, implying that oil is not being quickly degraded by microbes, which consume the gas.
Although the NOAA work mainly focused on dispersed oil near the wellhead, the WHOI researchers found “a river of oil in deep water and the current moving surprisingly quickly southwest from the wellhead,” notes Jeffrey Short, an oil spill expert and consultant for Oceana, an environmental advocacy group. The paper’s estimate of the migration rate of dispersed oil in the plume “is the first rate I’ve seen or heard about,” he says.
The amount, location, and form of remaining oil from the spill continue to be hotly debated. The federal government’s National Incident Command released a report on Aug. 2 that was widely interpreted to suggest that 75% of the oil had already degraded. However, experts with the University of Georgia, Athens, immediately responded with their own interpretation, saying that almost 80% of the oil persists in the environment.

Beet Plantings Banned

Ruling: Judge says USDA will have to review Monsanto crop

Sugar beets are the source of more than half of U.S.-produced sugar. USDA ARS
A federal judge has temporarily nixed future plantings of Monsanto’s genetically engineered Roundup Ready sugar beets in a case brought against the Department of Agriculture in 2008 by environmental groups including the Center for Food Safety and the Sierra Club. The order does not affect crops already planted on 1 million acres in 10 states.
Worried about “the likelihood and potential amount of irreparable harm to the environment,” the judge ordered USDA to prepare the environmental impact statement that should have been prepared for the beets, engineered to be resistant to Monsanto’s glyphosate herbicide Roundup, when they were first deregulated in 2005.
Monsanto says the judge’s action does not question the safety or benefits of the genetically engineered crop and would not have a significant impact on its business. The Sugar Industry Biotech Council, of which Monsanto is a member, notes that USDA can adopt interim measures to allow future plantings of Roundup Ready sugar beets. USDA is mum on the possibility but says it will not be able to complete an impact statement until April 2012.
Judge Jeffrey S. White of the U.S. District Court for the Northern District of California wrote in his order that he was “troubled by maintaining the status quo that consists of 95% of sugar beets being genetically engineered while [USDA’s] Animal & Plant Health Inspection Service conducts the environmental review that should have occurred before the sugar beets were deregulated.”
The environmental groups maintain that Roundup Ready crops, which also include soybeans and corn, “have led to increased use of herbicides, proliferation of herbicide-resistant weeds, and contamination of conventional and organic crops.”
John Roberts, an analyst with investment research firm Buckingham Research, notes that although sugar beets are not as important a crop as soy and corn, both of which Monsanto and other firms such as DuPont and Syngenta have genetically engineered, “they’re part of the broadening of biotech acceptance to a wider range of crops.”
Roberts adds that “wheat is the next mega-crop” that biotech firms are targeting. Success for biotech there “will clearly rely on broader acceptance of biotech,” because wheat is mostly intended for human consumption.

Antidepressant's Unusual Speed Explained

Neuroscience: Ketamine, which can overcome depression in hours, stimulates rapid synapse formation

 
 
ANTIDEPRESSANT MECHANISM: Ketamine and Ro 25-6981 stimulate a rapid increase in neuronal connections.
Unlike commercially available antidepressants, which require weeks or months to take effect, a single dose of ketamine can overcome depression in hours, a speed advantage that can spell the difference between life and death for suicidal patients. Researchers at Yale University School of Medicine have now discovered why the compound works so fast. Their findings illuminate the mechanisms underlying depression and also suggest new targets for its treatment.
Depression is believed to correlate with a reduction in the number of synapses, or connections between neurons, in the prefrontal cortex of the brain, notes Ronald S. Duman, who studies molecular psychiatry and pharmacology at Yale. Duman's team now reports that ketamine undoes this damage by increasing the number of these synapses in rats within 24 hours of administration, whereas traditional treatments do not (Science 2010, 329, 959).
The researchers determined that ketamine stimulates the mammalian target of rapamycin (mTOR) signaling cascade, which is involved in protein synthesis and synaptic modification in neurons. Ketamine activates this pathway by preventing the neurotransmitter glutamate from binding to the N-methyl-D-aspartate (NMDA) class of receptors on neurons.
"Together, these findings suggest that the rapid activation of mTOR-mediated signaling pathways may be an important and novel strategy for the rational design of fast-acting antidepressants," note John F. Cryan and Olivia F. O'Leary, neuropharmacologists at University College Cork, in Ireland, in a commentary about the work (Science 2010, 329, 913). They add that drugs that target this pathway would provide an alternative to the antidepressants currently on the market, nearly all of which function by boosting brain levels of neurotransmitters such as serotonin and norepinephrine.
Ketamine itself is unsuitable as a commercial antidepressant. At doses higher than required for the antidepressant effect, it serves as an anesthetic. It can induce hallucinations—hence its popularity as the street drug "Special K"—and it must be injected. Ketamine can be administered by a doctor, but this practice is inconvenient because the antidepressant effect of a dose wears off after about a week.
The pharmaceutical industry is now trying to develop safe, fast-acting antidepressants that can be given orally and can't be abused, says Duman.
One lead is Ro 25-6981, a compound originally developed by Roche. Duman's team showed that the compound acts on the mTOR pathway, just as ketamine does. But Ro 25-6981 might avoid ketamine's side effects because it activates only the NR2B subclass of NMDA receptors, Duman says.
Last month, Roche and the drug discovery and development firm Evotec announced the start of Phase II clinical studies of another potential rapidly acting antidepressant, a selective NR2B inhibitor called EVT 101.
Researchers hope that ketamine substitutes will preserve another impressive characteristic of the anesthetic—its ability to overcome depression in patients who don't respond to conventional treatments. For example, Carlos A. Zarate Jr., chief of experimental therapeutics at the National Institute of Mental Health in Bethesda, Md., and colleagues recently reported that a single injection of ketamine reduced depression after just 40 minutes in bipolar patients who had failed to obtain relief from other treatments (Arch. Gen. Psychiatry 2010, 67, 793). Furthermore, only 6% of participants responded to placebo, but 71% responded to ketamine.

PotashCorp Snubs BHP Billiton

Fertilizers: Both companies see a bright future for crop nutrients but can't agree to a takeover deal

DAILY GRIND: A PotashCorp mine in Saskatchewan. Photo: PotashCorp
Potash Corp. of Saskatchewan has rejected the $40 billion acquisition offer of BHP Billiton, and now the Australian mining giant is taking its bid for the fertilizer company directly to shareholders.
The offer "provides PotashCorp shareholders with value certainty and an immediate opportunity to realize the value of their shares in the face of volatile equity markets," BHP Billiton's chairman, Jacques Nasser, said in a letter to PotashCorp's board.
In a conference call with investors, PotashCorp CEO William J. Doyle offered a rebuttal. "We believe we are on the verge of an inflection point, where potash demand is expected to return to historical trend-line growth, with tightened supply and improved pricing," he said.
PotashCorp's board unanimously rejected BHP Billiton's $130.00-per-share offer, maintaining that it "grossly undervalues" the company. The offer represents a 16% premium over the stock's price on Aug. 16, the day before the offer was disclosed to the public. In mid-2008, when the fertilizer industry peaked before the financial downturn, PotashCorp stock traded at more than $200 per share. That year, the company had nearly $3.5 billion in earnings on $9.4 billion in sales.
On Aug. 17, PotashCorp shares immediately jumped to $140.00 on news of the offer, an indication that shareholders are anticipating a better deal than BHP Billiton's tender offer.
Analysts also are anticipating a bid that is better than the initial $130.00-per-share offer. After BHP Billiton's acquisition bid, David Begleiter, a stock analyst at Deutsche Bank, raised his target price for PotashCorp to $150.00 per share to reflect what he believes is the "minimum acquisition price," and he wouldn't rule out offers climbing as high as $175.00 per share.
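The premium arithmetic in the bid is easy to reconstruct (the dollar figures are from the article; the calculation is mine, as a quick check):

```python
offer = 130.00      # BHP Billiton's per-share bid
premium = 0.16      # stated premium over the Aug. 16 close
post_news = 140.00  # share price after the bid became public

# The Aug. 16 close implied by the stated premium:
# offer = pre_offer * (1 + premium)
pre_offer = offer / (1 + premium)

# How far above the bid the market immediately moved
market_vs_offer = post_news / offer - 1

print(f"Implied Aug. 16 close: ${pre_offer:.2f}")
print(f"Market price vs. bid:  {market_vs_offer:.1%}")
```

The implied pre-offer price is about $112, and the jump to $140 put the stock roughly 7.7% above the bid itself, which is the quantitative form of "shareholders are anticipating a better deal."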
PotashCorp is the world's largest producer of potash, which contains potassium compounds that enhance root growth in crops such as corn and soybeans. Its more than 13 million metric tons of potash mining capacity represents about 20% of the world's total. The firm generated net income of $1.0 billion on $4.0 billion in sales in 2009.
BHP Billiton mines for metals such as aluminum, copper, lead, zinc, gold, silver, iron, and titanium. It also has diamond and coal mines and an oil and gas exploration business. The company has been keen on entering the potash business because of a growing and increasingly wealthy world population. The company recently acquired a large land position in PotashCorp's backyard in Saskatchewan, where it has been planning a new potash mine.

Uncovering A New Chlorophyll


For the first time in more than 60 years, researchers have found a new kind of chlorophyll, the pigment used by both plants and bacteria to catch sunlight and convert it into energy by means of photosynthesis (Science, DOI: 10.1126/science.1191127).
The newly christened chlorophyll f was found in cyanobacteria living in rocky outcroppings off the west coast of Australia. It absorbs redder wavelengths of light—stepping beyond the visible to the infrared range—than its four chlorophyll siblings and thus widens the spectrum of light known to be harvested by photosynthetic organisms. The finding might be exploited by those aiming to use biotechnology to produce renewable energy from light.
The discovery of this fifth chlorophyll pigment was “totally unexpected,” says Min Chen, a biochemist at the University of Sydney, who led the study. The last time a chlorophyll pigment was discovered was in 1943. For decades, most researchers thought this chlorophyll d was actually a laboratory artifact because it could not be reproducibly found in nature. Then in 1996, Japanese researchers finally discovered a cyanobacterium that carries out photosynthesis primarily with chlorophyll d, which also absorbs in the infrared.
Chen and her colleagues were looking for microorganisms that produce chlorophyll d in an Australian coastal area called Hamelin Pool when they unwittingly collected bacteria that produce chlorophyll f. In the shallow waters of Hamelin Pool, biofilms of cyanobacteria form hard, rocky structures called stromatolites.
Chlorophyll f was discovered in cyanobacteria living in rocky stromatolites (below) on the west coast of Australia. Courtesy of Reut Abramovich & Alyssa Cobb (both)
 
The researchers picked up some of the stromatolites, ground them up, and cultured the cyanobacteria growing inside them under red light, only to discover that the bacteria were producing chlorophyll d as well as an unknown pigment, chlorophyll f, whose structure they elucidated using nuclear magnetic resonance spectroscopy.
Chlorophyll f differs only slightly from chlorophylls a, b, and d. Its uniqueness primarily involves placement of a formyl group on carbon-2 of the pigment molecule, whereas the same formyl group resides on C-3 of chlorophyll d, for example. This minor chemical modification “changes the spectral properties dramatically,” allowing chlorophyll f to absorb wavelengths of light to about 760 nm, some 20 nm deeper into the infrared than chlorophyll d, Chen says. Chlorophyll-f-producing cyanobacteria living in stromatolites have an advantage because they can absorb the longer wavelengths of light that percolate deep within the rocky formations, she adds.
“This is a very important new development,” says Robert Blankenship, who studies photosynthetic reactions at Washington University in St. Louis. The finding “could have biotechnological implications because it permits use of a wider range of the solar spectrum and could possibly contribute to improving the efficiency of photosynthesis,” for example, in biofuel production.
Next up, Chen and her colleagues are planning to further study the function of chlorophyll f in photosynthesis and narrow down the exact species of cyanobacteria that produces the new pigment.

Nanoscale DNA sequencing technique to advance personalized medicine

Illustration depicting a single strand of DNA moving through a nanopore that is being used to sequence the DNA (Image: Ian Derrington)


One of the long-held hopes for DNA sequencing is that it will usher in an era of personalized, predictive medicine by providing individualized blueprints of genetic predispositions for specific conditions and diseases, such as cancer, diabetes and addiction. Researchers have now devised a nanoscale method to sequence DNA quickly and relatively inexpensively that could open the door to more effective individualized medicine.
The technique creates a DNA reader that combines biology and nanotechnology, using a nanopore taken from Mycobacterium smegmatis porin A. The nanopore has an opening one billionth of a meter in size, just large enough to measure a single strand of DNA as it passes through.
The scientists placed the pore in a membrane surrounded by potassium-chloride solution. A small voltage was applied to create an ion current flowing through the nanopore, and the current's electrical signature changed depending on the nucleotides traveling through the nanopore. Each of the nucleotides that are the essence of DNA – cytosine, guanine, adenine and thymine – produced a distinctive signature.
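The base-calling principle can be sketched as a nearest-signature classifier: match each current reading to the closest known nucleotide signature. The current levels below are invented for illustration; the paper reports that each base has a distinctive signature, but these numbers are not from the study.

```python
# Hypothetical mean blockade currents (picoamps) for each nucleotide.
# These values are illustrative only, not measured data.
SIGNATURES = {"A": 40.0, "C": 33.0, "G": 47.0, "T": 25.0}

def call_base(current_pa):
    """Return the nucleotide whose signature is closest to the reading."""
    return min(SIGNATURES, key=lambda base: abs(SIGNATURES[base] - current_pa))

def call_sequence(readings):
    """Classify a series of current readings into a base sequence."""
    return "".join(call_base(r) for r in readings)

print(call_sequence([39.1, 24.2, 47.8, 33.5]))  # -> "ATGC"
```

In the real device the signal processing is more involved, but the core idea is the same: four separable current levels, one per base.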

One at a time please

The team had to solve two major problems. One was to create a short and narrow opening just large enough to allow a single strand of DNA to pass through the nanopore and for only a single DNA molecule to be in the opening at any time. Michael Niederweis at the University of Alabama at Birmingham modified the M. smegmatis bacterium to produce a suitable pore.
The second problem was that the nucleotides flowed through the nanopore at a rate of one every millionth of a second, far too fast to sort out the signal from each nucleotide. To compensate, the researchers attached a section of double-stranded DNA between each nucleotide they wanted to measure. The double-stranded section would briefly catch on the edge of the nanopore, halting the flow of DNA long enough for the single nucleotide to be held within the nanopore reader. After a few milliseconds, the double-stranded section would separate and the DNA flow continued until another double strand was encountered, allowing the next nucleotide to be read.
The delay, though measured in thousandths of a second, is long enough to read the electrical signals from the target nucleotides.
"We can practically read the DNA sequence from an oscilloscope trace," said Jens Gundlach, a University of Washington physics professor and lead author of a paper describing the new technique published the week of Aug. 16 in the Proceedings of the National Academy of Sciences.

'Martian technology' to keep solar panels dust-free

Deserts are the obvious locations for solar power plants. The land is cheap and the sunshine is plentiful. Unfortunately, so too are the dust, dirt and wind that lead to dirty solar panels, which can take a big hit in efficiency. Sending a guy around with a squeegee in the sweltering heat doesn’t sound like the best job in the world, and self-cleaning systems that rely on water aren’t always an option in areas where clean water is hard to come by. Another solution is self-dusting solar panels that are cleaned by an electric charge provided by the panels themselves, based on technology developed for another dry and dusty environment – Mars.
The technology involves a transparent, electrically sensitive material deposited on glass or a transparent plastic sheet covering the panels. Sensors monitor dust levels on the surface of the panel and energize the material when dust concentration reaches a critical level. The electric charge sends a dust-repelling wave cascading over the surface of the material, lifting away the dust and transporting it off the panel's edges.
Within two minutes, the process removes about 90 percent of the dust deposited on the panels and requires only a small amount of electricity generated by the panel for cleaning purposes.
"We think our self-cleaning panels used in areas of high dust and particulate pollutant concentrations will highly benefit the systems' solar energy output," study leader Malay K. Mazumder, Ph.D. said. "A dust layer of one-seventh of an ounce per square yard decreases solar power conversion by 40 percent," Mazumder explains. "In Arizona, dust is deposited each month at about 4 times that amount. Deposition rates are even higher in the Middle East, Australia, and India."
Mazumder, who is with Boston University, said the need for that technology is growing with the popularity of solar energy. Use of solar, or photovoltaic, panels increased by 50 percent from 2003 to 2008, and forecasts suggest a growth rate of at least 25 percent annually into the future.
"Our technology can be used in both small- and large-scale photovoltaic systems. To our knowledge, this is the only technology for automatic dust cleaning that doesn't require water or mechanical movement."
Working with NASA, Mazumder and colleagues initially developed the self-cleaning solar panel technology for use in lunar and Mars missions. "Mars of course is a dusty and dry environment," Mazumder said, "and solar panels powering rovers and future manned and robotic missions must not succumb to dust deposition. But neither should the solar panels here on Earth."
The current market size for solar panels is about $24 billion, Mazumder said. "Less than 0.04 percent of global energy production is derived from solar panels, but if only four percent of the world's deserts were dedicated to solar power harvesting, our energy needs could be completely met worldwide. This self-cleaning technology can play an important role."
The team described the benefits of the self-cleaning coating in a report at the 240th National Meeting of the American Chemical Society (ACS).

The Power Of Plastic

FLEX TIME Made from light-sensitive polymers, lightweight flexible photovoltaic sheets (such as the Konarka product shown here) are converting ordinary surfaces into inexpensive power generators

Like a one-two punch combination, “cheap” and “plastic” are sometimes hurled one after the other at manufactured goods to denote poor quality—as in “cheap plastic toy” or “cheap plastic car parts.” Although the C and P words are used in those cases pejoratively, when it comes to certain types of solar cells, cheap and plastic are precisely what make them so attractive.
Any type of device that converts light to electricity holds promise for tapping the essentially inexhaustible and non-carbon-emitting energy supply that flows from the sun. Devices that mediate that energy conversion by way of light-sensitive organic polymers or other organic molecules may be especially attractive because of the low cost of the materials and manufacturing methods required to produce such cells.
Inexpensive polymer-based solar cells, which are also known as organic photovoltaic cells, already exist; they have been around since the 1990s. But their performance, and in particular the efficiency with which they convert light to electricity—typically on the order of 5%—has remained much lower than the 10–15% conversion efficiencies provided by costly traditional solar cells based on silicon, cadmium telluride, and other inorganic semiconductors. That single-digit value pales even further when compared with some highly specialized and pricey state-of-the-art inorganic research devices that yield conversion efficiencies topping 40%.
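Those efficiency figures translate directly into panel area. A minimal sketch, assuming the standard test insolation of 1,000 W per square meter, shows how much area each class of cell needs to deliver one kilowatt:

```python
def area_for_power(target_w, efficiency, insolation_w_m2=1000):
    """Panel area in m^2 needed to produce target_w at the given efficiency."""
    return target_w / (efficiency * insolation_w_m2)

# Organic (~5%), commodity inorganic (~15%), exotic multi-junction (~40%)
for eff in (0.05, 0.15, 0.40):
    print(f"{eff:.0%} efficient: {area_for_power(1000, eff):.1f} m^2 per kW")
```

An organic cell needs roughly three times the area of a commodity silicon panel for the same output, which is why cheap materials and flexible form factors have to carry the economic argument.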
Recently, however, scientists have begun closing the performance gap, albeit in small increments, through an enormous surge in organic solar-cell research. The studies are aimed at designing new types of semiconducting polymers and other novel compounds, improving processing conditions, and developing a more complete understanding of the basic phenomena that govern the performance of organic photovoltaic devices.
Proponents of solar technology often point out that the quantity of solar energy impinging on Earth’s surface in an hour is greater than the total energy consumed by humankind in a single year. With statistics as staggeringly powerful as that one, “we simply cannot ignore the opportunity that is presented by using solar energy,” says Russell A. Gaudiana, vice president for research at Konarka Technologies. Based in Lowell, Mass., Konarka develops and manufactures a line of lightweight and flexible organic photovoltaic panels for portable power use and other applications.
The market opportunities also cannot be ignored. Overall, the photovoltaics industry generated some $37 billion in global revenues in 2008, according to Solarbuzz, an international solar energy research and consulting company. That value rose to $38.5 billion in 2009, the company reports. Nearly all of that revenue, however, comes from sales of inorganic solar-cell products such as rooftop panels.
“For now, we can safely claim that organic photovoltaics has a nearly zero percent share of the market,” says Yang Yang with a note of humor. Yang, an organic electronics specialist and professor of materials science at the University of California, Los Angeles, explains that nearly all organic solar-cell companies are in the research and development phase and not yet selling products.
Although the field has not yet begun to stimulate much sales activity, in terms of research, it’s flourishing. During the past few years, several thousand journal papers and conference proceedings have been published on organic solar cells. For cells based on common inorganic semiconductors, the number of publications during the same period is about an order of magnitude lower.
One reason organic solar cells are such a hot topic, in Yang’s view, “is that the threshold to entering this field of research is very low.” He explains that scientists who want to conduct studies in this area can easily buy relatively inexpensive organic compounds, investigate their properties by using common laboratory tools, and publish the findings. In contrast, working with some types of inorganic semiconductors requires specialized equipment and advanced handling procedures.

“Overall, the photovoltaics industry generated 
some $37 billion in global revenues in 2008. 
That value rose to $38.5 billion in 2009.”

But perhaps the biggest reason that research in organic photovoltaics is booming is the magnitude of the potential payoff. “Solar energy only accounts for a small percentage of our energy production today largely because photovoltaic technology is prohibitively expensive,” according to Alex K-Y. Jen, a chemistry and materials science professor at the University of Washington, Seattle. Inorganic photovoltaic products, and in particular high-performance devices, are very costly because of demanding and energy-intensive crystal-growth, device-fabrication, and associated manufacturing processes.
That’s where organic chemistry may be able to offer a key advantage over its inorganic counterpart, Jen says. If device performance can be improved, then organic solar-cell technology may become economically viable by taking advantage of low-cost solution-phase processing and mass production via commercial-scale roll-to-roll printing methods.
In addition, unlike traditional rigid solar panels, lightweight and flexible plastic panels can be incorporated into a wide variety of nontraditional and irregularly shaped products. Examples include backpacks and other types of handbags that can charge laptops, mobile phones, and other portable electronics while the user is on the go. Several of these types of products, which feature Konarka solar panels, are available today from various retailers. Other products based on “smart fabrics” that turn clothing, tents, umbrellas, and other ordinary items into power generators are under development by a number of companies, as are polymer solar panels that are designed to be incorporated into windows, skylights, walls, and other building surfaces.
Regardless of the shape and intended application, all photovoltaic cells have common features. At the heart of a solar cell is a light-sensitive semiconducting material in which the first steps of the power generation process occur. In traditional solar cells, that material is typically silicon. In an organic photovoltaic device, the photoactive region generally consists of two materials. One serves as a light-harvesting electron donor and the other as an electron acceptor.
Photons impinging on that region can cause electronic excitations in the donor material leading to formation of excitons. Excitons are high-energy couples in which an energetic electron is bound to a positively charged electron vacancy or hole. To produce electrical current, the electron-hole pair must migrate to the interface between the electron donor and acceptor materials, where it splits into separate mobile charges. The charges then diffuse to their respective electrodes—electrons are transported by way of the electron acceptor material to the cathode, and holes travel via the electron donor to the anode. Overall, the charge-transfer and charge-transport processes generate a flow of electrical current that can be used as a power source.
Researchers have experimented with a variety of electron donor and acceptor materials as well as with ways of pairing them. In the 1990s, a team led by chemistry Nobel Laureate Alan J. Heeger, a professor at UC Santa Barbara, discovered that photoinduced electron transfer in composites of conducting polymers (electron donors) and C60-type compounds (electron acceptors) proceeds much more efficiently than in the absence of the fullerenes. The early studies focused on phenylene-vinylene polymers and functionalized fullerenes, including a phenyl-butyric acid-substituted C61 compound known as PCBM.
Just Right In blends of P3HT and PCBM, moderate heat treatment induces mixing of the phases in a way that promotes charge transfer. Excessive heating leads to unwanted crystallization and phase segregation. Adapted from J. Phys. Chem. Lett.
One way to physically bring the acceptor and donor materials together is to deposit a layer of one material on the other. In semiconductor parlance, the planar interface between the electronically dissimilar materials in that arrangement is known as a planar heterojunction. That geometry is sometimes used to fashion solar cells from nonpolymeric organic compounds. The planar design is simple. But because the interfacial area is small and fixed, it’s difficult to tweak the cell’s performance—in particular the conversion efficiency.
To greatly increase the interfacial area, Heeger’s team blended the polymer and fullerene components in a way that formed an interpenetrating bicontinuous network of donor-acceptor junctions. In that arrangement, which is now referred to as a bulk heterojunction (BHJ), the polymer and fullerene phases are intermingled on the nanometer scale.
Gaudiana likens the morphology of a BHJ active layer to a sponge. The solid part represents the nanosized interconnected bits of fullerene. The polymer is represented by the holes, which are intimately connected to other holes throughout the sponge and never far from a solid region. Blending the phases on that scale, in effect, distributes small regions of interface throughout the photoactive layer. As a result, excitons need to diffuse only a short distance before quickly reaching a donor-acceptor interface where they can dissociate into separate charges.
Once they separate, electrons and holes follow tortuous paths—hopping repeatedly from one nanosized domain of fullerene (or polymer) to the next until they reach their respective electrodes. It’s a complicated charge-transfer and charge-transport mechanism, but thus far nearly all of the top-performing organic solar cells—the ones providing the greatest conversion efficiencies—have been based on BHJs.
In the 1990s, by applying the BHJ strategy to the phenylene-vinylene–PCBM system, the Santa Barbara group produced solar cells that yielded conversion efficiencies of some 3%, an outstanding value for organic photovoltaics at the time. Functionalized fullerenes, including PCBM, have remained the electron acceptor materials of choice ever since. Many types of donor polymers have been studied, but poly(alkylthiophenes), in particular poly(3-hexylthiophene) (P3HT), paired with PCBM have emerged during the past five years as some of the most important and best-studied systems. Various groups have reported attaining 5% conversion efficiencies from cells made from P3HT-PCBM.

“Solar energy only accounts for a small percentage of our energy production today largely because photovoltaic technology is prohibitively expensive.”

In addition to thiophenes, various types of fluorene-, carbazole-, and cyclopentadithiophene-based copolymers have been studied widely by researchers in academia and industry. Although companies tend to keep quiet about the specifics of the solar-cell materials they study, Konarka researchers acknowledge that in addition to other promising materials, they have examined the P3HT-PCBM system and the cyclopentadithiophene class of polymers in detail. A team of the company’s leading scientists surveyed results from those organic solar cells and others in a review published in Advanced Materials (2009, 21, 19).
Heavy lifting has been required to raise the performance of these lightweight solar cells above the best results reported just one or two years ago. One research thrust focuses on improving the cells’ operating characteristics by customizing the electron energy levels of the polymers with respect to those of the fullerene. Tuning the levels appropriately could enhance exciton dissociation kinetics and raise the value of a solar-cell parameter known as the open-circuit voltage.
With those goals in mind, University of Chicago chemistry professor Luping Yu teamed up with UCLA’s Yang to design and test a series of novel copolymers made by reacting a benzodithiophene compound with various thienothiophenes. The team aimed to lower the polymers’ highest occupied molecular orbital (HOMO) levels by attaching successively stronger electron-withdrawing groups to the polymer backbone.
The strategy worked. By replacing an alkoxy group that was adjacent to a carbonyl group with an alkyl chain at the same position, the group lowered the HOMO level by roughly 0.1 eV. They lowered the level by another 0.1 eV by adding a fluorine atom. Then the group paired the novel polymers with PCBM to prepare solar cells. Consistent with the trend in the customized electron energy levels, the solar cell containing the fluoropolymer yielded the best results—and a record-breaking conversion efficiency of roughly 6.8%, as certified by the National Renewable Energy Laboratory, in Golden, Colo. (Nat. Photonics 2009, 3, 649). In follow-up work on this family of polymers, Yu’s group recently reported slightly higher conversion efficiencies (just over 7%) (Acc. Chem. Res., DOI: 10.1021/ar1000296).
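The link between HOMO depth and open-circuit voltage can be sketched with a widely used empirical rule for BHJ cells, the Scharber approximation: Voc ≈ (|E_HOMO,donor| − |E_LUMO,acceptor|)/e − 0.3 V. The −5.2 eV starting donor HOMO below is a hypothetical value for illustration, not a number from the study:

```python
# Scharber-style empirical estimate for donor/PCBM bulk-heterojunction cells.
PCBM_LUMO_EV = -4.3   # commonly cited LUMO level for PCBM

def estimated_voc(donor_homo_ev):
    """Approximate open-circuit voltage (V): (|HOMO| - |LUMO|) - 0.3 V offset."""
    return abs(donor_homo_ev) - abs(PCBM_LUMO_EV) - 0.3

base_homo = -5.2                   # hypothetical starting donor HOMO, eV
for shift in (0.0, -0.1, -0.2):    # the two successive 0.1 eV lowerings
    homo = base_homo + shift
    print(f"HOMO {homo:.1f} eV -> Voc ~ {estimated_voc(homo):.2f} V")
```

In this approximation, each 0.1 eV deepening of the HOMO buys roughly 0.1 V of open-circuit voltage, consistent with the trend the team observed.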
Banking on a belief that academic success can spawn commercial success, organic photovoltaics start-up Solarmer Energy licensed a portfolio of technology developed in the UCLA and Chicago laboratories and recently set up shop in El Monte, Calif. Vishal Shrotriya, a research director at Solarmer, notes that in 2009 the company set three certified world records and just beat the best of the trio last month with a solar cell that has an efficiency of 8.13%. “We are definitely on target to reach 10% by the end of 2011 or even sooner,” Shrotriya says. The company expects to begin selling products for powering portable electronics and solar cells incorporated into smart fabrics next year.
In addition to tuning electron energy levels, scientists are trying to improve organic solar cells by stabilizing BHJs. University of Massachusetts, Amherst, chemistry professor Dhandapani Venkataraman explains that the BHJ structure is controlled by subtleties in molecular architecture, intermolecular interactions between polymers and fullerenes, and the packing propensities of the molecules’ π-conjugated moieties.
Regarding packing, for example, the hexyl chains in annealed (heat treated) P3HT films form lamellar structures that pack via π-π interactions with a spacing of about 3.5 Å, which is roughly half the packing spacing in PCBM, he says. As a result of the packing mismatch, there is very little order in unannealed films of the blend. Overannealing the films leads to unwanted phase segregation (too much order) and forms highly crystalline regions. Only gently annealed blends adopt the BHJ structure with the molecular order suitable for solar cells (J. Phys. Chem. Lett. 2010, 1, 947).
Controlling molecular order in solar-cell films is one of the key objectives in research led by Scott E. Watkins and Gerard J. Wilson of the Commonwealth Scientific & Industrial Research Organization (CSIRO), Australia’s national lab agency. Just recently, the team used a free-radical polymerization method designed for making well-defined block copolymers to prepare a series of novel benzothiadiazole-containing pendant polymers. Among other findings, the team observed on the basis of atomic force microscopy that films of the block copolymers are far smoother and more ordered than films made from blends of the two small-molecule pendants or blends of the two homopolymers (Macromolecules DOI: 10.1021/ma1008572).
In related work, David J. Jones and Wallace W. H. Wong of Australia’s University of Melbourne, together with CSIRO scientists, showed that using compounds such as hexabenzocoronenes for harvesting light and transporting holes in solar-cell films is advantageous because these small molecules self-assemble, which enhances the rates of those processes (Adv. Funct. Mater. 2010, 20, 927).
Although much of the publicity surrounding solar-cell research focuses on experimental findings, it’s no surprise that computational experts are using their computers to search for promising candidate molecules. It may be surprising, however, to learn that they are doing much of the job with other people’s computers. At Harvard University, Johannes Hachmann, a postdoc working with chemistry professor Alán Aspuru-Guzik, and coworkers carry out these calculations via a large distributed computing network. By signing up, participants around the world allow their computers to be used when they would otherwise be idle. Hachmann notes that the second phase of this effort, called the Clean Energy Project, began this summer. “During the first month we already screened nearly 200,000 molecules in 2 million quantum chemistry calculations,” he says.
The process of designing new organic compounds for photovoltaics, calculating their properties, synthesizing and purifying the molecules, analyzing the films formed from them, and testing the solar cells in which they are incorporated is lengthy indeed. “It takes several weeks to put a new polymer through the battery of tests,” Konarka’s Gaudiana says. “It’s a slow process and requires a lot of work,” he admits, “but it’s the only way to understand what’s going on in detail. Without that level of understanding, there’s little chance for improvement.”

Sunday, August 22, 2010

French team smashes five year efficiency record in eco-marathon

A five-year Shell Eco Marathon fuel-efficiency record has been smashed by a team of French students. Team Polyjoule broke the record on the first day of the event and then went on to beat its own mark by a further 482 kilometers. But the students still expect even more from their hydrogen-fueled vehicle and are already looking toward next year's marathon.
ETH Zurich of Switzerland achieved the equivalent of 3,836 kilometers on just one liter of fuel in 2005, setting a bar that no one had been able to top - until now. A joint effort by Polytech Nantes and Lycée La Joliverie smashed the record on the very first day of this year's Eco Marathon, recording 4,414 kilometers on the equivalent of one liter of fuel (10,382 mpg).
Team Polyjoule had a shaky start which threatened to hamper any attempts, after their hydrogen-powered prototype broke down during pre-marathon testing. Once they pooled resources with Lycée La Joliverie, however, they proceeded to stomp all over the Swiss record. The feat is said to have been made possible by enhancing their vehicle’s electronics monitoring system, which minimizes energy loss.
The French students were not quite finished with Shell's Eco Marathon, though, which saw over 200 teams taking part. On the very last day of the annual event, they added another 482 km to their own record, setting a new official world record of 4,896.1 kilometers per liter of fuel - a distance "roughly the equivalent of driving from the head to toe of Europe, from the North Cape in Norway down to the toe of the Italian peninsula."
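The mpg equivalents quoted above follow from a standard unit conversion (US gallons assumed):

```python
KM_PER_MILE = 1.609344
LITERS_PER_US_GALLON = 3.785411784

def km_per_liter_to_mpg(kml):
    """Convert fuel economy from kilometers per liter to US miles per gallon."""
    return kml * LITERS_PER_US_GALLON / KM_PER_MILE

print(f"{km_per_liter_to_mpg(4414):.0f} mpg")    # first-day record
print(f"{km_per_liter_to_mpg(4896.1):.0f} mpg")  # final world record
```

The first-day figure reproduces the 10,382 mpg quoted above; the final record works out to roughly 11,500 mpg.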
Polytech Nantes team leader Pauline Tranchard praised the team effort, which saw the students not only set a new world record but also take first place in the fuel cell category, which led them to an overall first place victory. "Five years’ research went into getting us to 4,896 kilometers on one liter of fuel," she said. "Our insight and the wealth of experience that our colleagues from the Lycée de La Joliverie de Nantes brought to the table were both instrumental in helping us reach what many might have considered an unattainable goal."
Tranchard believes that the team can do even better but will have to wait until next year to prove it.

Aerofarms urban agriculture system - less space, less water and no pesticides

With increasing pressure on global food supplies requiring ever more intelligent use of technology, urbanized vertical aeroponic methods are shaping up as a promising alternative to traditional farming. Aeroponics requires less space, less water and no pesticides and the AeroFarms system takes things further by using LEDs in stacked units to maximize efficiency and use of available space.
The AeroFarms system allows leafy greens and herbs in particular to be grown at room temperature indoors in urban environments. As soil is replaced by a proprietary reusable cloth growing medium, there's no washing of produce required, resulting in an increase in shelf life of anywhere from one to four weeks depending on what's being grown. In addition, the indoor growing environment and shortened growth cycle keep pests out, allowing produce to be grown pesticide-free. Aeroponic methods also use less than 20% of the water required by traditional agricultural methods and less than 80% of that required by hydroponic methods. Finally, because crops can be grown right where they are consumed, transportation costs are almost negligible compared with conventional agriculture, which cannot operate in urban areas.
One criticism often leveled at aeroponic systems that use artificial light is that they require a significant amount of energy. AeroFarms seeks to minimize energy use through LEDs, which have nearly five times the life expectancy of high-pressure sodium (HPS) lamps, can be placed closer to the plants (which assists stacking) and can be designed to distribute light evenly over the crop. AeroFarms has also conducted research into which specific wavelengths of light growing plants require. By using LEDs that target those wavelengths, the company believes significant energy savings can be achieved.
It should be pointed out, however, that the AeroFarms system is not for hobby farmers. But if you're thinking of setting up a commercial farm in an urban setting, the company claims a 20 to 33 percent return on investment.

100MW concentrated solar power plant to be built in the UAE

The largest concentrated solar power (CSP) plant in the Middle East is to be built in Madinat Zayed, approximately 120 km (75 miles) southwest of Abu Dhabi in the United Arab Emirates (UAE). When it becomes operational in 2012, the plant, dubbed Shams 1, will feature some 6,300,000 square feet of solar parabolic collectors, cover 741 acres of desert and produce enough electricity to power 62,000 households.
With a capacity of approximately 100MW and a solar field consisting of 768 parabolic trough collectors, Shams 1 represents one of the first steps in the region towards the introduction of sustainable energy sources in an energy market that until now has depended mostly on hydrocarbons. It is expected to displace approximately 175,000 tonnes of CO2 per year, equivalent to planting 1.5 million trees or removing 15,000 cars from Abu Dhabi’s roads.
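Dividing the plant's nameplate capacity across those households (a rough sketch that ignores capacity factor, storage and transmission losses) gives:

```python
capacity_w = 100e6     # ~100 MW nameplate capacity
households = 62_000    # households the plant is said to power

kw_per_household = capacity_w / households / 1000
print(f"~{kw_per_household:.2f} kW of nameplate capacity per household")
```

About 1.6 kW of peak capacity per household, a plausible ratio given that a CSP plant only delivers its full output during hours of strong sunshine.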
The plant will generate solar thermal electricity through focused sunlight, concentrated by the plant’s parabolic trough collectors, heating a coolant which then generates high-pressure steam that drives a conventional steam turbine. The same technology is being implemented in large-scale commercial solar thermal power stations in Spain and northern Africa.
Shams (which is Arabic for sun) 1 will be built, owned and operated by a consortium including Masdar, an Abu Dhabi renewable energy company, Abengoa Solar, a technology company that will supply the parabolic trough collectors, and Total, one of the world’s major oil and gas groups. Masdar will own a 60 percent share of the plant, while an Abengoa Solar and Total joint venture will own the other 40 per cent.
The plant will directly contribute towards Abu Dhabi’s target of achieving seven percent renewable power generation capacity by 2020. It has also been approved for a solar incentive premium in the form of a long-term Green Power Agreement with the Abu Dhabi Government, under which the electricity it generates will be sold to the Abu Dhabi Water and Electricity Company (ADWEC) under a long-term sales contract.
Construction of Shams 1 will commence in mid-2010, and it is due to go on line in 2012.

SunPower claims new solar cell efficiency record of 24.2 percent

Although we’ve seen sunlight to electricity conversion efficiencies of over 40 percent with multi-junction solar cells in lab environments, most mass-produced cells can only boast a conversion rate of around 15 percent. Now SunPower Corp., a Silicon Valley-based manufacturer of high-efficiency solar cells, solar panels and solar power systems, has claimed a new world record solar cell efficiency of 24.2 percent.
Solar cell efficiency is the rate at which the cells capture and convert sunlight into energy. The 24.2 percent efficiency record for large-scale silicon wafers was confirmed by the U.S. Department of Energy’s National Renewable Energy Lab (NREL) on a full-scale prototype produced at the SunPower Corp.’s manufacturing plant in the Philippines.
"This new world record demonstrates SunPower's ability to extend our lead in manufacturing the world's highest efficiency solar cells," said Bill Mulligan, vice president of technology and development for SunPower. "Our patented and proprietary, high-efficiency solar cell technology drives down the cost of solar energy by increasing the energy production from each solar panel."
Improved cell efficiency is a much sought after goal of researchers and manufacturers of solar cells as it increases the cost effectiveness of solar cells by allowing the equivalent or greater amount of power to be captured using the same area of solar cells.

On the road to cleaner air with air-purifying concrete

Although much of the focus of pollution from automobiles centers on carbon emissions, there are other airborne nasties spewing from the tailpipes of fossil fuel-powered vehicles. These include nitrogen oxides (NOx). In the form of nitrogen dioxide, NOx reacts with water vapor to form nitric acid – a major constituent of acid rain – and, in the presence of sunlight, it reacts with other pollutants to form ozone and smog. Everyone is exposed to small amounts of nitrogen oxides in ambient air, but exposure to higher amounts, in areas of heavy traffic for example, can damage respiratory airways. Testing has shown that surfacing roads with air-purifying concrete could make a big contribution to local air purity by reducing the concentration of nitrogen oxides by 25 to 45 percent.
Last fall in the municipality of Hengelo, the Netherlands, researchers at the Eindhoven University of Technology (TU/e) resurfaced around 1,000 square meters of the busy Castorweg Road with air-purifying concrete paving stones, while another area of 1,000 square meters was surfaced with normal paving stones. The air-purifying concrete contains titanium dioxide, a photocatalytic material that removes the nitrogen oxides from the air and converts them into harmless nitrate with the aid of sunlight. The nitrate is then rinsed away by rain.
The researchers carried out three air-purity measurements on the Castorweg last spring, at heights of between a half and one-and-a-half meters. Over the area paved with air-purifying concrete, the NOx content was found to be 25 to 45 percent lower than over the area paved with normal concrete. Further measurements are planned for later this year.
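The reduction figure is simply the NOx concentration over the treated pavement relative to that over the control pavement. A minimal sketch, using hypothetical readings rather than the actual TU/e measurements:

```python
# How a "25 to 45 percent lower" figure is computed: compare NOx
# concentration over the air-purifying pavement against the control
# section. The readings below are illustrative, not TU/e data.

def nox_reduction(control_ugm3, treated_ugm3):
    """Percent reduction of NOx over treated pavement vs. control."""
    return (control_ugm3 - treated_ugm3) / control_ugm3 * 100

# Hypothetical concentrations in micrograms per cubic meter:
control = 60.0   # over normal paving stones
treated = 39.0   # over TiO2-coated paving stones

print(f"NOx reduction: {nox_reduction(control, treated):.0f}%")
```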
“The air-purifying properties of the new paving stones had already been shown in the laboratory, but these results now show that they also work outdoors,” said Prof. Jos Brouwers.
Brouwers, who has been professor of building materials in the TU/e Department of Architecture, Building and Planning since September 2009, sees numerous potential applications, especially at locations where the maximum permitted NOx concentrations are now exceeded. Aside from the reduction in nitrogen oxides, the stones also have another advantage: they break down algae and dirt, so that they always stay clean.
The concrete stones used in the tests are made by paving stone manufacturer Struyk Verwo Infra, and are already available for use. For roads where an asphalt surface is preferred the air-purifying concrete can be mixed with open asphalt, according to Brouwers. It can also be used in self-cleaning and air-purifying building walls.
Brouwers says the use of air-purifying concrete does not have a major impact on the cost of a road. Although the stones themselves are 50 percent more expensive than normal concrete stones, the total road-building costs are only ten percent higher.
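Those two figures imply something about how much of a road's cost is in the stones themselves. A quick sanity check of the arithmetic (our interpretation, not Brouwers' breakdown):

```python
# If the stones cost 50% more but the total road cost rises only 10%,
# the stones must account for roughly 20% of a normal road's cost:
#   total = (1 - f) + (1 + stone_premium) * f  =  1 + stone_premium * f
# Solving 1 + 0.5 * f = 1.10 gives f = 0.20.

stone_premium = 0.50   # air-purifying stones cost 50% more
total_premium = 0.10   # total road-building cost rises 10%

stone_fraction = total_premium / stone_premium
print(f"Implied share of stones in total road cost: {stone_fraction:.0%}")
```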

Air-cleaning paving slabs assessed in Germany

Last month, we told you about an experiment with air-purifying concrete that was recently conducted in the Netherlands. Researchers resurfaced 1,000 square meters of a busy road with concrete paving stones that contained titanium dioxide (TiO2), a photocatalytic material that removes automobile-produced nitrogen oxides (NOx) from the air and converts them into nitrate with the aid of sunlight. When the air was tested up to one-and-a-half meters above those stones, NOx levels were found to be 25 to 45 percent lower than above regular concrete on the same road. Now, a similar study is underway in Germany, and is already showing promising results.
Last year, at 55 percent of urban air monitoring stations in Germany, NOx levels over the maximum permissible limit were recorded. One particularly bad street was Petersberger Straße in the city of Fulda. For the study, the length of that street is being covered with paving slabs coated with TiO2. The Air Clean slabs were developed by F. C. Nüdling Betonelemente, and their effectiveness was subsequently proven by researchers at the Fraunhofer Institute for Molecular Biology and Applied Ecology IME.
Finding the right formulation for the slabs was a tricky process. The team at F. C. Nüdling tried various surfaces, colors, and TiO2 contents, and ultimately ended up having to create their own cement formula, as traditional mixes proved unsatisfactory. The slabs were then laid down in a specially-created street canyon testing ground, and left for an extended period of time. Over the course of the test, NOx degradation rates of 20 to 30 percent were recorded at a height of three meters above the slabs, in moderate to light wind conditions. When there was no wind, degradation rates were as high as 70 percent for both nitrogen monoxide (NO) and nitrogen dioxide (NO2).
The Air Clean slabs are already in use at the Gothaer Platz in Erfurt, where degradation rates of 20 percent for NO2 and 38 percent for NO were recorded. After 14 to 23 months of use, the slabs appear to be just as effective at neutralizing pollutants as when they were first installed.
The researchers from Fraunhofer also wanted to know what happened to the nitrate that was the end result of the NOx conversion process. As was the case in the Netherlands, rain washed it off the roads, down the storm sewers, into water treatment plants, and ultimately into the rivers and groundwater. In local water sources, the highest amount that the German team could trace back to the paving stones was about five milligrams per liter, which is well below the maximum permissible 50 milligrams per liter.

Scientists create a multitool for working with nanoparticles

If you had to sort a bunch of nanoparticles by size, what would you use? A microscope, tweezers, and a very finely-calibrated caliper? Actually, you’d probably use the nanofluidic “multi-tool” created by researchers at the National Institute of Standards and Technology (NIST) in the U.S. Before you start picturing a teeny-tiny Leatherman, which would admittedly be pretty cool, you should be aware that the NIST device is more like a coin separator that sorts your nickels, dimes and quarters. In this case, however, they would be nickels, dimes and quarters that are smaller than a bacterium.
The device was actually created a year ago, but has just been showcased in an article in the journal Lab on a Chip. That article outlined how the device recently performed the first of a planned series of nanoscale tasks – it successfully separated and measured a mixture of spherical nanoparticles of different sizes (ranging from about 80 to 250 nanometers in diameter) dispersed in a solution.
Viewed in cross-section, the device is a wedge-shaped chamber, with a flat roof and a broad, staircase-like floor. The chamber is tallest at the front and becomes progressively shallower toward the back, as the roof and floor close together one precisely-calibrated step at a time.
Using electrophoresis, a method of moving charged particles through a solution by forcing them forward with an applied electric field, the nanoparticles were channeled into the chamber and up the staircase. The larger particles got stuck relatively near the front, as they could no longer squeeze between the floor and the roof, while the smaller particles were able to move farther back before they also were stopped. Ultimately, all the particles ended up at specific steps of the chamber, as dictated by their size.
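The separation principle is simple enough to sketch in a few lines of code. This is a toy model of the staircase idea, not NIST's software, and the step gap heights are illustrative assumptions:

```python
# Toy model of the NIST staircase separation: the roof-to-floor gap
# shrinks step by step, and the field drives each particle forward until
# the next gap is smaller than its diameter. Gap values are assumptions.

def stopping_step(gaps_nm, diameter_nm):
    """Return the index of the last step a particle can enter."""
    stuck = 0
    for i, gap in enumerate(gaps_nm):
        if diameter_nm > gap:   # particle can't squeeze into this step
            break
        stuck = i               # particle fits; it advances to step i
    return stuck

# Hypothetical gap heights (nm), from chamber entrance to the back:
gaps = [300, 250, 200, 150, 100, 80, 60]

# Diameters spanning the ~80-250 nm range used in the experiment:
for d in (250, 120, 80):
    print(f"{d} nm particle stops at step {stopping_step(gaps, d)}")
```

As in the real device, the largest particles get stuck near the entrance while the smallest travel farthest back, so each step collects a distinct size class.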
“Integrated into a microchip, the device could enable the sorting of complex nanoparticle mixtures, without observation, for subsequent application,” NIST stated in its press release. “This approach could prove to be faster and more economical than conventional methods of nanoparticle sample preparation and characterization.”
The research team is now looking at using the device for sorting nanoparticles by other criteria, such as shape or composition, instead of size.