Saturday, September 11, 2010

Oral Biologists Use Chemistry To Formulate Cavity Fighting Mints

Oral biologists have formulated a mint that fights cavities with an ingredient called Cavistat. Cavistat contains two main components that protect the teeth. First, certain oral bacteria metabolize the amino acid arginine, and the byproducts neutralize the acid generated by sugars. This raises the pH in the mouth, helping prevent damage to teeth. Cavistat also introduces other chemical compounds that protect the minerals of the teeth from dissolving.
Sodas, candy and processed foods are packed with tooth-decaying, cavity-causing sugar. For the past 40 years, experts have seen a decrease in the amount of tooth decay in children; but according to Centers for Disease Control statistics, the trend is reversing. To tackle the problem, one dental scientist has found a way to use candy to help prevent cavities.
Tooth decay in kids has increased 28 percent in the past eight years. Experts believe too many sugary, processed foods and not enough brushing are to blame. A key factor in fighting cavities is found in your mouth.
"Saliva is the great protector against cavities," said Israel Kleinberg, D.D.S., Ph.D., an oral biologist at Stony Brook University in Stony Brook, N.Y.
Dr. Kleinberg says 40 years of research and more than $1 billion have been spent trying to figure out what saliva has that fights tooth decay.
"I'm one of the pioneers in that as a whole new science," Dr. Kleinberg said. "It's where one mixes dentistry and biochemistry."
Dr. Kleinberg discovered how saliva's chemistry helps teeth neutralize the acidity created from eating food by balancing the pH levels in the mouth.
"[It's] like if you've got a swimming pool," Dr. Kleinberg said. "You have got to get the pH right. If you've got a neutral pH, you've got the ideal condition."
He developed a candy to fight cavities. The candy is fluoride-free and protects teeth in two ways. First, it raises pH levels to neutralize more acid than saliva alone. Second, it protects the minerals in tooth enamel. Arginine, an amino acid, combines with calcium in Cavistat, the candy's main ingredient, and sticks to teeth -- leaving behind a layer of protection.
Kids who ate two mints twice a day for one year had 68 percent fewer cavities in their molars than children who didn't chew the mints.
"The number of cavities, we think that ultimately is going to get to almost zero," Dr. Kleinberg said.
That would bring a smile to just about everyone's face.
All the ingredients in the mints are natural and considered foods, so the product doesn't need FDA approval.
WHAT DOES IT DO? BasicMints contain Cavistat, a cavity-fighting agent with two major components that alter oral chemistry and biology in two ways. First, Cavistat introduces an amino acid called arginine to the mouth. When bacteria in the mouth break the arginine down, the byproducts neutralize the acid generated by sugars in food, which reduces the amount of acid in the mouth and helps prevent damage to teeth. Additionally, Cavistat adds other chemical compounds that protect the minerals that make up teeth from dissolving.
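The pH chemistry at work here can be illustrated with a toy calculation (the numbers below are illustrative, not from the article): pH is the negative log of the hydrogen-ion concentration, so neutralizing most of the free acid raises the pH sharply.

```python
import math

def ph(h_conc_mol_per_l):
    """pH from hydrogen-ion concentration (mol/L)."""
    return -math.log10(h_conc_mol_per_l)

# Illustrative: acidic plaque at pH 5. Enamel begins to
# demineralize below a pH of roughly 5.5.
acidic = 1e-5            # mol/L of free H+ at pH 5
print(round(ph(acidic), 1))        # 5.0

# Neutralizing 90% of the free H+ (e.g. by basic byproducts of
# arginine metabolism) raises pH by a full unit.
print(round(ph(acidic * 0.1), 1))  # 6.0
```

Each tenfold drop in acid concentration buys one pH unit, which is why even partial neutralization moves the mouth meaningfully back toward the neutral "swimming pool" condition Kleinberg describes.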
ANATOMY OF A TOOTH: We think of teeth as being the part visible above the gum, but this is only the tip, or crown, of a tooth. There is also a neck that lies at the gum line, and a root, located below the gum. The crown of each tooth has an enamel coating to protect the underlying dentine. Enamel is even harder than bone, thanks to rows of tightly packed calcium and phosphorus crystals. The underlying dentine is slightly softer, and contains tiny tubules that connect with the central nerve of the tooth within the pulp. The pulp forms the central chamber of the tooth, and is made of soft tissue containing blood vessels that carry nutrients to the tooth. It also contains nerves so teeth can sense hot and cold, as well as lymph vessels to carry white blood cells to fight bacteria.

Steampunk chip takes the heat

Steampunk, the reimagining of modern day technology through a Victorian perspective, has found an unlikely follower in the US Defense Advanced Research Projects Agency (DARPA). A DARPA-funded project has reinvented a type of logic gate in the style of Victorian inventor Charles Babbage – not for aesthetic reasons, but because the retro device works at temperatures too high for conventional transistors. It could therefore find uses in, say, jet and rocket engine electronics.
Babbage famously designed mechanical computers through which data would circulate as steam-driven pistons turned cogs and levers. His unwieldy contraptions were superseded by electronic computers, through which data is transmitted via vast arrays of transistors in the form of varying voltages.
In a transistor, the voltage applied to one of its terminals, the gate, determines whether a current flows through the device. But above 250 °C the device becomes so awash with thermally generated electrons that current leaks through even when the transistor is supposedly off, rendering it useless. Even silicon carbide, the semiconductor material hardiest against heat, doesn't remedy the situation.

Modern Morse key

That prompted Te-Hao Lee's team at Case Western Reserve University in Cleveland, Ohio, to consider returning to mechanical logic. His team has developed a mechanical version of an inverter – the building block used to construct many types of logic gate, which themselves are a fundamental component of digital circuitry within computers. The device uses an arrangement of nanoscale levers instead of transistors. Like a telegraph operator's Morse key, these levers physically make and break contact to pass or block currents.
Application of a voltage makes the levers move under electrostatic attraction. At 550 °C, Lee's team managed to get the inverter to switch on and off 500,000 times a second, performing a computation with each cycle. The faster the switching speed, the zippier the computing. Lee predicts that switching speeds of a billion times a second (1 gigahertz) are possible. That might not sound fast by the standards of desktop PCs, which often run at speeds in excess of 2.5 gigahertz, but for control-system applications it's more than adequate. The leakage current was too small to measure, showing the researchers had overcome the loss that cripples transistors at these temperatures.
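The logic of an inverter, mechanical or electronic, is easy to sketch. The toy model below is not the team's code; it simply treats each lever as a contact whose output is high only when its input is low, and puts the demonstrated and predicted switching rates side by side.

```python
def inverter(x: bool) -> bool:
    """One lever stage: output is high only when the input is low."""
    return not x

# Chaining two inverters restores the signal (a buffer); composing
# such stages is how richer logic gates are built up.
for signal in (False, True):
    assert inverter(inverter(signal)) == signal

# One computation per switching cycle, as reported:
demonstrated_hz = 500_000       # measured at 550 degrees C
predicted_hz = 1_000_000_000    # Lee's 1-gigahertz projection
print(predicted_hz // demonstrated_hz)  # 2000
```

A factor of 2,000 still separates the lab demonstration from the gigahertz projection, which gives a sense of how early-stage the device is.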

Lever logic

They are not there yet, though. Some levers have been melting and breaking after 2 billion cycles. "We are not sure why yet. But we think there is a temperature-related electrical spike occurring during the switching operation," says Lee. He is confident of fixing it, however, and going on to develop other types of lever-based logic gates.
We can expect more such developments, says David Wright, an electronics engineer at the University of Exeter, UK. "Mechanical memory and logic devices are being developed by several groups worldwide."

Green machine: Squeezing solar juice from jellyfish

Sea power (Image: Denise Allen/CorruptKitten/Flickr) 

Silicon solar cells are so, well, dead. Dollops of green goo made of living cells – from jellyfish to algae – are now being recruited to produce cheaper solar power.
Zackary Chiragwandi at Chalmers University of Technology in Gothenburg, Sweden, and colleagues are developing a photovoltaic device based on green fluorescent protein (GFP) from the jellyfish Aequorea victoria.
The team deposit two aluminium electrodes with a tiny gap between them onto a silicon dioxide substrate. A droplet of green fluorescent protein is then added on top, whereupon the protein assembles itself into strands between the electrodes.
When exposed to ultraviolet light, the GFP absorbs photons and emits electrons, which travel around a circuit to produce electricity.
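As a rough sanity check on the physics, the energy delivered by each absorbed photon can be computed from Planck's relation. The ~395 nm wavelength used below is wild-type GFP's commonly cited UV excitation peak, an assumption not stated in the article.

```python
# Energy of one UV photon at GFP's excitation peak (E = h*c/lambda).
# The 395 nm figure is wild-type GFP's commonly cited UV excitation
# maximum, used here as an assumption.
H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
EV = 1.602e-19   # joules per electron-volt

wavelength_m = 395e-9
energy_j = H * C / wavelength_m
print(round(energy_j / EV, 2))  # ~3.14 eV per absorbed photon
```

At around 3 eV per photon, each absorption event carries more than enough energy to liberate an electron into the external circuit.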

Cheap goo

The green goo acts like the dye used in current "dye-sensitised" solar cells, called Grätzel cells.
However, unlike such cells, the GFP does not require the addition of expensive materials, such as titanium dioxide particles. Instead, the GFP can be placed directly on top of the electrode, simplifying the design and reducing overall cost.
The team have also used the proteins to create a biological fuel cell that generates electricity without the need for an external source of light.
Instead, they used light emitted from a mixture of chemicals such as magnesium and the luciferase enzymes found in fireflies (Lampyridae) and sea pansies (Renilla reniformis) to generate electricity from the jellyfish biophotovoltaic device.
Such a fuel cell could be used to power nano-devices embedded in living organisms, says Chiragwandi, for example to diagnose disease.

Algaelectricity

Jellyfish are not the only sea creatures that can be exploited to generate energy: algae could power floating devices on the ocean waves. Adrian Fisher and Paolo Bombelli at the University of Cambridge and colleagues are developing biophotovoltaic devices based on algae and photosynthetic bacteria.
The team deposit a film of photosynthetic cells on top of a transparent conductive electrode, which faces a carbon cathode seeded with platinum nanoparticles.
When exposed to sunlight the algal cells begin splitting water and producing oxygen, electrons and protons. These would usually be used by the algae to convert carbon dioxide into organic compounds, but instead the device siphons them off to generate electricity, says Fisher. "The algal cells produce electrons very generously," he says.
The team has so far used a proof-of-concept device to power a clock. The sunlight-to-electricity efficiency of the device is only 0.1 per cent at present, compared with between 10 and 15 per cent for existing dye-sensitised solar cells, however. Screening different algae species to find the most productive electron donor might be one way to produce more juice.
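Those efficiency figures translate directly into power per unit area. The sketch below assumes a standard full-sun irradiance of 1000 W/m², a test-condition figure not given in the article.

```python
# Power density implied by the quoted efficiencies. The 1000 W/m^2
# full-sun irradiance is a standard test-condition assumption.
IRRADIANCE_W_PER_M2 = 1000.0

def output_w_per_m2(efficiency):
    """Electrical output per square metre of collector."""
    return IRRADIANCE_W_PER_M2 * efficiency

print(output_w_per_m2(0.001))  # algal device today: 1.0 W/m^2
print(output_w_per_m2(0.10))   # dye-sensitised cell, low end: 100.0 W/m^2
```

A hundredfold gap in power density is why the Cambridge team is pinning its hopes on cost and siting advantages rather than raw efficiency.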
Eventually, algal cells could float out at sea, generating electricity from sunlight and seawater. "We might end up with less efficiency than [conventional] photovoltaics, but we think we can win on cost, and we don't require space where people want to live," says Fisher.

With China Clamping Down on Rare-Earth Metals, Japanese Manufacturers Devise Clever Alternatives

China produces the vast majority of the world's rare-earth oxides. (Image: Wikimedia Commons)

If necessity is the mother of invention, maybe China is the wicked stepmother. In an effort to thwart Chinese restrictions on rare-earth metal exports, Japanese manufacturers have developed technology that can make motors without them.
Hitachi has come up with a motor that uses a ferrite magnet, made of ferric oxide. The material is the main source of iron for the steel industry, and it’s cheaper and more common than the rare-earth metals typically used to make electric car motors.
Quoting the Nikkei business daily, business blogs are reporting today that Hitachi hopes to use the motors for hybrid car manufacturing. The ferrite motor is not yet powerful enough for a car, but Hitachi will also use it in air conditioners, Tech Eye reports.
What’s more, the chemical firm Teijin and Tohoku University have developed technology to make a powerful magnet using a new composite made of iron and nitrogen, Forbes.com reports.
Japan and China are the world’s biggest users of rare-earth metals, which are used to produce small batteries for hybrid cars and handheld gadgets. The metals, 17 in all, are also used to make lasers, magnets, camera lenses, computer memory chips and more.
China produces about 90 percent of the world’s rare-earths, and announced in July that it would slash exports by 40 percent. As Forbes.com reports, China said the move was meant to protect the environment; others claim restricting supplies could give Chinese manufacturers an edge.
Rare-earths are also used in weapons systems, so China’s rare-earth wealth has sparked a flurry of U.S. government reports on how to obtain a home-grown supply. Until mining firms ramp up production, innovation seems like a smart solution.

Dog Poo Powers a Streetlight In Massachusetts Park

Park Spark via Fast Company


Good dog parents might think they’re doing their part by using biodegradable baggies to pick up after their pooches. But after Fido’s feces go in the trash can and to a landfill, they release methane gas, a significant contributor to the greenhouse effect. A dog park in Cambridge, Mass., has a solution: Add in a methane digester, and let your dog waste power the streetlights, tea cart and popcorn machine.
The Park Spark methane digester, unveiled this week, only powers a streetlight for now — no poop-powered popcorn yet. But it’s a neat concept: Replace trash cans with a public methane digester, and you demonstrate how simple it can be to turn waste into fuel.
“As long as people own pets in the city and throw away dog waste, the production of energy will be continuous and unlimited,” the project’s Web site says.
The project involves three basic steps: Throw your dog’s waste into the digester, where anaerobic bacteria are ready to break it down. Stir the mixture to help methane rise to the top, and burn the methane to generate light or electricity.
After picking up their dogs' waste in biodegradable bags, visitors to the Park Spark digester can feed the waste through an above-ground tube, and stir it with a hand crank. The bacteria container is buried underground and the methane is piped through the ground to the streetlamp, which burns with an eternal flame. Eventually, the project leaders want to use dog-generated methane to power vendor carts selling human food.
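For a sense of scale, a back-of-the-envelope estimate of the digester's daily output can be made from typical anaerobic-digestion yields. Every figure below is an illustrative ballpark assumption, not a number from the Park Spark project.

```python
# Rough daily energy estimate for a park-scale digester. All figures
# are illustrative assumptions, not Park Spark measurements.
DOGS_PER_DAY = 100
WASTE_KG_PER_DOG = 0.2           # wet waste deposited per visit
VOLATILE_SOLIDS_FRACTION = 0.2   # digestible fraction of wet waste
CH4_M3_PER_KG_VS = 0.3           # typical anaerobic methane yield
KWH_PER_M3_CH4 = 10.0            # approx. energy content of methane

methane_m3 = (DOGS_PER_DAY * WASTE_KG_PER_DOG
              * VOLATILE_SOLIDS_FRACTION * CH4_M3_PER_KG_VS)
print(round(methane_m3, 2))                   # ~1.2 m^3 of methane/day
print(round(methane_m3 * KWH_PER_M3_CH4, 1))  # ~12.0 kWh/day
```

Even with these generous assumptions the output is modest, a few kilowatt-hours a day, which is why a single streetlamp is a realistic first load.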
Conceptual artist Matthew Mazzotta came up with the idea, which is partially funded by MIT.

Friday, September 10, 2010

Energy Technologies Not Enough to Sufficiently Reduce Carbon Emissions, Expert Concludes

Current energy technologies are not enough to reduce carbon emissions to a level needed to lower the risks associated with climate change, New York University physicist Martin Hoffert concludes in an essay in the latest issue of the journal Science.
Many scientists have determined that in order to avoid the risks brought about by climate change, steps must be taken to prevent the mean global temperature from rising by more than 2°C above pre-industrial levels. Current climate models indicate that achieving this goal will require limiting atmospheric carbon dioxide (CO2) concentrations to less than 450 parts per million (ppm), a level that implies substantial reductions in emissions from burning fossil fuels.
The present atmospheric level of CO2 is approximately 385 ppm, some 100 ppm above the pre-industrial level of about 280 ppm. It is expected to rise in future years.
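A quick calculation shows how little headroom those numbers leave. The ~2 ppm annual rise used below is the approximate observed rate around 2010, an assumption rather than a figure from Hoffert's essay.

```python
# Years remaining until the 450 ppm threshold at a constant growth
# rate. The ~2 ppm/year rise is the approximate observed rate around
# 2010, used here as an assumption.
current_ppm = 385.0
limit_ppm = 450.0
growth_ppm_per_year = 2.0

years_left = (limit_ppm - current_ppm) / growth_ppm_per_year
print(years_left)  # 32.5
```

On a straight-line extrapolation the 450 ppm ceiling arrives in roughly three decades, which frames the urgency of the emissions cuts Hoffert discusses.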
"So far, efforts to curb emissions through regulation and international agreement haven't worked," Hoffert writes. "Emissions are rising faster than ever, and programs to scale up 'carbon neutral' energy sources are moving slowly at best."
Hoffert points to a pair of factors that show why current energy technologies are not sufficient to reduce carbon emissions to a level advocated by scientists.
One, alternative energy sources, such as solar and wind electricity, are not adequate to achieve "massive market penetration," which requires utility-scale systems that can store intermittent supplies of power until they are needed.
While Denmark and Norway have developed methods for this type of storage, these aren't "widely feasible in the United States, and other approaches to store power are expensive and need substantial research and testing," Hoffert contends.
Two, reliance on carbon-emitting fuels is once again growing.
"As natural gas and oil approach peak production, coal production rises, and new coal-fired power plants are being built in China, India, and the United States," writes Hoffert, a professor emeritus in NYU's Department of Physics.
Hoffert offers an array of approaches that would bring about new technologies while at the same time reducing the world's reliance on fossil fuels.
"Broad investment will be crucial to enabling basic research findings to develop into applied commercial technologies," he writes. "Carbon taxes and ramped-up government research budgets could help spur investments. But developing carbon-neutral technologies also requires, at the very least, reversing perverse incentives, such as existing global subsidies to fossil fuels that are estimated to be 12 times higher than those to renewable energy."

Structure for Three Intrinsically Disordered Proteins Determined

Researchers used a variety of experimental, mathematical and observational techniques to ascertain how I-2, one of a class of poorly understood proteins known as intrinsically disordered proteins, binds with the regulator protein phosphatase 1.

Most proteins are shapely. But about one-third of them lack a definitive form, at least one that scientists can readily observe. These intrinsically disordered proteins (IDPs) perform a host of important biological functions, from muscle contraction to neuronal signaling. Yet despite their importance, "We don't know much about them," said Wolfgang Peti, associate professor of medical science and chemistry. "No one really worried about them."
Now, Peti, joined by researchers at the University of Toronto and at Brookhaven National Laboratory in New York, has discovered the structure of three IDPs -- spinophilin, I-2, and DARPP-32. Besides getting a handle on each protein's shape, the scientists present for the first time how these IDPs exist on their own (referred to as "free form") and what shape they assume when they latch on to protein phosphatase 1, known as "folding upon binding."
The findings are reported in the journal Structure.
Determining the IDPs' shape is important, Peti explained, because it gives molecular biologists insight into what happens when IDPs fold and bind the proteins they regulate, such as PP1 -- an interaction that must occur for biological instructions to be passed along.
"What we see is some amino acids don't have to change much, and some have to change a lot," Peti, a corresponding author on the paper, said. "That may be a signature [of] how that (binding) interaction happens."
For two years, the researchers used a variety of techniques to ascertain each IDP's structure. With I-2, which instructs cells to divide, they used nuclear magnetic resonance spectroscopy to create ensemble calculations for the protein in its free and PP1-bound form. They confirmed I-2's binding interaction with PP1 (known as the PP1:I-2 complex) with the help of small-angle X-ray scattering measurements at the National Synchrotron Light Source, located at the Brookhaven lab.
The researchers did the same thing to determine the structure of spinophilin and DARPP-32 in their free-form state and to gain insights into their shapes when they bind with PP1.
"It's analogous to putting a sack cloth over a person," Peti explained. "You can't see the details, but you can get the overall shape. This is really a new way to create a structure model for highly dynamic complexes."
Julie Forman-Kay, a senior scientist at the Hospital for Sick Children in Toronto and a biochemistry professor at the University of Toronto, is a co-corresponding author on the paper. Other authors include Barbara Dancheck and Michael Ragusa, Brown graduate students; Joseph Marsh, a graduate student at the University of Toronto; and Marc Allaire, a biophysicist at the Brookhaven lab.
The U.S. National Institute of Neurological Disorders and Stroke, the Canadian Institutes for Health Research, the Natural Sciences and Engineering Research Council of Canada and the U.S. National Science Foundation Graduate Research Fellowship program funded the research.

Intelligent Battery Project Opens New Ground in Energy Storage Applications

After 30 months of collaboration, project partners Abertax Quality Inc of Malta and Mentzer Electronic GmbH of Germany, with research support from the University of Malta, have delivered an innovative lead-acid and non-lead-acid battery system that gives users unprecedented real-time information on the health and charge level of their batteries. This allows for more efficient and safer charging while saving valuable time and avoiding unnecessary maintenance costs.
The Intelligent Battery system thus represents a true milestone in the evolution of conventional battery systems, which are commonly plagued by a lack of standardization and interoperability of battery and charger parts, leaving users in the dark about vital information on the performance and longevity of their batteries. This information shortfall can create energy waste, extra costs and operational inefficiencies.
At its essence, the Intelligent Battery project strived to make battery use and charging "simple and intelligent while at the same time delivering electricity reliably, efficiently and in a cost effective manner," says Dr Joseph Cilia, C.E.O. and Research Director at Abertax.
To achieve their objectives, the companies realized innovations in at least three areas:
First, the Abertax team of electrical engineers made sure to "design the battery's electronics to match the load," explains Cilia, in reference to the electronic circuitry that is embedded to monitor critical battery 'vital signs' such as the temperature of the battery acid (which should not exceed 45 degrees Celsius) and its charge level. The data obtained by these circuits is sent to a server, where it can be accessed through a variety of interfaces such as desktop computer displays and handheld devices.
Second, the charger, developed in tandem with the battery system, is able to communicate, so to speak, with the battery. This ensures that the battery is charged at optimal level each time, rather than overloaded or even damaged by the charger.
Finally, the battery casing design is modeled on the popular Lego blocks, whereby dimensions are set according to a 2:1 width to length ratio. In practice, this means several batteries can be stacked on top of one another and connected without the need for supplementary connector cables. This design also means the batteries can easily be placed together in various symmetrical combinations in a relatively small area, allowing for easier installation and space optimization.
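The monitoring logic in the first of these innovations can be sketched in a few lines. The names and structure below are hypothetical, not Abertax's actual firmware or server interface.

```python
from dataclasses import dataclass

# Hypothetical sketch of the 'vital signs' monitoring described
# above; class and field names are invented for illustration.
@dataclass
class BatteryReading:
    acid_temp_c: float      # electrolyte temperature
    charge_fraction: float  # 0.0 (empty) .. 1.0 (full)

MAX_ACID_TEMP_C = 45.0  # acid-temperature ceiling cited by Cilia

def vital_signs(reading: BatteryReading) -> dict:
    """Summarise one reading the way a server dashboard might."""
    return {
        "temp_ok": reading.acid_temp_c <= MAX_ACID_TEMP_C,
        "charge_pct": round(reading.charge_fraction * 100),
    }

print(vital_signs(BatteryReading(acid_temp_c=48.5, charge_fraction=0.62)))
# {'temp_ok': False, 'charge_pct': 62}
```

The point of the design is that a reading like this reaches the operator before the battery is overheated or overcharged, rather than after the damage is done.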
While the bulk of the project focused on the creation of the new battery, developing the right charger system to feed the Intelligent Battery was an equally important part of the equation, and one that responds squarely to an imbalance that has plagued the battery market to date. Produced en masse particularly in emerging Asian nations, inexpensive chargers are abundant on the European market. But these chargers are usually not designed in tandem with the batteries they serve, and therefore they cannot communicate critical information about the battery to operators who need to make decisions about charging.
"For cost reasons, a non optimal situation has been accepted by the market," explains Mentzer, whose company developed the charger technology to accompany the battery components developed by Abertax. Consumers have access to cheap, immediate charging solutions. But ultimately they must pay for the design mismatch in the long term. In contrast, the Intelligent Battery charger is about 10 to 20 percent more expensive than the generic alternative, but the extra initial investment is more than recuperated in the medium and long term through prolonged battery life and better performance, says Mentzer.
The Intelligent Battery system has been patented internationally and is set for market distribution and use in a range of applications, including uninterrupted power supply systems in buildings, solar power storage systems and small scale renewable energy systems. Wheelchairs, cleaning applications and small vehicles such as pallet trucks are among the electric vehicles set to be enhanced by the Intelligent Battery, which offers an exciting range of possibilities for users. For example: real time information on a vehicle's power system delivered directly to mobile communication devices.
Home owners who rely on stored energy from renewable sources are equally likely to appreciate reliable, easy-to-access data on the charge level of the batteries they depend on for lighting or operating white goods. This can save invaluable time and cost when servicing or maintenance is required, since a user "will know what he has to ask a technical person to do and what not to do," says Cilia.
The Intelligent Battery project exemplifies how two European enterprises can come together to innovate and bring added value to a highly competitive global market. Driven by the seamless collaboration between Abertax and Mentzer, and fed by the invaluable input of researchers from the University of Malta, the project is a "very good success story" characterized by "excellent" collaboration with Eureka, says Cilia.
Looking ahead, Abertax and Mentzer are planning further collaboration to deliver innovations in the field of lithium-ion batteries, which are used in hybrid and electric passenger vehicles and beyond. Indeed, possibilities for battery systems innovation in the field of electric mobility are limited only by the imagination, and are increasingly compelling given the urgent energy and environmental challenges of our time.

Excessive Drinking May Lead to Poor Brain Health Via Obesity

Prior research has shown that alcohol abuse and dependence are typically associated with higher rates of obesity, as evidenced by a high body mass index (BMI). Findings from a new study of the relationship between BMI and regional measures of brain structure, metabolite concentrations, and cerebral blood flow suggest that alcohol-related brain injuries may result from a complicated fusion of hazardous drinking, chronic cigarette smoking, and even elevated BMI.
Results will be published in the December 2010 issue of Alcoholism: Clinical & Experimental Research and are currently available at Early View.
"Although alcohol doesn't contain fat, it contains seven calories per gram, which comes second only to fat, which has nine calories per gram," said Stefan Gazdzinski, who was a researcher at Northern California Institute for Research and Education in San Francisco when the study was carried out but is now a researcher at Jagiellonian University in Poland. "These calories add up over time. In fact, daily consumption of more than 30 grams of ethanol -- the amount of alcohol in two to three 12-ounce beers -- is associated with risk for abdominal obesity."
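The arithmetic behind Gazdzinski's point is simple enough to spell out. The yearly "fat equivalent" below is a rough conversion using pure fat's 9 kcal/g, not a clinical figure.

```python
# Calorie arithmetic behind the quote: ethanol is second only to
# fat in energy density.
KCAL_PER_G_ETHANOL = 7
KCAL_PER_G_FAT = 9

daily_g_ethanol = 30  # two to three 12-ounce beers, per the study
print(daily_g_ethanol * KCAL_PER_G_ETHANOL)  # 210 kcal/day

# Over a year, expressed as a rough pure-fat equivalent:
annual_kcal = daily_g_ethanol * KCAL_PER_G_ETHANOL * 365
print(round(annual_kcal / (KCAL_PER_G_FAT * 1000), 1))  # ~8.5 kg
```

Two hundred extra kilocalories a day is modest in isolation, but compounded over years it is more than enough to shift BMI into the overweight range the study examines.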
"Abdominal obesity carries higher health risks than fat deposition in other body areas, for example, legs and hips," added Susan F. Tapert, a professor of psychiatry at the University of California, San Diego and director of Substance Abuse/Mental Illness in the VA San Diego Healthcare System. "As obesity rates are increasing rapidly among alcoholics and non-alcoholics, these relationships are important to understand."
"Excessive weight is not only a risk factor for cardiovascular disease or diabetes, but it is also a risk factor for developing dementia," said Gazdzinski, also the study's lead author. "Obesity has been shown to be associated with worse decision making and problem solving throughout life. We had previously observed lower concentrations of some brain metabolites, markers of brain injury, in healthy non-alcohol-dependent people with BMIs in the overweight to obese range. Knowing that individuals in developed countries who overuse alcohol are usually heavier than individuals enjoying alcohol in moderation -- because of the caloric intake -- we wanted to investigate whether excess weight accounts for some of the brain injury usually observed in alcoholics."
Gazdzinski and his colleagues retrospectively analyzed data gathered from 54 treatment-seeking alcoholic men who had been abstinent from alcohol for about one month. BMI was recorded, and brain volume, blood flow, and metabolite concentrations were assessed with a 1.5 Tesla magnetic resonance scanner.
"It is commonly believed that it is the large amount of consumed alcohol by itself that leads to brain injury in alcoholics," said Dieter J. Meyerhoff, professor of radiology at the University of California San Francisco and San Francisco Veteran's Affairs Medical Center, and the principal investigator of this study. "This is only partly correct. In previous studies, we have shown that alcoholics who smoke cigarettes have greater brain injury than nonsmoking alcoholics. This new study suggests that a high BMI, independent of drinking and smoking, is also associated with brain injury."
"In other words, weight also is related to brain health among those with alcoholism," said Tapert. "BMI may be a very important factor to consider when examining other potential consequences of alcohol use. Since individuals who consume substantial amounts of alcohol are at risk for obesity, it is important to understand the influence of body fat deposition on the measures we are examining. It could be that metabolic changes resulting from or causing obesity cause harm to the brain, at least among alcoholics."
The relationship between alcohol dependence and BMI is complicated, added Gazdzinski. "Alcoholics who drink the most are not necessarily the heaviest," he said. "In our sample, there was no correlation between drinking severity and BMI. Factors such as availability of funds for drinking may play a role, especially in countries where alcohol is heavily taxed. For example, the drinker may have not enough money to eat properly after drinking."
"While it is fortunate that tobacco use, violent crime, and some other unhealthy behaviors have declined in recent years, heavy drinking has remained relatively stable, and obesity rates have greatly increased," said Tapert. "These findings point to another deleterious outcome of becoming overweight: poor brain health. While it may be that poor brain cell functioning has led to the challenges these men faced with overconsumption of food and alcohol, it is also possible that the obesity itself contributed to poor brain health. If so, weight loss, exercise, and improved self-care in addition to stopping drinking could result in improvements to brain health."

Researchers create nano-architectured aluminum alloy with strength of steel

Surface etched aluminum bar (Image: Alchemist-hp)

Using a technique that creates a new nanoscale architecture, researchers have created an aluminum alloy just as strong as steel but with enough plasticity to stretch rather than break under stress. Importantly, the technique can be applied to many different metals, and the team plans to work on strengthening magnesium, a metal even lighter than aluminum, which could be used to make strong, lightweight body armor for soldiers.
Dr. Yuntian Zhu, a professor of materials science at North Carolina State University, worked on the project to create the super-strong aluminum alloy along with colleagues from the University of Sydney in Australia; the University of California, Davis; and Ufa State Aviation Technical University in Russia. He says the aluminum alloys have unique structural elements that, when combined to form a hierarchical structure at several nanoscale levels, make them super-strong and ductile.
The aluminum alloys have small building blocks, called “grains,” that are thousands of times smaller than the width of a human hair. Each grain is a tiny crystal less than 100 nanometers in size. Bigger is not better in materials, Zhu says, as smaller grains result in stronger materials. Zhu also says the aluminum alloys have a number of different types of crystal “defects.” Nanocrystals with defects are stronger than perfect crystals.
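The grain-size effect Zhu describes is classically captured by the Hall-Petch relation, which the article does not name: yield strength rises as the inverse square root of grain size. The constants in the sketch below are illustrative, not measured values for this alloy.

```python
import math

# Hall-Petch relation: sigma = sigma0 + k / sqrt(d).
# sigma0 and k below are illustrative constants, not measurements
# from Zhu's alloy.
def yield_strength_mpa(d_nm, sigma0_mpa=20.0, k_mpa_sqrt_nm=700.0):
    """Yield strength (MPa) as a function of grain size d (nm)."""
    return sigma0_mpa + k_mpa_sqrt_nm / math.sqrt(d_nm)

# Strength roughly triples going from 10-micron grains down to the
# sub-100-nanometer grains described in the article.
for d in (10_000, 1_000, 100):
    print(d, round(yield_strength_mpa(d)))
```

The relation breaks down at the very smallest grain sizes, which is one reason the team's hierarchical mix of grain sizes and crystal defects matters as much as smallness alone.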
Aside from Zhu’s plans to collaborate with the Department of Defense on a project to make magnesium alloy to be used in body armor, the ability to create lighter – yet stronger – materials is crucial to devising everything from more fuel-efficient cars to safer airplanes.
The paper detailing the team’s findings appears in the journal Nature Communications.

Will Coal Supplies Peak in 2011?

Yes, according to a recent study that predicts:
After 2011, the production rates of coal and CO2 decline, reaching 1990 levels by the year 2037, and reaching 50% of the peak value in the year 2047.
Summarizing the study, Scitizen writes:
[A]ctual historical coal production is a better indicator of the future trend of worldwide coal output than stated reserves, which are notoriously unreliable. They note, for example, that the state of Illinois, despite its rank as second in reserves in the United States, has seen its production decline by half over the last 20 years.
This trend is contrary, of course, to often-spewed estimates that current coal supplies will last at least 100 years and to industry claims of a 250- to 400-year supply.
Predictions of declining coal supplies are not new, but this is probably the most recent scientific (and objective) examination of current supplies and certainly provides another reason to get ourselves off of fossil fuels and into renewable energy.
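Peak-production forecasts of this kind are typically made by fitting a logistic (Hubbert) curve to historical output, so that annual production rises, peaks, and then declines symmetrically. A minimal sketch, with parameters chosen only to echo the 2011-peak, half-by-mid-century shape described above (they are not the study's fitted values):

```python
import math

def hubbert_production(year, peak_year=2011, total_gt=700.0, width=20.0):
    """Annual production (Gt/yr) on a logistic (Hubbert) curve.

    total_gt is the ultimately recoverable total; width sets how fast
    production rises and falls around the peak. All illustrative.
    """
    x = (year - peak_year) / width
    return total_gt / (4 * width) / math.cosh(x / 2) ** 2

peak = hubbert_production(2011)      # maximum annual output
later = hubbert_production(2046)     # roughly half the peak rate
```

The production curve is the derivative of a logistic cumulative-extraction curve, which is why the peak rate is simply total_gt / (4 * width).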

Wednesday, September 08, 2010

Chemical for Keeping Human Pluripotent Stem Cells Alive Identified


Noboru Sato is an assistant professor of biochemistry and a member of the Stem Cell Center at UC Riverside
Human pluripotent stem (hPS) cells can generate any given cell type in the adult human body, which is why they are of interest to stem cell scientists working on finding therapies for spinal cord injuries, Parkinson's disease, burns, heart disease, diabetes, arthritis, and other ailments.
Before hPS cell technologies can be translated into clinical applications, however, some obstacles must first be overcome.
One such obstacle frustrating stem cell researchers is the "cell death" that the major types of hPS cells, including human embryonic stem cells and human induced pluripotent stem cells, mysteriously undergo when cultured as single cells, rendering them less suitable for research.
Researchers at the University of California, Riverside now show that a molecular motor, called "nonmuscle myosin II" (NMII), which exists naturally inside each hPS cell and controls various cellular functions, triggers the death of hPS cells when they are broken down to single cells.
While many details of how exactly NMII works remain unknown, a wide consensus among researchers is that NMII induces a contraction of the main internal components of the cells, eventually resulting in cell death.
To stop this cell death, the researchers treated hPS cells with a chemically synthesized compound, blebbistatin, and found that it substantially enhanced the survival of the cells by chemically inhibiting NMII. (Blebbistatin is commercially available from several companies that sell biologically active chemical compounds.)
"Our research shows that blebbistatin works as effectively as the most potent cell death inhibitor of hPS cells available today," said Noboru Sato, an assistant professor of biochemistry, whose lab led the research. "This discovery brings stem cell research a step closer towards finding therapies for several diseases."
Study results appear online, Sept. 7, in Nature Communications.
Sato explained that most of the current culture methods to grow hPS cells require animal-derived materials, such as Matrigel, for coating the culture surfaces. Without these materials, hPS cells cannot adhere to the culture plate. But the drawback of using them is that they could potentially cause contamination of hPS cells by introducing viruses and unknown pathogens.
"Another advantage of using blebbistatin is that we need no human- or animal-derived materials for coating the culture surfaces," he said. "This is because blebbistatin greatly facilitates the adhesion of cells to the culture surface. By combining blebbistatin and a chemically synthesized coating material, poly-D-lysine, we have developed a fully defined and simplified culture environment that allows hPS cells to grow under completely animal-free and contamination-free conditions."
Available through many companies, poly-D-lysine is a chemically synthesized animal-free coating material that is widely used for cell culture coating for other cell types. For hPS cells to adhere to the poly-D-lysine coating, blebbistatin must be added to the culture medium. "This new method shows that a novel combination of routinely available materials can create a completely distinct technological platform," Sato said.
Sato, a member of UC Riverside's Stem Cell Center, was joined in the research by Andrea Walker, a second-year medical student in the UCR/UCLA Thomas Haider Program in Biomedical Sciences and the first author of the research paper; Hua Su and Nicole Harb of UCR; and Mary Anne Conti and Robert S. Adelstein of the Laboratory of Molecular Cardiology, National Heart, Lung, and Blood Institute, National Institutes of Health, Bethesda, Md.

Smoking Damages Men's Sperm and Also the Numbers of Germ and Somatic Cells in Developing Embryos

Two new studies have shed more light on how smoking may damage fertility, and give further weight to advice that mothers and fathers-to-be should stop smoking before attempting to conceive. The research is published online in Europe's leading reproductive medicine journal Human Reproduction today (Wednesday 8 September).
In the first study, researchers found that a mother's smoking during early pregnancy dramatically reduces the numbers of germ cells (the cells that form eggs in females and sperm in males) and somatic cells (the cells that form every other part of the body) in the developing foetus. They believe that this may have an adverse effect on the fertility of the baby in later life.
In the second study, researchers looked at specific proteins called protamines in the sperm of men who smoked and compared them with the protamines in non-smokers. Protamines play an important role in the development of sperm -- they are necessary for the process that results in the formation of chromosomes during cell division -- and, therefore, have an effect on subsequent male fertility.
For the first study, Claus Yding Andersen, Professor of Human Reproductive Physiology at the University Hospital of Copenhagen (Denmark), and his colleagues looked at 24 embryonic testes obtained after women had undergone legal termination between 37 and 68 days after conception. They also took blood and urine samples and questioned the women about their lifestyle during pregnancy, including smoking and drinking habits.
They found that the number of germ cells was more than halved (reduced by 55%) in the testes of embryos from mothers who smoked compared with those from the non-smoking mothers. The number of somatic cells was also reduced by more than a third (37%). The effect was dose dependent, with a greater reduction in germ and somatic cells being seen in embryos from the mothers who smoked the most. This remained the same, even after adjusting for coffee and alcohol consumption.
When these results were added to their earlier work that looked at the effect of smoking on 28 female embryos, the researchers found that, overall, germ cells in the ovaries and testes of embryos exposed to smoking were reduced significantly by 41% compared with the number of germ cells in non-exposed embryos. The results also showed that germ cells were more susceptible to damage caused by smoking than somatic cells.
Prof Andersen said: "As the germ cells in embryos eventually develop to form sperm in males and eggs in females, it is possible that the negative effect on the numbers of germ cells caused by maternal smoking during pregnancy may influence the future fertility of offspring. In addition, the reduction in the number of somatic cells also has the potential to affect future fertility, as somatic cells in the testes support the development of germ cells to form functional sperm. If the somatic cell number is reduced, fewer functional sperm will be produced.
"These findings may provide one potential cause of the reduced fertility observed in recent years. Although the prevalence of smoking during pregnancy has declined in the last decade in developed countries, one in eight mothers continue to smoke throughout their pregnancy, and in Denmark the prevalence of smoking actually increased to 43% in 2005 among women younger than 20 at the time of delivery. This tendency is alarming, and when you take the results from this study in combination with the other known negative effects of cigarette smoke during pregnancy, it further emphasises that pregnant mothers should refrain from smoking."
The first trimester is the crucial time when the sexual organs in the developing embryo are differentiating to form either testes or ovaries. "This process is very delicately regulated, with a number of hormones fluctuating. If something goes wrong at this point, just six to eight weeks after conception, it may have an impact on the function of the gonads later in life," said Prof Andersen. "Our results show that the gonads are susceptible to factors, such as cigarette smoke, just at this critical time when they start to differentiate."
The authors warn that their study does not clarify whether the reduction in germ and somatic cell numbers is permanent or reflects a growth delay that might be compensated for later on. Prof Andersen said: "We would expect adverse effects to be more pronounced if the mother continues smoking throughout pregnancy, but we have only studied gonads from the first trimester and can only guess whether this effect actually will occur. So the effect on future fertility is still unknown. However, this study does indicate that smoking during pregnancy may have an adverse effect on the future reproductive ability of offspring, since both the number of germ cells and somatic cells are dramatically reduced and these are the foundations of future fertility."
In the second study, researchers led by Professor Mohamed Hammadeh, head of the assisted reproductive laboratory in the Department of Obstetrics and Gynaecology at the University of the Saarland (Homburg Saar, Germany), looked at the levels of two protamines, 1 and 2, in the sperm of 53 heavy smokers (more than 20 a day) and 63 non-smokers.
P1 and P2 are necessary for proper chromatin condensation. This is the process whereby chromatin (the combination of DNA and proteins that make up chromosomes) condenses and packages up DNA into chromosomes that can fit inside cells. Poor chromatin packaging adversely affects sperm and is associated with a number of fertility problems such as lower chances of fertilisation after intercourse, poor fertilisation after IVF and ICSI (intracytoplasmic sperm injection), and a higher incidence of miscarriages. This is the first study to investigate the effect of smoking on protamines.
Prof Hammadeh and his colleagues found that P2 concentrations were 14% lower in the sperm of smokers compared with non-smokers. "The concentration of P2 from smokers was 334.78 ng in every million sperm, compared with P2 concentrations of 388.8 ng per million sperm in non-smokers," said Prof Hammadeh. "This means that sperm from smokers suffer from protamine deficiency, probably caused by the cigarette smoke, and this could be a reason for incomplete or poor chromatin packaging in sperm, leading to infertility."
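The roughly 14% deficiency follows directly from the two reported concentrations; a quick arithmetic check:

```python
p2_smokers = 334.78       # ng per million sperm (heavy smokers, as reported)
p2_nonsmokers = 388.8     # ng per million sperm (non-smokers, as reported)

# Relative P2 deficit in smokers versus non-smokers
deficit = (p2_nonsmokers - p2_smokers) / p2_nonsmokers
print(f"P2 deficit in smokers: {deficit:.1%}")   # about 13.9%, i.e. the ~14% quoted
```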
The researchers found that the ratio of P1 to P2 was altered in smokers. "In normal, fertile men, the ratio of P1 to P2 is almost equal at 1:1. Any increase or decrease in this ratio represents some kind of infertility. In this study the ratio was significantly higher in smokers than in non-smokers, with higher levels of P1 than of P2," said Prof Hammadeh.
The study also showed that levels of oxidative stress were higher in smokers than in non-smokers. Oxidative stress is an imbalance between reactive oxygen species -- chemically reactive oxygen-containing molecules and unstable, highly reactive free radicals -- and anti-oxidant compounds. It can cause damage to proteins, lipids and DNA. "Oxidative stress is known to cause damage to sperm DNA in a number of ways," said Prof Hammadeh. "These results suggest that induced oxidative stress by cigarette smoking may have a significant inverse effect on chromatin condensation by disrupting P2."
He concluded: "Given the potential adverse effects of smoking on fertility, cancer and so on, physicians should advise infertile patients who smoke cigarettes to quit smoking. We are carrying out further research into the levels of P1 and P2 in order to find out the effect of smoking on the silencing or changing of the P2 gene in an attempt to clarify the potential mechanism behind this effect."

Goodbye touchscreen? XWave brainwave interface for iDevices unveiled

XWave is an iPhone/iPod touch/iPad compatible device that detects brainwaves


Until humans evolve huge brains like the Talosians, it seems we’ll have to rely on electronic headwear to allow us to control devices with our brainwaves – electronic headwear like the XWave from California-based company PLX Devices. The XWave is the first brainwave interface accessory for the iPhone/iPod touch/iPad that is worn over the head like a pair of headphones. Unfortunately, the device won’t allow you to scroll through playlists or select a contact to call with the power of your mind. Rather, like the Star Wars Force Trainer, it detects your attention and meditation levels for use in games and getting the old gray matter into shape.
The XWave is powered by technology provided by Neurosky Inc. and the device itself is not dissimilar to that company’s MindSet headset we first saw at the Tokyo Game Show back in 2008. Like the MindSet, the XWave incorporates a single electrode that sits in contact with the wearer’s forehead to read brainwave information, or electroencephalography (EEG) data, and converts these analog signals into digital data that apps can use.
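Neurosky’s actual “attention” and “meditation” algorithms are proprietary, but metrics like these are conventionally derived from the relative power in standard EEG frequency bands (for example alpha, 8-12 Hz, associated with relaxation). A minimal sketch of that band-power idea on a synthetic signal; an illustration only, not the XWave’s real signal chain:

```python
import numpy as np

def relative_band_power(samples, fs, band):
    """Fraction of total (non-DC) spectral power inside `band` (lo, hi) in Hz."""
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    power = np.abs(np.fft.rfft(samples)) ** 2
    in_band = (freqs >= band[0]) & (freqs < band[1])
    return power[in_band].sum() / power[1:].sum()   # skip the DC term

# Synthetic "relaxed" recording: a dominant 10 Hz alpha rhythm plus noise
rng = np.random.default_rng(0)
fs = 256                                # samples per second
t = np.arange(fs * 4) / fs              # 4 seconds of signal
eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)
alpha_fraction = relative_band_power(eeg, fs, (8.0, 12.0))
```

A headset like this would compute something along these lines on a rolling window and map the band fractions onto a 0-100 “relaxation” score.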
The device comes bundled with the XWave app that includes a number of exercises aimed at training your brain. Objectives include levitating a ball on the iDevice’s screen, changing a color based on the relaxation level of your brain and training your brain to maximize its attention span.
PLX Devices is also providing third-party software developers with an SDK to allow them to design and develop apps using the XWave device. The company reports that apps already in development include games in which objects are controlled by the wearer’s mind, and an app that allows the wearer to control the lights in their home or select music based on their mood.
The XWave is available for preorder from PLX Devices now for US$100, ahead of an October 2010 release.

Mobileye claims 'An End to Motor Vehicle Collisions'

Mobileye's warning system alerts drivers to imminent forward collisions and other driving hazards

Before we go any further, let’s get this out of the way right up front – nothing is ever going to stop cars from running into things. Until drivers are taken out of the equation completely, accidents will always happen. Nonetheless, Dutch tech company Mobileye has declared that with the release of its new C2-270 collision warning system, “an end to motor vehicle collisions [is] now in sight.” This system warns drivers of dangerously-close cars, alerts them when drifting out of their lane and includes a Pedestrian Collision Warning component.
The Mobileye C2-270 system consists of a windshield-mounted 640 x 480 CMOS camera, connected to a dash-mounted display unit. The camera monitors the road in front of the vehicle when the car is in motion, and detects imminent forward collisions via its EyeQ2 algorithmic system-on-a-chip (EyeQ2 has been in use in vehicles from BMW, GM, Volvo and Nissan since 2007). The driver is alerted through flashing color-coded icons and an audible alarm, and has up to 2.7 seconds in which to respond – the system itself doesn’t apply the brakes, or steer the car out of harm’s way.
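The 2.7-second figure is essentially a time-to-collision window. Under the simplest assumption of a constant closing speed, time to collision is just range divided by closing speed; the sketch below is that textbook calculation, not Mobileye’s proprietary algorithm:

```python
def time_to_collision(range_m, closing_speed_mps):
    """Seconds until impact at constant closing speed; None if the gap
    is not closing."""
    if closing_speed_mps <= 0:
        return None
    return range_m / closing_speed_mps

# A vehicle 30 m ahead, closing at 11.1 m/s (~40 km/h of speed difference):
ttc = time_to_collision(30.0, 11.1)   # ~2.7 s, the warning window quoted above
```

A real system estimates range and closing speed continuously from the camera and triggers the alert when the computed window drops below a threshold.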
Unlike its C2-170 predecessor, the 270 will apparently give drivers a heads-up when they’re on a collision course with a pedestrian, a cyclist or a motorcyclist. Like the 170, it also warns drivers of unintended lane departures, and of insufficient distance-keeping. Optional extras include a black box event recording system, vehicle tracking via integrated GPS, and a fleet management application.
With technology like this, of course, there’s always the worry that users will pay less attention because they assume the machine can now do it for them. It’s not unlike a recent study suggesting that ultra-efficient LED bulbs won’t actually save much in the way of electricity, due to the fact that people using them will be less concerned about their personal energy usage. It definitely bears consideration, although we hope such systems will result in fewer accidents overall, because that’s always a good thing.

Tuesday, September 07, 2010

Cells Can Eat Parts of Themselves, With Help from One Protein

Sites of autophagy (green) are reduced in cells lacking HMGB1 (left) compared with control cells
Like some people, cells eat when they are under pressure -- but they consume parts of themselves. A multi-function protein helps control this form of cannibalism, according to a study in the September 6 issue of the Journal of Cell Biology.
Cells often respond to hunger or stress by digesting some of their contents. The process, known as autophagy, helps free nutrients and clean up cytoplasmic trash such as worn-out organelles and misshapen proteins. A team led by researchers at the University of Pittsburgh Cancer Institute discovered a link between this form of cellular recycling and the protein HMGB1. The team showed HMGB1 to be a critical pro-autophagic protein that enhances cell survival and limits programmed cell death.
The findings suggest that blocking HMGB1 could benefit cancer patients, since tumor cells often rev up autophagy to withstand chemotherapy, immunotherapy, and radiation treatment.

Melting Rate of Icecaps in Greenland and Western Antarctica Lower Than Expected


This artist's concept shows GRACE's twin satellites, which orbit Earth in a back-to-back manner and change positions in response to variations in Earth's gravity field. The GRACE satellites house microwave ranging systems that measure the change in the distance between the satellites over time, enabling them to essentially "weigh" the changes in glaciers.

The Greenland and West Antarctic ice caps are melting at half the speed previously predicted, according to analysis of recent satellite data.
The finding is the result of research by a joint US/Dutch team from the Jet Propulsion Laboratory, Delft University of Technology (TU Delft, The Netherlands) and SRON Netherlands Institute for Space Research. The scientists have published their work in the September issue of Nature Geoscience.
GRACE
The melting of the ice caps has been charted since 2002 using measurements from the two GRACE satellites. From space they detect small changes in the Earth's gravitational field. These changes are related to the exact distribution of mass on Earth, including ice and water. When ice melts and the meltwater flows into the sea, this redistribution of mass therefore has an effect on the gravitational field.
Gigatonnes
Based on this principle, previous estimates for the Greenland ice cap calculated that the ice was melting at a rate of 230 gigatonnes a year (i.e. 230,000 billion kg). That would result in an average rise in global sea levels of around 0.75 mm a year. For West Antarctica, the estimate was 132 gigatonnes a year. However, it now turns out that these results were not properly corrected for glacial isostatic adjustment, the phenomenon that the Earth's crust rebounds as a result of the melting of the massive ice caps from the last major Ice Age around 20,000 years ago. These movements of the Earth's crust have to be incorporated in the calculations, since these vertical movements change the Earth's mass distribution and therefore also have an influence on the gravitational field.
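The conversion from gigatonnes of melt to millimetres of sea-level rise is straightforward: divide the meltwater volume by the ocean's surface area. With a commonly quoted ocean area of about 3.6 x 10^14 m², the old 230 Gt/yr estimate works out to roughly 0.6-0.7 mm/yr, the same ballpark as the figure above (the exact number depends on the assumed ocean area and water density):

```python
OCEAN_AREA_M2 = 3.61e14    # commonly quoted global ocean surface area
WATER_DENSITY = 1000.0     # kg per cubic metre of meltwater

def sea_level_rise_mm(mass_loss_gt):
    """Millimetres of global sea-level rise for an ice-mass loss in
    gigatonnes, assuming the meltwater spreads evenly over the ocean."""
    volume_m3 = mass_loss_gt * 1e12 / WATER_DENSITY   # 1 Gt = 1e12 kg
    return volume_m3 / OCEAN_AREA_M2 * 1000.0

greenland = sea_level_rise_mm(230)   # the previously estimated 230 Gt/yr
```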
GPS
Researchers from the Jet Propulsion Laboratory in Pasadena (US), TU Delft and SRON Netherlands Institute for Space Research have now succeeded in carrying out that correction far more accurately. They did so using combined data from the GRACE mission, GPS measurements on land and sea floor pressure measurements. These reveal that the sea floor under Greenland is falling more rapidly than was first thought.
One of the researchers, Dr Bert Vermeersen of TU Delft, explains: "The corrections for deformations of the Earth's crust have a considerable effect on the amount of ice that is estimated to be melting each year. We have concluded that the Greenland and West Antarctica ice caps are melting at approximately half the speed originally predicted." The average rise in sea levels as a result of the melting ice caps is also lower.
Model
"The innovative aspect of our method is that we simultaneously matched the current changes in the ice mass and glacial isostatic adjustment to the observations, instead of assuming that a particular glacial isostatic adjustment model is correct," says Dr Vermeersen. "For Greenland in particular, we have found a glacial isostatic adjustment model that deviates rather sharply from general assumptions. But at present there are too few data available to verify this independently. A more extensive network of GPS readings, in combination with geological indicators for the local and regional changes in sea level around Greenland over the last 10,000 years, will possibly be able to provide conclusive evidence on this matter in the years to come."

New Mission to Skim the Sun: NASA Selects Science Investigations for Solar Probe Plus


The Solar Probe Plus spacecraft, with solar panels folded into the shadow of its protective shield, gathers data on its approach to the Sun.
NASA has begun development of a mission to visit and study the sun closer than ever before. The unprecedented project, named Solar Probe Plus, is slated to launch no later than 2018.
The small car-sized spacecraft will plunge directly into the sun's atmosphere approximately four million miles from our star's surface. It will explore a region no other spacecraft ever has encountered. NASA has selected five science investigations that will unlock the sun's biggest mysteries.
"The experiments selected for Solar Probe Plus are specifically designed to solve two key questions of solar physics -- why is the sun's outer atmosphere so much hotter than the sun's visible surface and what propels the solar wind that affects Earth and our solar system?" said Dick Fisher, director of NASA's Heliophysics Division in Washington. "We've been struggling with these questions for decades and this mission should finally provide those answers."
As the spacecraft approaches the sun, its revolutionary carbon-composite heat shield must withstand temperatures exceeding 2550 degrees Fahrenheit and blasts of intense radiation. The spacecraft will have an up close and personal view of the sun enabling scientists to better understand, characterize and forecast the radiation environment for future space explorers.
NASA invited researchers in 2009 to submit science proposals. Thirteen were reviewed by a panel of NASA and outside scientists. The total dollar amount for the five selected investigations is approximately $180 million for preliminary analysis, design, development and tests.
The selected proposals are:
  • Solar Wind Electrons Alphas and Protons Investigation: principal investigator, Justin C. Kasper, Smithsonian Astrophysical Observatory in Cambridge, Mass. This investigation will specifically count the most abundant particles in the solar wind -- electrons, protons and helium ions -- and measure their properties. The investigation also is designed to catch some of the particles for direct analysis.
  • Wide-field Imager: principal investigator, Russell Howard, Naval Research Laboratory in Washington. This telescope will make 3-D images of the sun's corona, or atmosphere. The experiment will also provide 3-D images of solar wind and shocks as they approach and pass the spacecraft. This investigation complements instruments on the spacecraft providing direct measurements by imaging the plasma the other instruments sample.
  • Fields Experiment: principal investigator, Stuart Bale, University of California Space Sciences Laboratory in Berkeley, Calif. This investigation will make direct measurements of electric and magnetic fields, radio emissions, and shock waves that course through the sun's atmospheric plasma. The experiment also serves as a giant dust detector, registering voltage signatures when specks of space dust hit the spacecraft's antenna.
  • Integrated Science Investigation of the Sun: principal investigator, David McComas of the Southwest Research Institute in San Antonio. This investigation consists of two instruments that will monitor electrons, protons and ions that are accelerated to high energies in the sun's atmosphere.
  • Heliospheric Origins with Solar Probe Plus: principal investigator, Marco Velli of NASA's Jet Propulsion Laboratory in Pasadena, Calif. Velli is the mission's observatory scientist, responsible for serving as a senior scientist on the science working group. He will provide an independent assessment of scientific performance and act as a community advocate for the mission.
"This project allows humanity's ingenuity to go where no spacecraft has ever gone before," said Lika Guhathakurta, Solar Probe Plus program scientist at NASA Headquarters, in Washington. "For the very first time, we'll be able to touch, taste and smell our sun."
The Solar Probe Plus mission is part of NASA's Living with a Star Program. The program is designed to understand aspects of the sun and Earth's space environment that affect life and society. The program is managed by NASA's Goddard Space Flight Center in Greenbelt, Md., with oversight from NASA's Science Mission Directorate's Heliophysics Division. The Johns Hopkins University Applied Physics Laboratory in Laurel, Md., is responsible for formulating, implementing and operating the Solar Probe Plus mission.
For more information about the Solar Probe Plus mission, visit: http://solarprobe.gsfc.nasa.gov/
For more information about the Living with a Star Program, visit: http://science.nasa.gov/about-us/smd-programs/living-with-a-star/

Helping Corn-Based Plastics Take More Heat


Developing a more heat-tolerant biodegradable plastic is the goal of ARS research chemist William J. Orts (left) and his collaborators, Allison Flynn and Lennard Torres from Lapol, LLC, Santa Barbara, Calif.

Your favorite catsup or fruit juice might be "hot-filled" at the food-processing plant -- that is, poured into its waiting container while the catsup or juice is still hot from pasteurization. Current containers made from corn-based plastics literally can't take the heat of hot-filling, according to U.S. Department of Agriculture (USDA) chemist William J. Orts.
But Orts and a team of collaborators from Lapol, LLC, of Santa Barbara, Calif., hope to change that by making corn-derived plastics more heat-tolerant. Orts and Lapol co-investigators Allison Flynn and Lennard Torres are doing the work at the Agricultural Research Service (ARS) Western Regional Research Center in Albany, Calif., where Orts leads the Bioproduct Chemistry and Engineering Research Unit. ARS is USDA's principal intramural scientific research agency.
By boosting the bioplastics' heat tolerance, the collaboration -- under way since 2007 -- may broaden the range of applications for which corn-derived plastics could be used as an alternative to petroleum-based plastics.
Corn-based plastics are made by fermenting corn sugar to produce lactic acid. The lactic acid is used to form polylactic acid, or PLA, a bioplastic. The Albany team is developing a product known as a heat-deflection temperature modifier that would be blended with PLA to make it more heat-tolerant.
The modifier is more than 90 percent corn-based and is fully biodegradable. There currently are no commercially available heat-deflection temperature modifiers for PLA, according to Randall L. Smith, chief operating officer at Lapol. ARS and Lapol are seeking a patent for the invention.

Chemists, Engineers Achieve World Record With High-Speed Graphene Transistors

Graphene, a one-atom-thick layer of graphitic carbon, has great potential to make electronic devices such as radios, computers and phones faster and smaller. But its unique properties have also led to difficulties in integrating the material into such devices.
In a paper published Sept. 1 in the journal Nature, a group of UCLA researchers demonstrate how they have overcome some of these difficulties to fabricate the fastest graphene transistor to date.
With the highest known carrier mobility -- the speed at which electronic information is transmitted by a material -- graphene is a good candidate for high-speed radio-frequency electronics. But traditional techniques for fabricating the material often lead to deteriorations in device quality.
The UCLA team, led by professor of chemistry and biochemistry Xiangfeng Duan, has developed a new fabrication process for graphene transistors using a nanowire as the self-aligned gate.
Self-aligned gates are a key element in modern transistors, which are semiconductor devices used to amplify and switch electronic signals. Gates are used to switch the transistor between various states, and self-aligned gates were developed to deal with problems of misalignment encountered because of the shrinking scale of electronics.
To develop the new fabrication technique, Duan teamed with two other researchers from the California NanoSystems Institute at UCLA, Yu Huang, an assistant professor of materials science and engineering at the Henry Samueli School of Engineering and Applied Sciences, and Kang Wang, a professor of electrical engineering at the Samueli School.
"This new strategy overcomes two limitations previously encountered in graphene transistors," Duan said. "First, it doesn't produce any appreciable defects in the graphene during fabrication, so the high carrier mobility is retained. Second, by using a self-aligned approach with a nanowire as the gate, the group was able to overcome alignment difficulties previously encountered and fabricate very short-channel devices with unprecedented performance."
These advances allowed the team to demonstrate the highest-speed graphene transistors to date, with a cutoff frequency up to 300 GHz -- comparable to the very best transistors made from high-electron-mobility materials such as gallium arsenide or indium phosphide.
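For a short-channel transistor, the intrinsic cutoff frequency scales roughly as f_T ≈ v / (2πL), where v is the effective carrier velocity and L the channel length, which is why graphene's high carrier mobility and the very short self-aligned channels both matter. A back-of-the-envelope sketch with illustrative numbers, not values from the paper:

```python
import math

def cutoff_frequency_ghz(carrier_velocity_mps, gate_length_m):
    """Intrinsic cutoff frequency estimate f_T ~ v / (2 * pi * L), in GHz."""
    return carrier_velocity_mps / (2 * math.pi * gate_length_m) / 1e9

# Illustrative values: an effective carrier velocity of ~3e5 m/s in a
# ~150 nm channel lands in the few-hundred-GHz range reported here.
ft = cutoff_frequency_ghz(3e5, 150e-9)
```

Real extracted cutoff frequencies also depend on parasitic capacitances, which is why self-aligned gates (minimizing ungated channel regions) are so important.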
"We are very excited about our approach and the results, and we are currently taking additional efforts to scale up the approach and further boost the speed," said Lei Liao, a postdoctoral fellow at UCLA.
High-speed radio-frequency electronics may also find wide applications in microwave communication, imaging and radar technologies.

Experiment Records Ultrafast Chemical Reaction With Vibrational Echoes


The molecules shown here in yellow are first-hand observers to an ultrafast chemical reaction. As the reaction proceeds, the vibrational frequencies of the yellow molecules change. By 'listening' to changes in these vibrational frequencies, researchers Kevin Kubarych and Carlos Baiz could observe the chemical reaction underway. The rainbow colors indicate how the 'notes' of the yellow molecules change in response to the reaction.

To watch a magician transform a vase of flowers into a rabbit, it's best to have a front-row seat. Likewise, for chemical transformations in solution, the best view belongs to the molecular spectators closest to the action.
Those special molecules comprise the "first solvation shell," and although it has been known for decades that they can sense and dictate the fate of nearly every chemical reaction, it has been virtually impossible to watch them respond. University of Michigan researchers Kevin Kubarych and Carlos Baiz, however, recently achieved the feat. Their work was published online Aug. 25 in the Journal of the American Chemical Society.
Until now, observing the solvent shell in action has been difficult for several reasons. First, fundamental steps in chemical reactions are exceedingly fast. To "film" a chemical reaction requires a camera with a "shutter speed" of femtoseconds (in one femtosecond, light travels only about 0.3 micrometers -- roughly the size of a small bacterium, and far less than a hundredth the width of a human hair).
Second, a solution contains many solvent molecules, but only a few are privileged to be in the first solvation shell and participate in the reaction. Finally, most spectroscopic probes of liquids are not chemically specific, meaning they can't identify the particular molecular species they are monitoring.
To sum up, watching the first solvation shell respond to a chemical reaction requires a combination of ultrafast time resolution and the ability to initiate the reaction and track the solvent shell's response. It is this combination that Kubarych, an assistant professor of chemistry, and graduate student Baiz have achieved.
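The "shutter speed" arithmetic above is easy to check. The following is an illustrative sketch, not part of the study; the only input is the standard speed of light:

```python
# How far does light travel during a one-femtosecond "shutter" interval?
SPEED_OF_LIGHT = 299_792_458.0  # m/s, exact by definition

femtosecond = 1e-15  # seconds
distance_m = SPEED_OF_LIGHT * femtosecond
distance_nm = distance_m * 1e9

print(f"Light travels {distance_nm:.0f} nm in 1 fs")  # about 300 nm
```

Any molecular event slower than this light-travel distance implies can, in principle, be resolved by such a femtosecond probe.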
The key breakthrough was to realize that electrons move during chemical reactions and that when the nearest solvent molecules sense the electron redistribution, their vibrational frequencies change. Much as the strings on a musical instrument are intimately connected to the wooden neck and body, the solvent shell and the reacting molecule are tightly coupled and difficult to disentangle. Indeed, the very act of holding an instrument may cause it to warp or heat up and, in principle, these changes affect the frequencies of vibration of the strings. Similarly, the new approach to reaction dynamics introduced by Kubarych's lab essentially "listens" to the very fastest events in chemical reactions by noting the changing resonance frequencies of the surrounding molecules.
"This level of detailed information on the complex environments common in chemical transformations is unique," Kubarych said, "and promises to offer remarkable insight into the understanding of natural and artificial charge transfer reactions -- processes that are of fundamental importance in contexts ranging from cellular respiration to solar energy conversion."

Chemists Develop Simple Technique to Visualize Atomic-Scale Structures


Atomic force micrograph of ~1 micrometer wide × 1.5 micrometers (millionths of a meter) tall area. The ice crystals (lightest blue) are 0.37 nanometers (billionths of a meter) high, which is the height of a two-water-molecule-thick ice crystal. A one-atom thick sheet of graphene is used to conformally coat and trap water that has adsorbed onto a mica surface, permitting it to be imaged and characterized by atomic force microscopy. Detailed analysis of such images reveals that this first layer of water is ice, even at room temperature. At high humidity levels, a second layer of water will coat the first layer, also as ice. At very high humidity levels, additional layers of water will coat the surface as droplets.

 
Researchers at the California Institute of Technology (Caltech) have devised a new technique -- using a sheet of carbon just one atom thick -- to visualize the structure of molecules. The technique, which was used to obtain the first direct images of how water coats surfaces at room temperature, can also be used to image a potentially unlimited number of other molecules, including antibodies and other biomolecules. A paper describing the method and the studies of water layers appears in the September 3 issue of the journal Science.


"Almost all surfaces have a coating of water on them," says James Heath, the Elizabeth W. Gilloon Professor and professor of chemistry at Caltech, "and that water dominates interfacial properties" -- properties that affect the wear and tear on that surface. While surface coatings of water are ubiquitous, they are also very tough to study, because the water molecules are "in constant flux, and don't sit still long enough to allow measurements," he says.

Quite by accident, Heath and his colleagues developed a technique to pin down the moving molecules, under room-temperature conditions. "It was a happy accident -- one that we were smart enough to recognize the significance of," he says. "We were studying graphene on an atomically flat surface of mica and found some nanoscale island-shaped structures trapped between the graphene and the mica that we didn't expect to see."

Graphene, which is composed of a one-atom-thick layer of carbon atoms in a honeycomb-like lattice (like chicken wire, but on an atomic scale), should be completely flat when layered onto an atomically flat surface. Heath and his colleagues -- former Caltech graduate student Ke Xu, now at Harvard University, and graduate student Peigen Cao -- thought the anomalies might be water, captured and trapped under the graphene; water molecules, after all, are everywhere.

To test the idea, the researchers conducted other experiments in which they deposited the graphene sheets at varying humidity levels. The odd structures became more prevalent at higher humidity, and disappeared under completely dry conditions, leading the researchers to conclude that they indeed were water molecules blanketed by the graphene.

Heath and his colleagues realized that the graphene sheet was "atomically conformal" -- it hugged the water molecules so tightly, almost like shrink wrap, that it revealed their detailed atomic structure when examined with atomic force microscopy.
(Atomic force microscopes use a mechanical probe to essentially "feel" the surfaces of objects.) "The technique is dead simple -- it's kind of remarkable that it works," Heath says. The method, he explains, "is sort of like how people sputter carbon or gold onto biological cells so they can image them. The carbon or gold fixes the cells. Here, the graphene perfectly templates the weakly adsorbed water molecules on the surface and holds them in place, for up to a couple of months at least."

Using the technique, the researchers revealed new details about how water coats surfaces. They found that the first layer of water on mica is actually two water molecules thick, and has the structure of ice. Once that layer is fully formed, a second, two-molecule-thick layer of ice forms. On top of that, "you get droplets," Heath says.

"It's truly amazing that the first two adsorbed layers of water form ice-like microscopic islands at room temperature," says Xu. "These fascinating structures are likely important in determining the surface properties of solids, including, for example, lubrication, adhesion, and corrosion."

The researchers have since successfully tested other molecules on other types of atomically flat surfaces -- such flatness is necessary so the molecules don't nestle into imperfections in the surface, distorting their structure as measured through the graphene layer. "We have yet to find a system for which this doesn't work," says Heath. He and his colleagues are now working to improve the resolution of the technique so that it could be used to image the atomic structure of biomolecules like antibodies and other proteins. "We have previously observed individual atoms in graphene using the scanning tunneling microscope," says Cao. "Similar resolution should also be attainable for graphene-covered molecules."
"We could drape graphene over biological molecules -- including molecules in at least partially aqueous environments, because you can have water present -- and potentially get their 3-D structure," Heath says. It may even be possible to determine the structure of complicated molecules, like protein-protein complexes, "that are very difficult to crystallize," he says. Whereas the data from one molecule might reveal the gross structure, data from 10 will reveal finer features -- and computationally assembling the data from 1,000 identical molecules might reveal every atomic nook and cranny.

If you imagine that graphene draped over a molecule is sort of like a sheet thrown over a sleeping cat on your bed, Heath explains, having one image of the sheet-covered lump -- in one orientation -- "will tell you that it's a small animal, not a shoe. With 10 images, you can tell it's a cat and not a rabbit. With many more images, you'll know if it's a fluffy cat -- although you won't ever see the tabby stripes."

The work in the paper was funded by the United States Department of Energy's Office of Basic Energy Sciences.

Researchers Discover Proton Diode: Water Is an Active Element in Proteins


The proton diode in the light-driven proton pump bacteriorhodopsin
 
Biophysicists in Bochum have discovered a diode for protons: just like the electronic component determines the direction of flow of electric current, the "proton diode" ensures that protons can only pass through a cell membrane in one direction. Water molecules play an important role here as active components of the diode. The researchers led by Prof. Dr. Klaus Gerwert (Chair of Biophysics at the RUB) were able to observe this through a combination of molecular biology, X-ray crystallography, time-resolved FTIR spectroscopy and biomolecular simulations. They report in the current international online edition of the journal Angewandte Chemie. 

Protons drive the protein turbines
The proton diode plays an important role in the energy production of cells. Light-driven proton pumps -- certain proteins that traverse the cell membrane -- eject protons out of the cell, so that excess pressure is generated outside, "much like the water pressure at a dam," explains Prof. Gerwert. Elsewhere, the protons push back into the cells to compensate for the concentration gradient, and thereby drive the turbines of the cell, proteins known as ATPases. The energy thus released is converted into the universal fuel of the cells, ATP (adenosine triphosphate). "This process is a kind of archaic photosynthesis," explains Prof. Gerwert. "The light energy is ultimately converted into usable energy for the organism."
The details of these processes are still the subject of research. In particular, the role of water molecules in proteins has long been unclear. "Previously it was believed that the water molecules blundered into the proteins by chance, and fulfilled no particular function," says Gerwert. Manfred Eigen, who was born in Bochum, was awarded the Nobel Prize in Chemistry in 1967 because he was able to explain why protons are conducted so rapidly through water and ice. The current work shows that proteins also use precisely this mechanism and that the water molecules do indeed carry out an active function in the protein.
Water is as important as amino acids
This result supports the hypothesis Klaus Gerwert put forward in Nature in 2006 that protein-bound water molecules are catalytic elements just as important for the function of proteins as amino acids, the building blocks of life. Fittingly, the Bochum biophysicists have dedicated their Angewandte Chemie paper to Manfred Eigen, who also published his central thesis on proton transfer in water in Angewandte Chemie, in 1964. Klaus Gerwert was inspired by Manfred Eigen's winter seminars in Klosters.
Film instead of fixed image
The Bochum researchers were able to achieve their results in an interdisciplinary approach through a combination of molecular biology, X-ray crystallography, time-resolved FTIR spectroscopy and biomolecular simulations. This combination shows the dynamic processes in the protein after light excitation with atomic resolution. "You can track how the proton is transported from the central proton binding site inside the protein via an amino acid and then via a protonated water cluster to the membrane surface," says Prof. Gerwert. The interdisciplinary approach is now expanding the classical methods of structural biology, X-ray crystallography and nuclear magnetic resonance spectroscopy (NMR), as it provides a complete film and not just fixed images of proteins. The experiments in Bochum were supplemented by computer simulations in Shanghai. Klaus Gerwert is both a professor at the RUB and Director of the Max Planck Partner Institute for Computational Biology in Shanghai.

Metal-Mining Bacteria Are Green Chemists


Bacterial cells that can accumulate high quantities of precious metals are an efficient and green alternative to traditional recycling methods. Here, E. coli cells are surrounded by nanoparticles of palladium and gold (black deposits)
Microbes could soon be used to convert metallic wastes into high-value catalysts for generating clean energy, say scientists writing in the September issue of Microbiology.
Researchers from the School of Biosciences at the University of Birmingham have discovered the mechanisms that allow the common soil bacterium Desulfovibrio desulfuricans to recover the precious metal palladium from industrial waste sources.
Palladium is one of the platinum group metals (PGMs), which are among the most precious resources on Earth. Their exceptional chemical properties give them a wide variety of applications: PGMs are routinely used in many catalytic systems and are the active elements of the automotive catalytic converters that reduce harmful exhaust emissions.

Dr Kevin Deplanche who led the study explained why new ways of recovering PGMs are needed. "These metals are a finite resource and this is reflected in their high market value," he said. "Over the last 10 years, demand has consistently outstripped supply and so research into alternative ways of recovering palladium from secondary sources is paramount to ensuring future availability of this resource."
Previous work in the team's lab showed that Desulfovibrio desulfuricans was able to reduce palladium in industrial wastes into metallic nanoparticles with biocatalytic activity. Now, the precise molecules involved in the reduction process have been identified. Hydrogenase enzymes located on the surface membrane of the bacterium carry out the reduction of palladium, which results in the accumulation of catalytic nanoparticles. The bacterial cells coated with palladium nanoparticles are known as 'BioPd'.
The group believes that BioPd has great potential to be used for generating clean energy. "Research in our group has shown that BioPd is an excellent catalyst for the treatment of persistent pollutants, such as the chromium used in the paint industry. BioPd could even be used in a proton exchange fuel cell to make clean electricity from hydrogen," said Dr Deplanche. "Our ultimate aim is to develop a one-step technology that allows for the conversion of metallic wastes into high-value catalysts for green chemistry and clean energy generation."

New Material May Reveal Inner Workings of Hi-Temp Superconductors

Measurements taken at the National Institute of Standards and Technology (NIST) may help physicists develop a clearer understanding of high-temperature superconductors, whose behavior remains in many ways mysterious decades after their discovery. A new copper-based compound exhibits properties never before seen in a superconductor and could be a step toward solving part of the mystery.
Copper-based high-temperature superconductors are created by taking a nonconducting material called a Mott insulator and either adding or removing some electrons from its crystal structure. As the quantity of electrons is raised or lowered, the material undergoes a gradual transformation to one that, at certain temperatures, conducts electricity utterly without resistance. Until now, all materials that fit the bill could only be pushed toward superconductivity either by adding or removing electrons -- but not both.
However, the new material tested at the NIST Center for Neutron Research (NCNR) is the first one ever found that exhibits properties of both of these regimes. A team of researchers from Osaka University, the University of Virginia, the Japanese Central Research Institute of Electric Power Industry, Tohoku University and the NIST NCNR used neutron diffraction to explore the novel material, known only by its chemical formula of YLBLCO.
The material can only be made to superconduct by removing electrons. But if electrons are added, it also exhibits some properties only seen in those materials that superconduct with an electron surplus -- hinting that scientists may now be able to study the relationship between the two ways of creating superconductors, an opportunity that was unavailable before this "ambipolar" material was found.
The results are described in detail in a "News and Views" article in the August 2010 issue of Nature Physics, "Doped Mott insulators: Breaking through to the other side."

The rechargeable LED lightglobe

Didn't Uncle Fester do this years ago?

Now here's something we've never seen before – a rechargeable lightglobe. Chinese company Magic Bulb has patented a new type of device which incorporates a battery and an LED lightglobe to produce a globe that uses only 4 watts but produces the equivalent light of a traditional 50W globe. If the power fails, the globe will keep running for around three hours, or it can be screwed out of its socket and the handle extended to turn it into a bright torch.
The Magic Bulb was on show in the China section of IFA in Berlin this week and is expected to retail for between US$30 and $40 when it finds distribution in other parts of the world. Does it have a significant and viable point-of-difference to other globes – you betcha!
It's a set-and-forget solution that will almost certainly come in handy the next time the electricity goes down. It has a life of 20,000 hours, saves over 70 percent of the power used by an equivalent-brightness 50W filament globe, and meets all the international standards.
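Taking the quoted figures at face value (4 W draw, 50 W incandescent equivalence, roughly three hours of backup, a 20,000-hour life) and assuming a constant 4 W draw in both mains and torch mode, the implied battery size and lifetime savings work out as follows. This is a back-of-the-envelope sketch, not manufacturer data:

```python
# Back-of-the-envelope arithmetic from the quoted Magic Bulb figures.
LED_WATTS = 4.0            # claimed draw of the bulb
FILAMENT_WATTS = 50.0      # claimed equivalent incandescent globe
BACKUP_HOURS = 3.0         # claimed battery runtime
LIFETIME_HOURS = 20_000.0  # claimed service life

# Minimum usable battery energy to sustain the quoted backup time
battery_wh = LED_WATTS * BACKUP_HOURS  # 12 Wh

# Energy saved versus a 50 W filament globe over the rated lifetime
saved_kwh = (FILAMENT_WATTS - LED_WATTS) * LIFETIME_HOURS / 1000.0  # 920 kWh

# Fractional power saving
saving_pct = (1.0 - LED_WATTS / FILAMENT_WATTS) * 100.0  # 92%

print(battery_wh, saved_kwh, saving_pct)
```

By this arithmetic the straight power saving is about 92 percent, comfortably above the "over 70 percent" quoted, which may account for losses or different test conditions.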

Life Cycle Analysis of Electric Car Shows Battery Has Only Minor Impact

Nissan LEAF batteries. Photo: Michael Graham Richard


Speaking of lithium-ion batteries, a recent life-cycle analysis (a type of study that aims to find the complete environmental impact of something, taking into account manufacturing, usage, and disposal) of the lithium-ion batteries used in electric cars had some very interesting findings. It turns out that batteries have an even lower impact than most of us thought. Read on for the details.


Lithium's the Least of Our Problems
The LCA study finds that the environmental burden caused by the lithium-ion battery is at most 15% of the total impact of the electric car (which includes making it, using and maintaining it, and disposing of it at the end of its useful life). Interestingly, the lithium itself represents just a small part of that: about 7.5% of the impact occurs when "refining and manufacturing the battery's raw materials, copper and aluminium". The lithium itself is responsible for only 2.3% of the total.
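The study's percentages fit together as a small sketch (illustrative only; the whole-car life-cycle impact is normalized to 100 arbitrary units, and the shares are the ones quoted above):

```python
# Decomposing the LCA shares quoted in the study.
TOTAL = 100.0               # whole-car life-cycle impact, arbitrary units

battery_share = 0.15        # battery: at most 15% of the total
raw_material_share = 0.075  # refining/manufacturing Cu and Al: ~7.5%
lithium_share = 0.023       # lithium itself: 2.3% of the total

battery_impact = TOTAL * battery_share  # 15.0 units
rest_of_car = TOTAL - battery_impact    # 85.0 units

# Within the battery's own share, lithium is a minor contributor
lithium_within_battery = lithium_share / battery_share

print(battery_impact, rest_of_car, f"{lithium_within_battery:.0%}")
```

In other words, even inside the battery's worst-case 15% slice, the lithium accounts for only about a seventh of the burden; the rest of the car dominates everything else.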
This is a strong argument for making extra sure to recycle batteries properly at the end of their lives, and for using non-virgin sources of materials whenever possible (recycled aluminium has a much lower impact: "Recycling scrap aluminium requires only 5% of the energy used to make new aluminium.")
"Lithium-ion rechargeable batteries are not as bad as previously assumed," according to Dominic Notter, coauthor of the study which has just been published in the scientific journal "Environmental Science & Technology."
Lithium Shortage? Not Anytime Soon
As for those who worry about lithium supply: Bolivia alone has enough of it for billions of electric cars, and it holds only about a third of known world supplies (if there's ever a big shortage, higher prices would probably lead to new discoveries). Moreover, lithium isn't destroyed when used in a battery, so it can be recycled and reused.
Further Reducing Impact of Batteries
The total impact of EV batteries can be further reduced if at the end of their "vehicle" life they are used for other forms of energy storage. Indeed, they can still hold up to 80% of their charge even after having been used for years in a vehicle, so before recycling them, they could be used to store power on the grid (such as intermittent power from wind farms).
But the most important thing when it comes to electric cars will be to clean up the power grid. That's the best bang for the buck when it comes to fighting global warming; a cleaner source of power for homes and industry combined with an electrified transportation sector would drastically cut greenhouse gas emissions.

Breaking Up Phosphorus With Ultraviolet Light May Offer a Safer, Simpler Way to Build Many Industrial and Household Chemicals


Researchers have developed a new way to attach phosphorus to organic compounds by first splitting the phosphorus with ultraviolet light
Phosphorus, a mineral element found in rocks and bone, is a critical ingredient in fertilizers, pesticides, detergents and other industrial and household chemicals. Once phosphorus is mined from rocks, getting it into these products is hazardous and expensive, and chemists have been trying to streamline the process for decades.
MIT chemistry professor Christopher Cummins and one of his graduate students, Daniel Tofan, have developed a new way to attach phosphorus to organic compounds by first splitting the phosphorus with ultraviolet light. Their method, described in the Aug. 26 online edition of Angewandte Chemie, eliminates the need for chlorine, which is usually required for such reactions and poses health risks to workers handling the chemicals.
Guy Bertrand, chemistry professor at the University of California at Riverside, says the beauty of the discovery is its simplicity. "It is amazing to realize that nobody thought earlier about such a simple approach to incorporate phosphorus into organic molecules," he says. "Such a synthetic approach to organophosphorus compounds is indeed urgent, since the old (chlorine)-based phosphorus chemistry has a lot of undesirable consequences on our environment."
While the new reaction cannot produce the quantities needed for large-scale production of phosphorus compounds, it opens the door to a new field of research that could lead to such industrial applications, says Bertrand, who was not involved in the research.

Extracting phosphorus
Most natural phosphorus deposits come from fossilized animal skeletons, which are especially abundant in dried-up seabeds. Those phosphorus deposits exist as phosphate rock, which usually includes impurities such as calcium and other metals that must be removed.
Purifying the rock produces white phosphorus, a molecule containing four phosphorus atoms. White phosphorus is tetrahedral, meaning it resembles a four-cornered pyramid in which each corner atom is bound to the other three. Known as P4, white phosphorus is the most stable form of molecular phosphorus. (There are also several polymeric forms, the most common of which are black and red phosphorus, which consist of long chains of broken phosphorus tetrahedra.)
For most industrial uses, phosphorus has to be attached one atom at a time, so single atoms must be detached from the P4 molecule. This is usually done in two steps. First, P4 is reacted with chlorine, breaking the tetrahedron apart and yielding PCl3 -- a phosphorus atom bound to three chlorine atoms.
Those chlorine atoms are then displaced by organic (carbon-containing) molecules, creating a wide variety of organophosphorus compounds such as those found in pesticides. However, this procedure is both wasteful and dangerous -- chlorine gas was used as a chemical weapon during World War I -- so chemists have been trying to find new ways to bind phosphorus to organic compounds without using chlorine.
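In balanced form, the conventional route sketched above corresponds to standard phosphorus chemistry (these equations are not spelled out in the article; R stands for a generic organic group, delivered here via a Grignard reagent as one common example):

```latex
% Step 1: chlorination of white phosphorus
\mathrm{P_4} + 6\,\mathrm{Cl_2} \longrightarrow 4\,\mathrm{PCl_3}

% Step 2: displacement of chlorine by organic groups
\mathrm{PCl_3} + 3\,\mathrm{RMgCl} \longrightarrow \mathrm{PR_3} + 3\,\mathrm{MgCl_2}
```

Every equivalent of PCl3 consumed this way generates chloride-containing byproducts, which is exactly the waste stream the new chlorine-free method avoids.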
A new reaction
Cummins has long been fascinated with phosphorus, in part because of its unusual tetrahedral P4 formation. Phosphorus is in the same column of the periodic table as nitrogen, whose most stable form is N2, so chemists expected that phosphorus might form a stable P2 structure. However, that is not the case.
For the past few years, Cummins' research group has been looking for ways to break P4 into P2 in hopes of attaching the smaller phosphorus molecule to organic compounds. In the new study, Cummins drew inspiration from a long overlooked paper, published in 1937, which demonstrated that P4 could be broken into two molecules of P2 with ultraviolet light. In that older study, P2 then polymerized into red phosphorus.
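The 1937 observation that inspired the work can be written as a simple photodissociation (hν denotes a UV photon; this equation is a summary of the text, not taken from the paper):

```latex
\mathrm{P_4} \xrightarrow{\;h\nu\ (\text{UV})\;} 2\,\mathrm{P_2}
```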
Cummins decided to see what would happen if he broke apart P4 with UV light in the presence of organic molecules that have an unsaturated carbon-carbon bond (meaning those carbon atoms are able to grab onto other atoms and form new bonds). After 12 hours of UV exposure, he found that a compound called a tetra-organo diphosphane had formed, which includes two atoms of phosphorus attached to two molecules of the organic compound.
This suggests, but does not conclusively prove, that P2 forms and then immediately bonds to the organic molecule. In future studies, Cummins hopes to directly observe the P2 molecule, if it is indeed present.
Cummins also plans to investigate what other organophosphorus compounds can be synthesized with ultraviolet light, including metallic compounds. He has already created a nickel-containing organophosphorus molecule, which could have applications in electronics.