Thursday, October 21, 2010

Inhaling Nitric Oxide Eases Pain Crises in Sickle Cell Patients, Researchers Find


Dr. C. Alvin Head is chairman of the Department of Anesthesiology at the Medical College of Georgia School of Medicine. 

Inhaling nitric oxide appears to safely and effectively reduce pain crises in adults with sickle cell disease, researchers report.
A study of 18 patients in Atlanta, Chicago and Detroit showed that the nine inhaling nitric oxide for four hours had better pain control than those receiving only the standard self-administered morphine, said Dr. C. Alvin Head, chairman of the Department of Anesthesiology at the Medical College of Georgia School of Medicine.
"This study shows that you can breathe the gas and have less pain, which is the major reason sickle cell patients are admitted to the hospital," said Head, corresponding author of the study published in the American Journal of Hematology.
A larger study will help define the optimal dose as well as timing and duration for the treatment, Head said. If findings continue to hold, he envisions sickle cell patients, much like asthmatics, having nitric oxide inhalers handy to forestall a full-blown pain crisis. The pain results when sickle cell patients' abnormally shaped hemoglobin impedes oxygen delivery.
"By the time you see a patient in the emergency room or the clinic, they have a significant amount of pain and you are always playing catch-up," Head said. "The idea would be to use this as early as possible."
While it's not certain how nitric oxide helps, Head has laboratory evidence and some early clinical indications that nitric oxide, which has a great affinity for hemoglobin, restores hemoglobin's natural shape and charge. The more normal negative charge helps cells repel each other, melts sticky polymers and may prevent new ones from forming. In fact, he suspects that one of nitric oxide's usual duties in the body is to help prevent clot formation.
"If you have pain relief without more narcotic then we must be attacking the problem," Head noted. The study participants receiving nitric oxide use slightly less morphine than the control group and continued to experience pain relief two hours after the therapy ended. No patients showed signs of nitric oxide toxicity.
Head suspects morphine will eventually be replaced by a mix of other drugs, such as nitric oxide, that address the pain's root cause.
He is planning human and animal studies to see if extremely low doses of nitric oxide during pregnancy also can improve delivery rates. "We think it will be productive so the mother has fewer crises, less stress, more blood flow to the placenta and an improved chance of the baby being delivered," Head said.

Plastics and Nanoparticles -- The Perfect Combination

 
When combined with plastics, these surface-modified carbon nanotubes can, for example, improve an aircraft’s protection against lightning strikes.

These days, plastic components are vital to many fields of industry -- lightweight construction, automobile manufacturing and electrical engineering, to name but a few. Now researchers have found ingenious ways to combine plastics with nanoparticles and endow them with new properties. Thanks to these innovative materials, aircraft could in future be better protected against lightning strikes.
Picture the scene: Pitch-black clouds gathering on the horizon, an aircraft winging its way towards the storm … and suddenly a flash of white-hot lightning splits the sky. It is by no means a rare occurrence for aircraft to have to pass through bad weather fronts, but when they do, there is always one major danger -- lightning. Naturally, aircraft manufacturers do everything they can to protect their machines against strikes, but even aircraft made of aluminum do not always escape entirely unscathed. And when polymer components -- usually carbon fiber reinforced plastics (CFRPs) -- are incorporated into the design as a weight-saving measure, the situation becomes even more problematic, because they do not conduct electrical current as well as aluminum.
At the Fraunhofer Institute for Manufacturing Technology and Advanced Materials IFAM in Bremen, researchers have now developed a process for manufacturing new materials that should afford aircraft better protection against lightning strikes. They have been focusing on the unique material properties of carbon nanotubes (CNTs). CNTs are among the stiffest and strongest fibers known, and have particularly high electrical conductivity. In order to transfer their properties to CFRPs, the scientists have been combining these nanoparticles with plastics. "By mixing nanoparticles with plastics, we've been able to significantly enhance the material properties of the latter," states Dr. Uwe Lommatzsch, project manager at the IFAM. To give just two examples, CNTs are being used to optimize the electrical conductivity of plastics, and their heat dissipation properties are likewise being improved by the addition of metal particles.
The trick is in the mixing process, says Lommatzsch: "The micro- or nanoparticles must be highly homogeneous, and sometimes very closely bound to the polymer." To do this, the scientists employ plasma technology. They use an atmospheric plasma to alter the surface of the particles in such a way that they can be more readily chemically bound with the polymer. A pulsed discharge in a reaction chamber creates a reactive gas. Lommatzsch's colleague, Dr. Jörg Ihde, explains: "We spray the particles -- i.e. the nanotubes -- into this atmospheric plasma." They immediately fall into the selected solvent, which can then be used to further process the polymer. The whole procedure takes just a few seconds -- a huge advantage over the old method, in which CNTs were generally prepared in an acid bath using a wet chemical process. That took several hours or days, required considerably more chemicals, and generated significantly more waste.
In addition to improved carbon fiber reinforced plastics for use in aircraft manufacturing, the IFAM researchers have several other potential applications in mind. Ihde outlines an example: "We can increase the heat dissipation properties of electrical components by giving metal particles of copper or aluminum an electrically insulating coating in the plasma and then mixing them into a polymer." This can be pressed onto an electronic component so heat is dissipated directly. "Overheating of elements is a major problem in the electronics industry," he adds. The researchers have also devised a way to reduce electromagnetic losses by using this plasma process to coat soft magnetic particles such as iron and then combining them with plastics. Built into electric motors, they cut eddy current losses, thus improving efficiency and lengthening service life. IFAM experts will be exhibiting surface-modified carbon nanotubes -- which demonstrate significantly enhanced miscibility with solvents -- at the K 2010 trade fair in Düsseldorf, from October 27 through November 3.

Unexpected Magnetic Order Among Titanium Atoms Discovered


Abstract representation of crystalline layers.

Theoretical work done at the Department of Energy's Oak Ridge National Laboratory has provided a key to understanding an unexpected magnetism between two dissimilar materials.
 The results, published in Nature Communications, have special significance for the design of future electronic devices for computations and telecommunications, according to co-author Satoshi Okamoto of ORNL's Materials Science and Technology Division. The work was performed at Universidad Complutense de Madrid, synchrotron radiation facilities in France and Japan, University of Bristol and University of Warwick.
"What the team found was an unexpected magnetic order among the titanium atoms at an interface between strontium titanate and lanthanum manganite, which are both insulators in bulk," Okamoto said.
With today's nano-fabrication tools, scientists can develop artificial materials with controlled precision -- almost atom by atom -- by alternating very thin crystalline layers of different materials. The properties of these materials are determined by the structure of the interfaces between the different materials and by how atoms interact across those interfaces.
Such an interface has traditionally been considered a source of disorder, but in the case of materials such as complex oxides used for this study, the result was something that does not exist naturally in any other material. In order to clarify the electronic properties of such interfaces, the research team made detailed synchrotron X-ray measurements.
"The result was even more surprising as we observed a new type of magnetism in titanium atoms, which are non-magnetic in bulk strontium titanate," Okamoto said.
Furthermore, the researchers were able to manipulate the structure of spin, or magnetism, at the atomic scale. The theoretical work by Okamoto provided the key to understanding the origin of this novel form of interfacial magnetism and is of particular importance for the development of new spintronic devices such as tunneling magnetoresistance junctions, which can serve as the read heads of hard-disk drives.
While today's electronic devices are based on the transfer of electrical charge between two materials, a potential alternative, spintronic devices, would also use the magnetic moment, or spin, of electrons in addition to their charge and would therefore be more efficient for sending or storing information as an electric signal.
The research, published Sept. 21, was led by Jacobo Santamaria of Universidad Complutense de Madrid. Funding was provided by the Spanish Ministry of Science and Innovation. Work at ORNL was supported by DOE's Office of Basic Energy Sciences.

With a Chaperone, Copper Breaks Through: Research Identifies Features of Copper Transfer That May Improve Chemotherapy Treatments


Information on proteins is critical for understanding how cells function in health and disease. But while regular proteins are easy to extract and study, it is far more difficult to gather information about membrane proteins, which are responsible for exchanging elements essential to our health, like copper, between a cell and its surrounding tissues.
Now Prof. Nir Ben-Tal and his graduate students Maya Schushan and Yariv Barkan of Tel Aviv University's Department of Biochemistry and Molecular Biology have investigated how a type of membrane protein transfers essential copper ions throughout the body. This mechanism, Schushan says, could also be responsible for how the body absorbs Cisplatin, a common chemotherapy drug used to fight cancer. In the future, this new knowledge may allow scientists to improve the way the drug is transferred throughout the body, she continues.
Their breakthrough discovery was detailed in a recent issue of the Proceedings of the National Academy of Sciences.
Cellular gatekeepers and chaperones
Most proteins are water soluble, which allows for easy handling and study. But membrane proteins reside in the greasy membrane that surrounds a cell. If researchers attempt to study them with the standard approach of solubilizing them in water, they are destroyed -- and can't be studied.
Copper, which is absorbed into the body through a membrane protein, is necessary to the healthy functioning of the human body. A deficiency can give rise to disease, while loss of regulation is toxic. Therefore, the cell handles copper ions with special care. One chaperone molecule delivers the copper ion to an "entrance gate" outside the cell; another chaperone then picks it up and carries it to various destinations inside the cell.
The researchers suggest that this delicate system is maintained by passing one copper ion at a time by the copper transporter, allowing for maximum control of the copper ions. "This way, there is no risk of bringing several copper ions into the protein at the same time, which ultimately prevents harmful chemical reactions between the ions and the abundant chemical reagents within the cell," explains Prof. Ben-Tal. Once the ion has passed through the transporter into the cell, the transporter is ready to receive another copper ion if necessary.
Improving cancer drugs -- and more
The mechanism which transfers copper throughout the body may also be responsible for the transfer of the common chemotherapy drug Cisplatin. By studying how copper is transferred throughout the body, researchers may also gain a better understanding of how this medication and others are transferred into the cell.
With this information, says Prof. Ben-Tal, scientists could improve the transfer of the drug throughout the body, or develop a more effective chemotherapy drug. And that's not the only pharmaceutical dependent on the functioning of membrane proteins. "Sixty percent of drugs target membrane proteins," he explains, "so it's critical to learn how they function."
This work was done in collaboration with Prof. Turkan Haliloglu from Bogazici University, Istanbul.

Tuesday, October 19, 2010

Bioneers 2010: Gary Hirshberg Explains How To Scale Up Sustainable Foods


Gary Hirshberg, the "CE-Yo" of the organic yogurt company Stonyfield Farms, advocates that businesses can be part of the solution when it comes to sustainability. It's a point he has successfully lived out through his company for years, and at Bioneers he argued that it is possible to scale up organic foods, feeding the masses while staying sustainable. It's a contentious point and one of hot debate among, well, most anyone, but during his talk he landed on five essential elements of making organic mainstream.
We're big fans of how Stonyfield Farms is run. The company puts sustainability before profits every single time. But the best part is that profits are never sacrificed -- the business is growing hand over fist because it is run with the environment as a priority.
At this year's Bioneers, Hirshberg gave a presentation noting that it's often considered impossible to be a big business and have a minimal environmental footprint. He countered that notion with examples: Stonyfield Farms has had zero sludge from its wastewater facility since it coughed up the extra $600,000 needed to install an anaerobic facility instead of a traditional wastewater plant, and it earned that money back within the first nine months, on top of the benefit of zero waste. The company also supports 1,750 organic farmers through its milk purchases, and fair-wage co-ops through its banana and sugar purchases. And the fact that it created 46 new jobs while every one of its non-organic competitors laid off workers during the recession underscored that sustainable practices are profitable for both business and nature.
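As a rough check on that payback claim, here is the implied monthly savings using only the figures quoted in the paragraph above (illustrative arithmetic, not numbers reported by Stonyfield):

# Illustrative payback arithmetic based on the figures quoted above.
extra_cost_usd = 600_000      # added cost of the anaerobic facility
payback_months = 9            # reported payback period

implied_monthly_savings = extra_cost_usd / payback_months
print(f"Implied savings: about ${implied_monthly_savings:,.0f} per month")  # ~$66,667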
But perhaps the biggest point that Hirshberg wanted to drive home is that this type of business practice is scalable -- it doesn't have to be just the small businesses that are green. He named five must-do items for scaling up.
1) Be activists where we shop
Hirshberg stated that consumers have to drive the demand, so we all must do our research and make sure that we're buying the greenest product (which might not always be the local product).
2) Recycling means we've failed
Businesses have to figure out how to reduce and reuse so that recycling is unnecessary.
3) Organic is not just for the elite
Organic foods often seem like they're only available to those with enough money to buy them, but Hirshberg is adamant that it doesn't need to be this way. We need to make organic foods affordable for everyone.
4) Design sustainable products and packaging
Hirshberg noted that Stonyfield Farms recently switched all of its packaging to plant-based plastic. While that reduces the company's oil consumption, he said, corn isn't a perfect option. So the company counters its footprint with GMO offsets, which go toward literally paying GMO corn growers to switch to non-GMO corn.
5) Engage in politics
Hirshberg pointed out that the five largest agriculture interests spent $28 billion on lobbying since 2008. Organic businesses have to get active too, pushing for the regulations that protect the environment and businesses together. He also noted that we have to become more open source -- Stonyfield Farms keeps no secrets, letting their competitors know their moves because they feel this will lead to faster advances on sustainable practices.
It might be easier said than done, but the principles Hirshberg set out are a good starting point for areas businesses can focus on if they want to help boost organic products in the marketplace without losing sight of the bigger goal of going organic. Stonyfield Farms is a perfect example of how a business can grow not only its own bottom line, but the bottom line of other sustainable businesses and the health of the environment.


NASA Technology Could Aid in Interpretation of Mammograms, Ultrasound, Other Medical Imagery


The left image shows an original mammogram before MED-SEG processing. The image on the right, with region of interest (white) labeled, shows a mammogram after MED-SEG processing.
NASA software used to enhance Earth Science Imagery could one day aid in the interpretation of mammograms, ultrasounds and other medical imagery.
The new MED-SEG system, developed by Bartron Medical Imaging, Inc., a small Connecticut-based company with satellite offices in Maryland, relies on an innovative software program developed at NASA's Goddard Space Flight Center in Greenbelt, Md., to help doctors analyze mammograms, ultrasounds, digital X-rays, and other medical imaging tests.
"The use of this computer-based technology could minimize human error that occurs when evaluating radiologic films and might allow for earlier detection of abnormalities within the tissues being imaged," said Dr. Thomas Rutherford, a medical doctor and director of gynecologic oncology at Yale University.
The FDA recently cleared the system to be used by trained professionals to process images. These images can be used in radiologist's reports and communications as well as other uses, but the processed images should not be used for primary image diagnosis.
The full indications for use read: MED-SEG is a software device that receives medical images and data from various imaging sources (including but not limited to CT, MR, US and RF units), computed and direct radiographic devices, and secondary capture devices (scanners, imaging gateways or imaging sources). Images and data can be stored, communicated, processed and displayed within the system or across computer networks at distributed locations.
The core of Bartron's MED-SEG system is a computer algorithm -- Hierarchical Segmentation Software (HSEG) -- developed by Goddard Computer Engineer James C. Tilton, Ph.D.
Tilton began working on his algorithm more than 25 years ago. His goal was to advance a totally new approach for analyzing digital images, which are made up of thousands of pixels. Like a single piece of a jigsaw puzzle, a pixel often does not provide enough information about where it fits into the overall scene. To overcome this deficiency, Tilton focused on an approach called image segmentation, which organizes and groups an image's pixels together at different levels of detail. Tilton's approach differs from other image segmentation methods in that it not only finds region objects, but also groups spatially separated region objects together into region classes.
For example, an Earth satellite image may contain several lakes of different depths. Deep lakes appear dark blue, while shallow lakes are a lighter shade of blue. HSEG first finds each individual lake; then it groups all the shallow lakes into one class and the deeper lakes into another. Because lakes are more similar to one another than they are to trees, grass, roads, buildings, and other objects, the software then groups all lakes together, regardless of their varying colors. As a result, HSEG allows the user to distinguish important features in the scene accurately and quickly.
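A toy sketch of that two-step idea is shown below: first grow spatially connected regions of similar pixels, then group separated regions with similar mean values into classes. This is only an illustration of the concept, not Tilton's HSEG implementation; the function name, tolerances and test image are invented for the example.

# Toy sketch of hierarchical segmentation in the spirit of HSEG (illustrative only).
# Step 1: grow spatially connected regions of similar pixels.
# Step 2: group spatially separated regions with similar mean values into classes.
import numpy as np
from scipy import ndimage

def toy_hseg(image, region_tol=0.1, class_tol=0.15):
    # Step 1: label connected components of pixels quantized by region_tol.
    quantized = np.round(image / region_tol).astype(int)
    regions = np.zeros_like(quantized)
    next_label = 1
    for value in np.unique(quantized):
        labeled, n = ndimage.label(quantized == value)
        regions[labeled > 0] = labeled[labeled > 0] + next_label - 1
        next_label += n

    # Step 2: merge spatially separated regions whose mean intensities are
    # within class_tol of each other into a single region class.
    labels = np.unique(regions)
    means = {lab: image[regions == lab].mean() for lab in labels}
    classes, class_means = {}, []
    for lab in sorted(labels, key=lambda l: means[l]):
        for ci, cm in enumerate(class_means):
            if abs(means[lab] - cm) <= class_tol:
                classes[lab] = ci
                break
        else:
            classes[lab] = len(class_means)
            class_means.append(means[lab])
    class_map = np.vectorize(classes.get)(regions)
    return regions, class_map

# Example: two "shallow lakes" (0.55, 0.60) and one "deep lake" (0.20) on a
# bright background (1.0). The region map separates all four patches; the
# class map groups the two shallow lakes together.
img = np.ones((8, 8))
img[1:3, 1:3] = 0.55   # shallow lake A
img[5:7, 5:7] = 0.60   # shallow lake B
img[1:3, 5:7] = 0.20   # deep lake
regions, classes = toy_hseg(img)
print(regions)
print(classes)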
Since Tilton developed the algorithm, scientists have used it to analyze Earth-imaging data from NASA's Landsat and Terra spacecraft, using it to improve the accuracy of snow and ice maps produced from the data. Scientists also have used it to find potential locations for archeological digs, the premise being that vegetation covering a long-abandoned human settlement would look different than the surrounding flora.
"My original concept was geared to Earth science," Tilton said. "I never thought it would be used for medical imaging." In fact, he initially was skeptical; that is, until he processed cell images and was able to see details not visible in unprocessed displays of the image. "The cell features stood out real clearly and this made me realize that Barton was onto to something."
Bartron learned of the software through Goddard's Innovative Partnerships Program (IPP) Office, and in 2003 licensed the patented technology to create a system that would differentiate hard-to-see details in complex medical images.
Bartron's exclusive license of NASA's HSEG technologies in the medical imaging field allows the company to contribute to the work of doctors who analyze images obtained from computed tomography (CT), magnetic resonance imaging (MRI), ultrasound, radio frequency, and other imaging sources.
"Trained professionals can use the MED-SEG system to separate two-dimensional images into digitally related sections or regions that, after colorization, can be individually labeled by the user," explained Fitz Walker, president and CEO of Bartron Medical Imaging.
With the MED-SEG system, medical centers will be able to send images via a secure Internet connection to a Bartron data center for processing by the company's imaging application. The data are then sent back to the medical center for use by medical personnel during diagnosis. Bartron has installed the system at the University of Connecticut Health Center, with the possibility of installing evaluation systems at New York University Medical Center, Yale-New Haven Medical Center, and the University of Maryland Medical Center.
Through a Cooperative Research and Development Agreement, Tilton also worked with the company to develop, test, and document a new, three-dimensional version of HSEG, which the company plans to incorporate into the next version of the MED-SEG product.
In the future, Dr. Molly Brewer, a professor in the Division of Gynecologic Oncology at the University of Connecticut Health Center, would like to conduct clinical trials with the MED-SEG system. The goal, she said, would be improving mammography as a diagnostic tool for detecting breast cancer. "One problem with mammograms is they often give a false negative for detecting abnormalities in women's breasts. Women who either have high breast density or a strong family history of breast cancer are often sent for MRIs, which are costly, very uncomfortable and have a high false positive rate, resulting in many unnecessary biopsies. Neither imaging modality can detect cancers without a significant number of inaccuracies, either missing cancer or overcalling cancer. In addition, reading these tests relies on detecting differences in density, which is highly subjective. The MED-SEG processes the image, allowing a doctor to see a lot more detail in a more quantitative way. This new software could save patients a lot of money by reducing the number of costly and unnecessary tests."
Tilton's technology is not limited to medical imaging, said Enidia Santiago-Arce, the IPP technology manager for HSEG. "It can be applied to many types of image processing for a wide variety of fields, from monitoring crops to facial recognition to image data mining. HSEG is available for licensing beyond the field of medical imaging."
NASA is releasing this story in conjunction with Breast Cancer Awareness Month.

Studies of Radiative Forcing Components: Reducing Uncertainty About Climate Change


This figure from the IPCC summarises the IPCC’s conclusions on radiative forcing by anthropogenic drivers with a warming (in red) and cooling (in blue) effect on the Earth’s climate from 1750 to 2005. Note the error bars for each column. These indicate the assessed level of uncertainty assigned to each factor by the IPCC in 2007. The column at the far right shows the net global warming induced by humans.
Much is known about factors that have a warming effect on Earth's climate -- but only a limited amount is understood about factors that have a cooling effect. Researchers at the Center for International Climate and Environmental Research -- Oslo (CICERO) are working to fill the knowledge gap.
Gunnar Myhre has been working to reduce the level of uncertainty in projections from the Intergovernmental Panel on Climate Change (IPCC). With funding from the Research Council of Norway's NORKLIMA Programme, he and his research colleagues have studied as many radiative forcing components as possible simultaneously.
Energy balance of the Earth system
Earth's temperature is determined by the difference between incoming solar energy and outgoing thermal radiation from Earth's surface and atmosphere that escapes into space. In its assessment reports, the IPCC estimates the energy balance using various radiative forcing components on Earth's climate.
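For intuition, that balance can be written as a textbook zero-dimensional model: absorbed sunlight, S(1 - A)/4, equals emitted thermal radiation, sigma * T^4. The sketch below uses standard round numbers, not figures from this article:

# Zero-dimensional energy balance: S * (1 - A) / 4 = sigma * T_eff**4.
# Standard textbook values, not numbers taken from the article.
SOLAR_CONSTANT = 1361.0   # W/m2, incoming solar irradiance
ALBEDO = 0.30             # fraction of sunlight reflected back to space
SIGMA = 5.67e-8           # W/m2/K^4, Stefan-Boltzmann constant

absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4              # ~238 W/m2
t_eff = (absorbed / SIGMA) ** 0.25
print(f"Effective temperature without a greenhouse effect: {t_eff:.0f} K")  # ~255 K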
Unsure of cooling effect
The IPCC's Fourth Assessment Report (2007) estimates the overall radiative forcing from anthropogenic carbon dioxide in the atmosphere to be 1.66 W/m2. The combined forcing from other greenhouse gases (methane, nitrous oxide and halocarbons) was estimated at 0.98 W/m2. The IPCC assigned medium confidence to these estimates.
The IPCC was far less certain about the cooling effect of various anthropogenic radiative forcing components, particularly atmospheric aerosols (tiny, floating particulates and droplets). In 2007 the IPCC estimated that anthropogenic aerosols have a direct and indirect cooling effect, with a radiative forcing of -1.2 W/m2.
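Adding up just the components quoted above gives a back-of-the-envelope net anthropogenic forcing; note that the IPCC's full budget includes further terms (ozone, albedo changes and others) not listed here:

# Net forcing from only the components quoted above, in W/m2.
forcings_w_per_m2 = {
    "carbon dioxide": 1.66,
    "other greenhouse gases": 0.98,        # methane, nitrous oxide, halocarbons
    "aerosols, direct + indirect": -1.2,   # cooling, with large uncertainty
}
net = sum(forcings_w_per_m2.values())
print(f"Net of these components: {net:+.2f} W/m2")  # about +1.44 W/m2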
Global team effort
In recent years, scientists all around the world have taken part in a collective research effort to enhance knowledge about our planet's climate and to reduce uncertainty about the effects of various drivers of climate change. In the project Radiative forcing of climate change, Dr Gunnar Myhre and his colleagues have carried out climate modelling on a large scale with the aim of reducing the uncertainty associated with the degree of radiative forcing exerted by aerosols and ozone.
"Our objective has been to provide the IPCC with knowledge that enables them to produce better projections of climate change," says Dr Myhre.
While greenhouse gases warm our climate, and aerosols in all probability act to cool it, factors such as ozone and changes in albedo (the reflectivity of different surfaces) can work both ways.
Impact of aerosols
For many years, human activities have sent vast amounts of aerosols into the atmosphere in the form of sulphate, black carbon and organic particulates. These anthropogenic aerosols are emitted in addition to natural aerosols such as sea salt, sulphate particles in volcanic ash, and sand from the Sahara.
The Western countries have managed to lower their sulphate emissions dramatically. But China in particular is still emitting large amounts of both black carbon and sulphate, a situation causing tremendous health problems for millions of Chinese. So it is likely that within a few years, China will also focus on major reductions in emissions.
"When better sulphate abatement technology is available and emissions drop, it may lead to warming of the atmosphere," says Dr Myhre.
Can cool or warm the climate
Black carbon and sulphate are short-lived climate forcers, with atmospheric lifetimes of as little as a few weeks.
"More knowledge is needed about what will happen when atmospheric levels of black carbon and sulphate are reduced. They can either cool or warm the atmosphere -- but in all likelihood their primary effect is cooling," says Dr Myhre. In a 2009 article in the journal Science, he concluded that the amount of black carbon in the atmosphere may have increased sixfold since the industrial age began.
Less uncertainty now
Researchers in the project Radiative forcing of climate change have examined as many climate forcers as possible at the same time, and all of them were treated consistently.
"This makes it easier to understand the individual components, as well as how they affect each other and what uncertainties remain," explains Dr Myhre.
Dr Myhre has worked particularly on aerosols and ozone. He has collaborated with other modelling groups and has been involved in developing a new model for the direct effect of anthropogenic aerosols -- a model that is now proving very useful in the IPCC's work.
"Based on our findings from the radiative forcing project, we have been able to reduce the uncertainty associated with the direct effects of aerosols considerably. Their cooling effect has proved to be less than previously thought. Biomass burning may have a significant effect on the radiation budget, but the net result of the warming and cooling effects is close to zero."
Impacts still clouded by doubt
Since the publication of the IPCC's Fourth Assessment Report, climate scientists have learned more about the various anthropogenic climate forcers, but still much remains to be done, according to Gunnar Myhre.

MRI Zooms in on Microscopic Flow


Remotely detected MRI images show water flowing through a constricted microfluidic channel. Each image is a 'snapshot' of the flow at a given time of flight, and the images are shown as two-dimensional projections of the YZ (top) and XZ (lower) planes, where the constriction is in Y and the overall flow along Z.
"Better and faster results!" is the clarion call for scientists and engineers to continually strive to improve their research tools. Of the tools used to study material structures at the atomic and molecular scales, there is none finer than Nuclear Magnetic Resonance (NMR) spectroscopy and its daughter technology Magnetic Resonance Imaging (MRI). Now, the latest development from the research group of one of the world's foremost authorities on NMR/MRI technology promises NMR/MRI results that are better and faster than ever before -- a million times faster!
Through a combination of remote instrumentation, JPEG-style image compression algorithms and other key enhancements, chemist Alexander Pines and members of his research group have been able to use NMR/MRI to image materials flowing through microfluidic "lab-on-a-chip" devices and zoom in on microscopic objects of particular interest with unprecedented spatial and time resolutions. Pines holds joint appointments with the Lawrence Berkeley National Laboratory (Berkeley Lab) and the University of California (UC) at Berkeley.
"What excites me most about this new methodology is the possibility of a mobile, chip-based NMR/MRI platform for microfluidic analysis. Who knows? This might turn out to be useful for chemistry and biomedicine," says Pines, an internationally recognized leader in the development of NMR technology, who is a faculty senior scientist in Berkeley Lab's Materials Sciences Division and the Glenn T. Seaborg Professor of Chemistry at UC Berkeley
This latest work, which focused on MRI, has been reported in the journal Science in a paper titled "Zooming in on Microscopic Flow by Remotely Detected MRI." Co-authoring the paper with Pines were Vikram Bajaj, who is still a member of Pines' group, plus Jeffrey Paulsen, now of Schlumberger-Doll Research, and Elad Harel, now at the University of Chicago.
Says Bajaj, first author on the Science paper, "We have been able to conclusively demonstrate the ability to record microscopic images of flowing macroscopic objects without loss of sensitivity, something that is impossible in conventional MRI. We were also able to illustrate how MRI can be used to measure flow dynamics quantitatively and with high spatial resolution in real microfluidic devices. The spatial resolution we achieved is sufficient to capture the results of hundreds or thousands of parallel assays on a microfluidic device. Furthermore, we recorded these images approximately one million times faster than could be done with a conventional MRI experiment. This means that experiments which would have taken years to complete are now practical considerations."
NMR/MRI signals are made possible by a property found in the atomic nuclei of almost all molecules called "spin," which makes the nuclei act as if they were bar magnets. Obtaining an NMR/MRI signal depends upon an excess of nuclei in a sample with spins pointing either "north" or "south." In the signal-encoding phase of NMR/MRI, the nuclei are exposed to a magnetic field and subjected to radiofrequency pulses so that they absorb and re-emit energy at signature frequencies. In the signal-detection phase of NMR/MRI, the frequencies of the encoded signals are either directly measured to obtain a spectrum (NMR), or used to obtain a second, spatially encoded signal that can then be translated into images (MRI).
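The frequency at which a given nucleus absorbs and re-emits energy scales linearly with the applied magnetic field (the Larmor relation). This is standard NMR background rather than a detail from the article, and the field strengths below are only illustrative:

# Larmor relation: f = (gamma / 2*pi) * B. Standard NMR background.
GAMMA_PROTON_MHZ_PER_TESLA = 42.577  # proton gyromagnetic ratio / 2*pi

def proton_larmor_mhz(field_tesla):
    """Proton resonance frequency in MHz for a given magnetic field."""
    return GAMMA_PROTON_MHZ_PER_TESLA * field_tesla

print(proton_larmor_mhz(1.5))   # ~63.9 MHz, a typical clinical MRI field
print(proton_larmor_mhz(0.05))  # ~2.1 MHz, an illustrative low-field regime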
MRI has become a staple of modern medicine, providing physicians with a diagnostic tool that is noninvasive, quick, and involves no ionizing radiation that can damage cells and tissue. However, conventional MRI requires huge doughnut-shaped machines that fill an entire room and are extremely expensive to purchase and operate. In recent years, Pines and his group have taken great strides towards making NMR/MRI technology compact, portable and relatively inexpensive. It started with the decoupling of the NMR/MRI signal encoding and signal detection processes, which made remote NMR/MRI possible and opened the technology to lab-on-a-chip microfluidic assays of gases and liquids. With these new developments, Pines and his group have laid the foundation for new NMR/MRI applications in portable chemical and biomedical analysis.
"Our goal is to develop NMR/MRI appliances for portable chemical analysis of complex mixtures, including blood, urine, and saliva," Bajaj says. "Ultimately, we would like to make it possible to use NMR/MRI in point of care clinical analysis."
In their new Science paper, Pines and Bajaj and their co-authors describe how they were able to apply MRI technology to studies involving microscopic flow through microfluidic or biological channels, or through porous materials. The key was the integration of several new elements into their remote NMR/MRI configuration. This included the fabrication of microsolenoid MRI probes with demountable microfluidic device holders, the design of remote MRI sequences for spatial encoding in the presence of motion, as well as for velocimetric measurements, and the use of JPEG-style compressive sampling algorithms for accelerated image encoding.
"The combination of remote NMR/MRI methods with these new elements spectroscopically mimics the implantation of a coil around a microscopic feature of interest and allows us to zoom in on the microscopic details of microfluidic flow dynamics in three spatial dimensions," says Bajaj. "The mechanism of remote detection is analogous to that of a magnetic recording tape on which complex data are first encoded and later read out by a single stationary detector as the tape advances."
This work is supported by the U.S. Department of Energy's Office of Science, and by a gift from the Agilent Technologies Foundation.

UConn Tests Demonstrate Great Potential of Hemp Biodiesel



Some regular TreeHugger readers (and commenters, you know who you are...) are no doubt exclaiming that they have been saying the same thing for years: hemp makes great biofuel -- if only the Feds would get out of the way. Well, researchers from the University of Connecticut are nevertheless demonstrating the potential of industrial hemp as an energy crop, even if under current law it would only benefit other nations.
Hemp Biodiesel Works At Lower Temperatures Than Any Other
Professor Richard Parnas, with the help of his graduate students, produced biodiesel from virgin hemp oil and determined that not only could they convert 97% of the hemp oil into fuel, but that the biodiesel could be used at lower temperatures than any other biodiesel currently produced.
The next step, financed by a two-year, $1.8 million grant from the Department of Energy, is building a pilot biodiesel reactor capable of producing 200,000 gallons of biodiesel annually. The facility will test a variety of feedstocks in addition to hemp, as well as their economic viability.
Parnas also makes the food-versus-fuel connection, pointing out that hemp doesn't need high quality land to grow and is not as likely to compete with food crops as a result. Parnas explains, "If someone is already growing hemp, they might be able to produce enough fuel to power their whole farm from the seeds they produce."
US Missing Great Opportunity With Industrial Hemp & Marijuana Prohibition
None of which is gigantic news in itself, but it does serve to highlight the poor policy in the United States which 1) Prohibits the growing of one of the world's most ancient and useful crops (even though the finished products can be sold here), because 2) Federal lawmakers somehow can't grasp that industrial hemp is a different thing than marijuana--which is a subset of 3) a hypocritical prohibition on marijuana, even though alcohol and tobacco are perfectly legal and taxed. Soapbox put away...

Landfills To Be Mined For Fuel, Recyclables

Fluid Bed Gasifier Plant. Credit Advanced Plasma Power
In Jim Kunstler's World Made By Hand, the former motorheads in town mine the dump for resources. There is a lot of valuable stuff in dumps; it has been cheaper to bury stuff than to recycle or compost it. They are also filling up, and nobody wants new ones in their backyard. In Belgium, Advanced Plasma Power is building a plant at a huge landfill site to mine the garbage, separating out the recyclable materials and converting the rest into synthetic fuel.
Tim Webb writes in the Guardian:
The 30-year project will reuse 16.5m tonnes of municipal waste dumped since the 1960s at the landfill site near Hasselt in eastern Belgium. APP will use its plasma technology to convert the methane produced by the rubbish, which is more than 20 times more damaging to the environment than carbon dioxide, into usable gas. This will fuel a 60MW power plant capable of supplying 60,000 homes.
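Two back-of-the-envelope figures follow from the Guardian's numbers (illustrative arithmetic only):

# Rough figures derived from the numbers quoted above.
total_waste_tonnes = 16_500_000   # municipal waste to be reused
project_years = 30
plant_mw = 60
homes_supplied = 60_000

print(total_waste_tonnes / project_years)               # ~550,000 tonnes per year
print(plant_mw * 1000 / homes_supplied, "kW per home")  # ~1 kW average per home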
APP has developed a four stage process of converting waste into "a clean hydrogen-rich syngas and a vitrified recyclate called Plasmarok® that can be used as a building material or replacement aggregate." They claim it produces little or no emissions and "almost nothing is left - around 2% of input volumes - for landfill." They also claim that it has a negative carbon footprint.
It is not without risk. An environmental lawyer pointed out that there are health and safety issues: "in many cases, for older 'mature' sites, there are inexact records of what lies below."
Others point out that burning anything creates pollution, and that it is an incinerator in disguise. The Global Alliance for Incinerator Alternatives claims:
Incinerators with names like "gasification," "pyrolysis," "plasma arc," and "waste-to-energy" all emit dioxins and other harmful pollutants, despite industry claims that they are "green" technologies....The short track record of pyrolysis, plasma and gasification incinerator technologies has shown even higher costs, less dependability, and inconsistent energy generation. In addition, data show that dioxins, furans and other toxins are formed in these systems, and in some cases, toxins are formed in higher quantities than in conventional mass-burn incinerators.
But existing dumps are emitting harmful pollutants while they sit there, creating greenhouse gases through rotting, and leaking toxic leachates. Perhaps mining them for fuel and materials is the lesser of two evils.

Sunday, October 17, 2010

How to Weigh a Star Using a Moon


Artist's concept of an exoplanet and its moon transiting a sun-like star. Such a system could be used to directly weigh the star.
How do astronomers weigh a star that's trillions of miles away and way too big to fit on a bathroom scale? In most cases they can't, although they can get a best estimate using computer models of stellar structure.
New work by astrophysicist David Kipping says that in special cases, we can weigh a star directly. If the star has a planet, and that planet has a moon, and both of them cross in front of their star, then we can measure their sizes and orbits to learn about the star.
"I often get asked how astronomers weigh stars. We've just added a new technique to our toolbox for that purpose," said Kipping, a predoctoral fellow at the Harvard-Smithsonian Center for Astrophysics.
Astronomers have found more than 90 planets that cross in front of, or transit, their stars. By measuring the amount of starlight that's blocked, they can calculate how big the planet is relative to the star. But they can't know exactly how big the planet is unless they know the actual size of the star. Computer models give a very good estimate but in science, real measurements are best.
Kipping realized that if a transiting planet has a moon big enough for us to see (by also blocking starlight), then the planet-moon-star system could be measured in a way that lets us calculate exactly how large and massive all three bodies are.
"Basically, we measure the orbits of the planet around the star and the moon around the planet. Then through Kepler's Laws of Motion, it's possible to calculate the mass of the star," explained Kipping.
The process isn't easy and requires several steps. By measuring how the star's light dims when planet and moon transit, astronomers learn three key numbers: 1) the orbital periods of the moon and planet, 2) the size of their orbits relative to the star, and 3) the size of planet and moon relative to the star.
Plugging those numbers into Kepler's Third Law yields the density of the star and planet. Since density is mass divided by volume, the relative densities and relative sizes give the relative masses. Finally, scientists measure the star's wobble due to the planet's gravitational tug, known as the radial velocity. Combining the measured velocity with the relative masses, they can calculate the mass of the star directly.
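The density step relies on a widely used rearrangement of Kepler's Third Law for transiting planets, rho_star = 3*pi/(G*P^2) * (a/R_star)^3; the formula and sanity-check values below are standard, not taken from Kipping's paper:

import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def stellar_density(period_days, a_over_rstar):
    """Mean stellar density (kg/m^3) from a transiting planet's orbital period
    and the orbit size relative to the stellar radius, assuming the planet's
    mass is negligible compared to the star's."""
    period_s = period_days * 86400.0
    return (3.0 * math.pi / (G * period_s ** 2)) * a_over_rstar ** 3

# Sanity check with Earth-Sun values: recovers roughly the Sun's mean density
# of about 1,400 kg/m^3.
print(stellar_density(365.25, 215.0))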
"If there was no moon, this whole exercise would be impossible," stated Kipping. "No moon means we can't work out the density of the planet, so the whole thing grinds to a halt."
Kipping hasn't put his method into practice yet, since no star is known to have both a planet and moon that transit. However, NASA's Kepler spacecraft should discover several such systems.
"When they're found, we'll be ready to weigh them," said Kipping.
This research will appear in the Monthly Notices of the Royal Astronomical Society.

New Materials Could Replace Costly Gold in Electrical Applications


Researchers at the University of Connecticut, partnering with United Technologies Research Center engineers, have modeled and developed new classes of alloy materials for use in electronic applications that will reduce reliance on costly gold and other precious metals.
The research appears online in the October 12th issue of the journal Applied Physics Letters.
With the price of gold currently hovering around $1,340 per ounce, manufacturers across the globe, including Connecticut's United Technologies Corporation (UTC), are scrambling for alternatives to the costly noble metals that are widely used in electronic applications, including gold, platinum, rhodium, palladium and silver. What makes these metals attractive is their combination of excellent conductivity paired with resistance to oxidation and corrosion. Finding less costly but equally durable and effective alternatives is an important aim.
Mark Aindow and S. Pamir Alpay, UConn professors of materials science and engineering, and Joseph Mantese, a UTRC Fellow, have developed new classes of materials that behave much like gold and its counterparts when exposed to the oxidizing environments that degrade traditional base metals. Their research was funded by a grant from the U.S. Army Research Office.
The team has investigated nickel, copper and iron -- inexpensive materials that may offer promise. Based on their research, they have laid out the theory and demonstrated experimentally the methodology for improving the electrical contact resistance of these base metals. Aindow said, "We used a combination of theoretical analysis to select the appropriate constituents, and materials engineering at the atomic level to create designer materials."
The researchers synthesized various alloys, using inexpensive base metals. Higher conductivity native oxide scales can be achieved in these alloys through one of three processes: doping to enhance carrier concentration, inducing mixed oxidation states to give electron/polaron hopping, and/or phase separation for conducting pathways.
Their work has demonstrated an improvement in contact resistance of up to one-million-fold over that for pure base metals, so that base metal contacts can now be prepared with contact properties near those of pure gold.

Changing the Color of Single Photons Emitted by Quantum Dots


In new NIST experiments, the "color" (wavelength) of photons from a quantum dot single photon source (QD SPS) is changed to a more convenient shade using an up-conversion crystal and pump laser. To test that the process truly acts on single photons without degrading the signal (by creating additional photons), the output is split in two and sent to parallel detectors -- true single photons should not be detected simultaneously in both paths.
Researchers at the National Institute of Standards and Technology (NIST) have demonstrated for the first time the conversion of near-infrared 1,300 nm wavelength single photons emitted from a true quantum source, a semiconductor quantum dot, to a near-visible wavelength of 710 nm. The ability to change the color of single photons may aid in the development of hybrid quantum systems for applications in quantum communication, computation and metrology.
Two important resources for quantum information processing are the transmission of data encoded in the quantum state of a photon and its storage in long-lived internal states of systems like trapped atoms, ions or solid-state ensembles. Ideally, one envisions devices that are good at both generating and storing photons. However, this is challenging in practice because while typical quantum memories are suited to absorbing and storing near-visible photons, transmission is best accomplished at near-infrared wavelengths where information loss in telecommunications optical fibers is low.
To satisfy these two conflicting requirements, the NIST team combined a fiber-coupled single photon source with a frequency up-conversion single photon detector, both developed at NIST. The frequency up-conversion detector uses a strong pump laser and a special non-linear crystal to convert long-wavelength (low-frequency) photons into short-wavelength (high-frequency) photons with high efficiency and sensitivity (http://www.nist.gov/itl/antd/nir_082509.cfm).
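In such sum-frequency up-conversion, energy conservation (1/lambda_out = 1/lambda_in + 1/lambda_pump) fixes the pump wavelength needed to shift a 1,300 nm photon to 710 nm. The relation is standard nonlinear optics; the pump value below is inferred from it rather than quoted in the article:

# Energy conservation for sum-frequency generation:
# 1/lambda_out = 1/lambda_in + 1/lambda_pump.
# The pump wavelength is inferred from this relation, not quoted by NIST.
lambda_in_nm = 1300.0   # single photon emitted by the quantum dot
lambda_out_nm = 710.0   # near-visible photon after up-conversion

lambda_pump_nm = 1.0 / (1.0 / lambda_out_nm - 1.0 / lambda_in_nm)
print(f"Required pump wavelength: {lambda_pump_nm:.0f} nm")  # roughly 1560 nm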
According to Matthew Rakher and Kartik Srinivasan, two authors of the paper, previous up-conversion experiments looked at the color conversion of highly attenuated laser beams that contained less than one photon on average. However, these light sources still exhibited "classical" photon statistics exactly like those of an unattenuated laser, meaning that the photons are organized in such a way that at most times there are no photons while at other times there is more than one. Secure quantum communication relies upon the use of single photons.
"The quantum dot can act as a true single photon source," says Srinivasan. "Each time we excite the dot, it subsequently releases that energy as a single photon. In the past, we had little control over the wavelength of that photon, but now we can generate a single photon of one color on demand, transmit it over long distances with fiber optics, and convert it to another color."
Converting the photon's wavelength also makes it easier to detect, say co-authors Lijun Ma and Xiao Tang. While commercially available single photon detectors in the near-infrared suffer noise problems, detectors in the near-visible are a comparatively mature and high-performance technology. The paper describes how the wavelength conversion of the photons improved their detection sensitivity by a factor of 25 with respect to what was achieved prior to conversion.

Carbon Dioxide Controls Earth's Temperature, New Modeling Study Shows


Various atmospheric components differ in their contributions to the greenhouse effect, some through feedbacks and some through forcings. Without carbon dioxide and other non-condensing greenhouse gases, water vapor and clouds would be unable to provide the feedback mechanisms that amplify the greenhouse effect.
Water vapor and clouds are the major contributors to Earth's greenhouse effect, but a new atmosphere-ocean climate modeling study shows that the planet's temperature ultimately depends on the atmospheric level of carbon dioxide.
The study, conducted by Andrew Lacis and colleagues at NASA's Goddard Institute for Space Studies (GISS) in New York, examined the nature of Earth's greenhouse effect and clarified the role that greenhouse gases and clouds play in absorbing outgoing infrared radiation. Notably, the team identified non-condensing greenhouse gases -- such as carbon dioxide, methane, nitrous oxide, ozone, and chlorofluorocarbons -- as providing the core support for the terrestrial greenhouse effect.
Without non-condensing greenhouse gases, water vapor and clouds would be unable to provide the feedback mechanisms that amplify the greenhouse effect. The study's results are published Oct. 15 in Science.
A companion study led by GISS co-author Gavin Schmidt that has been accepted for publication in the Journal of Geophysical Research shows that carbon dioxide accounts for about 20 percent of the greenhouse effect, water vapor and clouds together account for 75 percent, and minor gases and aerosols make up the remaining five percent. However, it is the 25 percent non-condensing greenhouse gas component, which includes carbon dioxide, that is the key factor in sustaining Earth's greenhouse effect. By this accounting, carbon dioxide is responsible for 80 percent of the radiative forcing that sustains the Earth's greenhouse effect.
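The arithmetic behind those percentages is easy to restate (numbers as quoted above):

# Contributions to the present-day greenhouse effect, as quoted above (percent).
co2 = 20.0
water_vapor_and_clouds = 75.0
minor_gases_and_aerosols = 5.0
assert co2 + water_vapor_and_clouds + minor_gases_and_aerosols == 100.0

non_condensing = co2 + minor_gases_and_aerosols   # the 25% non-condensing component
co2_share_of_sustaining_forcing = co2 / non_condensing
print(f"{co2_share_of_sustaining_forcing:.0%}")   # 80%, matching the figure above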
The climate forcing experiment described in Science was simple in design and concept -- all of the non-condensing greenhouse gases and aerosols were zeroed out, and the global climate model was run forward in time to see what would happen to the greenhouse effect.
Without the sustaining support by the non-condensing greenhouse gases, Earth's greenhouse effect collapsed as water vapor quickly precipitated from the atmosphere, plunging the model Earth into an icebound state -- a clear demonstration that water vapor, although contributing 50 percent of the total greenhouse warming, acts as a feedback process, and as such, cannot by itself uphold the Earth's greenhouse effect.
"Our climate modeling simulation should be viewed as an experiment in atmospheric physics, illustrating a cause and effect problem which allowed us to gain a better understanding of the working mechanics of Earth's greenhouse effect, and enabled us to demonstrate the direct relationship that exists between rising atmospheric carbon dioxide and rising global temperature," Lacis said.
The study ties in to the geologic record in which carbon dioxide levels have oscillated between approximately 180 parts per million during ice ages, and about 280 parts per million during warmer interglacial periods. To provide perspective to the nearly 1 C (1.8 F) increase in global temperature over the past century, it is estimated that the global mean temperature difference between the extremes of the ice age and interglacial periods is only about 5 C (9 F).
"When carbon dioxide increases, more water vapor returns to the atmosphere. This is what helped to melt the glaciers that once covered New York City," said co-author David Rind, of NASA's Goddard Institute for Space Studies. "Today we are in uncharted territory as carbon dioxide approaches 390 parts per million in what has been referred to as the 'superinterglacial.'"
"The bottom line is that atmospheric carbon dioxide acts as a thermostat in regulating the temperature of Earth," Lacis said. "The Intergovernmental Panel on Climate Change has fully documented the fact that industrial activity is responsible for the rapidly increasing levels of atmospheric carbon dioxide and other greenhouse gases. It is not surprising then that global warming can be linked directly to the observed increase in atmospheric carbon dioxide and to human industrial activity in general."