Science News

I receive news from various sources. I am very happy to welcome all of you who are interested in the developments in science and technology to read and appreciate the news items which will be published in this blog.

I invite you to send feedback, if any, on these items to ksparth@gmail.com.

Tuesday, November 9, 2010

LHC update: protons to lead ions in four days flat

 

November 8, 2010

Geneva

Four days is all it took for the LHC operations team at CERN* to complete the transition from protons to lead ions in the LHC. After extracting the final proton beam of 2010 on 4 November, commissioning the lead-ion beam was underway by early afternoon. First collisions were recorded at 00:30 CET on 7 November, and stable running conditions marked the start of physics with heavy ions at 11:20 CET today.
"The speed of the transition to lead ions is a sign of the maturity of the LHC," said CERN Director General Rolf Heuer. "The machine is running like clockwork after just a few months of routine operation."

Operating the LHC with lead ions – lead atoms stripped of electrons – is completely different from operating the machine with protons. From the source to collisions, operational parameters have to be re-established for the new type of beam. For lead ions, as for protons before them, the procedure started with threading a single beam round the ring in one direction and steadily increasing the number of laps before repeating the process for the other beam.

Once circulating beams had been established they could be accelerated to the full energy of 287 TeV per beam. This energy is much higher than for proton beams because each lead ion carries the electric charge of its 82 protons, and the machine accelerates every unit of charge to the proton beam energy. Another period of careful adjustment was needed before lining the beams up for collision, and then finally declaring that nominal data-taking conditions, known at CERN as stable beams, had been established. The three experiments recording data with lead ions – ALICE, ATLAS and CMS – can now look forward to continuous lead-ion running until CERN's winter technical stop begins on 6 December.
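The 287 TeV figure follows from simple arithmetic: each unit of electric charge is accelerated to the per-proton energy, and a fully stripped lead ion carries 82 proton charges. A back-of-the-envelope check (the 3.5 TeV per-proton beam energy of the 2010 run is assumed here; the article itself does not state it):

```python
# The LHC's magnets act on electric charge, so each of the 82 proton
# charges in a fully stripped lead ion reaches the per-proton energy.
ENERGY_PER_CHARGE_TEV = 3.5   # 2010 proton beam energy in TeV (assumed)
LEAD_ATOMIC_NUMBER = 82       # protons (charges) per lead nucleus

beam_energy_tev = ENERGY_PER_CHARGE_TEV * LEAD_ATOMIC_NUMBER
print(beam_energy_tev)  # 287.0 TeV per lead-ion beam
```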

"It's been very impressive to see how well the LHC has adapted to lead ions," said Jurgen Schukraft, spokesperson of the ALICE experiment. "The ALICE detector has been optimised to record the large number of tracks that emerge from ion collisions and has handled the first collisions very well, so we are all set to explore this new opportunity at LHC."

"After a very successful proton run, we're very excited to be moving to this new phase of LHC operation," said ATLAS spokesperson Fabiola Gianotti. "The ATLAS detector has recorded first spectacular heavy-ion events, and we are eager to study them in detail."

"We designed CMS as a multi-purpose detector," said Guido Tonelli, the collaboration's spokesperson, "and it's very rewarding to see how well it's adapting to this new kind of collision. Having data collected by the same detector in proton-proton and heavy-ion modes is a powerful tool to look for unambiguous signatures of new states of matter."
Lead-ion running opens up an entirely new avenue of exploration for the LHC programme, probing matter as it would have been in the first instants of the Universe's existence. One of the main objectives for lead-ion running is to produce tiny quantities of such matter, known as quark-gluon plasma, and to study its evolution into the kind of matter that makes up the Universe today. This exploration will shed further light on the properties of the strong interaction, which binds quarks into bigger objects such as protons and neutrons.
Following the winter technical stop, operation of the collider will start again with protons in February and physics runs will continue through 2011.

                                                        **********************
*CERN, the European Organization for Nuclear Research, is the world's leading laboratory for particle physics. It has its headquarters in Geneva. At present, its Member States are Austria, Belgium, Bulgaria, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Italy, the Netherlands, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland and the United Kingdom. India, Israel, Japan, the Russian Federation, the United States of America, Turkey, the European Commission and UNESCO have Observer status.



 

 

Scientists trace nanoparticles' journey from the lungs to the body

November 8, 2010 
Boston
Nanotechnology is attracting the attention of specialists of all types. Researchers hope that their findings could help investigators develop agents for delivering drugs into the lungs and provide key information for use in air pollution control.

Using a novel, real-time imaging system, scientists have tracked a group of near-infrared fluorescent nanoparticles from the airspaces of the lungs, into the body and out again, providing a description of the characteristics and behavior of these minute particles which could be used in developing therapeutic agents to treat pulmonary disease, as well as offering a greater understanding of the health effects of air pollution.
The findings, from a team led by investigators at Beth Israel Deaconess Medical Center (BIDMC) and the Harvard School of Public Health, are described in the November 7 Advance Online issue of the journal Nature Biotechnology.
At a scale of one to 100 nanometers (nm) – a nanometer is one billionth of a meter – nanoparticles are too small to be visible through a traditional microscope. But this extremely small scale makes them potential candidates for targeted drug delivery, capable of precisely pinpointing disease sources with increased efficiency and minimal side effects to surrounding tissues.
"Nanoparticles hold promise as therapeutic agents for a number of diseases," explains co-senior author John V. Frangioni, MD, PhD, of the Division of Hematology/Oncology at BIDMC and Associate Professor of Medicine and of Radiology at Harvard Medical School (HMS), whose laboratory specializes in the development of imaging systems and contrast agents for molecular imaging. The anatomy of the lung, with its large surface area and minimal barriers limiting access to the body, makes this organ a particularly good target for nanoparticle drug delivery.
"We have been interested in the fate of small particles after they deposit deep in the gas exchange region of the lung," adds co-senior author Akira Tsuda, PhD, a research scientist in the Molecular and Integrative Physiological Sciences Program in the Department of Environmental Health at the Harvard School of Public Health. "Determining the influence of the physicochemical characteristics of inhaled nanoparticles on their ability to cross the [lungs'] alveolar epithelial surface is an important step in understanding the biological effects associated with exposure to these particles."
Previous work by Frangioni and first author Hak Soo Choi, PhD, an Instructor of Medicine at HMS, had established the characteristics of nanoparticles that regulate clearance from the body. "To be of value clinically, nanoparticles must be able to either biodegrade into biologically inert compounds, or be efficiently cleared from the body," says Choi, explaining that accumulation of nanoparticles can be toxic.
The aim of this new study was to determine the characteristics and parameters of inhaled nanoparticles that mediate their uptake into the body – from the external environment, across the alveolar lung surface, into the lymphatic system and blood stream, and eventually to other organs. To do this, the scientists made use of the FLARE™ (Fluorescence-Assisted Resection and Exploration) imaging system, systematically varying the chemical composition, size, shape and surface charge of a group of near-infrared fluorescent nanoparticles to compare the physicochemical properties of the various engineered particles. The investigators then tracked the movement of the varying nanoparticles in the lungs of rat models over a period of one hour, and also verified results using conventional radioactive tracers.
"The FLARE system enabled us to cut the number of experiments in half while performing direct comparisons of nanoparticles of different sizes, shapes and rigidities," explains Frangioni, whose laboratory developed the FLARE system for use in image-guided cancer surgery as well as other applications.
Their results established that non-positively charged nanoparticles smaller than 34 nm in diameter appeared in the lung-draining lymph nodes within 30 minutes. They also found that nanoparticles smaller than 6 nm in diameter with "zwitterionic" characteristics (equal positive and negative charge) traveled to the draining lymph nodes within just a few minutes, subsequently being cleared by the kidneys into urine.
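The reported size and charge thresholds can be condensed into a simple decision rule. The sketch below is an illustrative summary only, not the study's analysis code; the function name and return strings are invented for clarity:

```python
def predicted_fate(diameter_nm, charge):
    """Illustrative summary of the reported thresholds for inhaled
    nanoparticles (hypothetical helper, not the study's own code).

    charge: 'positive', 'negative', 'neutral', or 'zwitterionic'
    """
    if charge == "zwitterionic" and diameter_nm < 6:
        # Sub-6 nm zwitterionic particles reached draining lymph
        # nodes within minutes, then renal clearance into urine.
        return "lymph nodes within minutes, cleared by kidneys"
    if charge != "positive" and diameter_nm < 34:
        # Non-positively charged particles under 34 nm appeared in
        # lung-draining lymph nodes within 30 minutes.
        return "lymph nodes within ~30 minutes"
    return "retained longer in the lung (outside the fast pathways reported)"

print(predicted_fate(5, "zwitterionic"))
print(predicted_fate(20, "negative"))
```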
"These new findings can be applied to design and optimize particles for drug delivery by inhalation therapy," notes Tsuda. "This research also guides us in the assessment of the health effects of various particulate pollutants, as the data suggest the importance of distinguishing specific subclasses of particles [based on surface chemistry and size] that can rapidly cross the alveolar epithelium and may disseminate in the body."
Adds Frangioni, "This study complements our earlier work in which we defined the characteristics of nanoparticles that regulate efficient clearance from the body. With these new findings, which define the characteristics that regulate uptake into the body, we've now described a complete 'cycle' of nanoparticle trafficking – from the environment, through the lungs, into the body, then out of the kidneys in urine and back to the environment."

                                                                              *****************






 

Friday, November 5, 2010

National study shows CT screening of former, current smokers reduces lung cancer deaths by 20 percent


This study has profound significance. It shows that low-dose CT can identify lung cancers, and that annual screening helped to reduce lung cancer deaths by 20%.
                                                     
                                                                                                         K S Parthasarathy


WASHINGTON, DC – A large national study finds that screening current or former heavy smokers with a CT scan can reduce deaths from lung cancers by 20 percent. One potential reason for the reduction is that the scan can pick up tumors at an early stage. The study was conducted by the National Cancer Institute at 33 centers around the country including Georgetown Lombardi Comprehensive Cancer Center, a part of Georgetown University Medical Center.
The National Lung Screening Trial (NLST) involved more than 53,000 current and former heavy smokers ages 55 to 74. More than 1,800 men and women participated through Lombardi. The study compared the effects of two screening procedures for lung cancer -- low-dose helical computed tomography (CT) and standard chest X-ray.
Under Lombardi's leadership, 1,800 men and women were recruited into the clinical trial at Georgetown University Hospital as well as two other Georgetown community screening locations.
"Overall this study provides strong evidence that older patients who are at high risk of developing lung cancer could benefit from CT screening, and that's a significant finding," says Claudine Isaacs, MD, lead investigator of the NLST study at Lombardi. "We are grateful to all the men and women who participated in this important study. Clinical trials are critical to making progress in medicine."
"These results are very encouraging," says Louis Weiner, MD, director of Lombardi. "Studies like these generate so much excitement, but clearly there is much more work to be done. Lombardi and other NCI-cancer centers continue to explore effective ways to reduce lung cancer deaths including prevention efforts and by conducting clinical trials with the newest available cancer fighting drugs."
The NLST study began enrolling participants in August 2002. Participants were required to have a smoking history of at least 30 pack-years and were either current or former smokers without signs, symptoms, or history of lung cancer. Pack-years are calculated by multiplying the average number of packs of cigarettes smoked per day by the number of years a person has smoked.
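The pack-year definition above translates directly into code. A minimal sketch, useful for checking eligibility against the trial's 30 pack-year minimum (the function and constant names are mine, not NLST's):

```python
def pack_years(packs_per_day, years_smoked):
    """Smoking history in pack-years: average packs per day times years smoked."""
    return packs_per_day * years_smoked

# NLST eligibility required at least a 30 pack-year smoking history.
NLST_MINIMUM = 30

print(pack_years(1.5, 20))                  # 30.0 pack-years
print(pack_years(1.5, 20) >= NLST_MINIMUM)  # True
print(pack_years(0.5, 40) >= NLST_MINIMUM)  # False (20 pack-years)
```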
The men and women were randomly assigned to receive three annual screens with either low-dose helical CT (often referred to as spiral CT) or standard chest X-ray. Helical CT uses X-rays to obtain a multiple-image scan of the entire chest during a 7 to 15 second breath-hold. A standard chest X-ray requires only a sub-second breath-hold but produces a single image of the whole chest in which anatomic structures overlie one another. Previous efforts to demonstrate that standard chest X-ray examinations can reduce lung cancer mortality have been unsuccessful.
The trial participants received their screening tests at the time of enrollment and at the end of their first and second years on the trial. The participants were then followed for up to another five years; all deaths were documented, with special attention given to the verification of lung cancer as a cause of death. As of October 20, 2010, a total of 354 deaths from lung cancer had occurred among participants in the CT arm of the study, compared with a significantly larger 442 lung cancer deaths among those in the chest X-ray group. This represents a 20.3 percent reduction in lung cancer mortality in the CT arm relative to the X-ray arm.
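As a rough sanity check, the raw death counts can be compared directly. The crude ratio comes out slightly below the published 20.3 percent, which presumably reflects mortality rates adjusted for person-years of follow-up rather than simple counts:

```python
ct_deaths = 354     # lung cancer deaths, CT arm (as of Oct 20, 2010)
xray_deaths = 442   # lung cancer deaths, chest X-ray arm

# Crude relative reduction from the raw counts alone.
crude_reduction = 1 - ct_deaths / xray_deaths
print(f"{crude_reduction:.1%}")  # 19.9%, close to the reported 20.3%
```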
"Potentially, we could save thousands of lives with CT screening, but keep in mind that because smoking causes many lung cancers, we could save hundreds of thousands more if people wouldn't smoke or quit if they do," Isaacs points out.
"We're proud to be a part of this important study designed to answer critical questions," says Howard J. Federoff, MD, PhD, executive vice president for health sciences at GUMC and executive dean of its School of Medicine. "Lombardi's leadership role in the effort to reduce the burden of cancer has an impact at the national and local levels, and benefits our community directly."
"This large and well-designed study used rigorous scientific methods to test ways to prevent death from lung cancer by screening patients at especially high risk," said Harold Varmus, M.D., NCI Director, in a press release issued today. "Lung cancer is the leading cause of cancer mortality in the U.S. and throughout the world, so a validated approach that can reduce lung cancer mortality by even 20 percent has the potential to spare very significant numbers of people from the ravages of this disease."
The NCI notes that the possible disadvantages of helical CT include the cumulative effects of radiation from multiple CT scans; surgical and medical complications in patients who prove not to have lung cancer but who need additional testing to make that determination; and risks from additional diagnostic work-up for findings unrelated to potential lung cancer, such as liver or kidney disease. In addition, the screening process itself can generate suspicious findings that turn out not to be cancer in the vast majority of cases, producing significant anxiety and expense. These problems must, of course, be weighed against the advantage of a significant reduction in lung cancer mortality.
                                                                          ********



The NLST was sponsored by NCI, a part of the National Institutes of Health, and conducted by the American College of Radiology Imaging Network (ACRIN) and the Lung Screening Study group.

Wednesday, November 3, 2010

Beet juice promotes brain health in older adults

  Researchers have shown that daily doses of beetroot juice help to increase blood flow to the brain. They speculate that this increased blood flow could hold potential for combating the progression of dementia.
They carried out the study with 14 subjects and proposed a mechanism that helps to increase blood flow to the brain. The subjects drank 16 ounces of beetroot juice in the experiment. The researchers have approached a company to make the juice tastier and to market it as a beverage. This gives away their intention; the conflict of interest is obvious. They believe that if the drink is made tastier, more people will drink it. They can laugh all the way to the bank!
 

The research findings are available online in Nitric Oxide: Biology and Chemistry, the peer-reviewed journal of the Nitric Oxide Society and will be available in print soon.


"There have been several very high-profile studies showing that drinking beet juice can lower blood pressure, but we wanted to show that drinking beet juice also increases perfusion, or blood flow, to the brain," said Daniel Kim-Shapiro, director of Wake Forest University's Translational Science Center; Fostering Independence in Aging. "There are areas in the brain that become poorly perfused as you age, and that's believed to be associated with dementia and poor cognition."

High concentrations of nitrates are found in beets, as well as in celery, cabbage and other leafy green vegetables like spinach and some lettuce. When you eat high-nitrate foods, good bacteria in the mouth turn nitrate into nitrite. Research has found that nitrites can help open up the blood vessels in the body, increasing blood flow and oxygen specifically to places that are lacking oxygen.

In this study, the first to find a link between consumption of nitrate-rich beet juice and increased blood flow to the brain, Translational Science Center researchers looked at how dietary nitrates affected 14 adults age 70 and older over a period of four days.

On the first day, the study subjects reported to the lab after a 10-hour fast, completed a health status report, and consumed either a high- or low-nitrate breakfast. The high-nitrate breakfast included 16 ounces of beet juice. They were sent home with lunch, dinner and snacks conforming to their assigned diets.

The next day, following another 10-hour fast, the subjects returned to the lab, where they ate their assigned breakfasts. One hour after breakfast, an MRI recorded the blood flow in each subject's brain. Blood tests before and after breakfast confirmed nitrite levels in the body.
For the third and fourth days of the study, the researchers switched the diets and repeated the process for each subject.

The MRIs showed that after eating a high-nitrate diet, the older adults had increased blood flow to the white matter of the frontal lobes – the areas of the brain commonly associated with degeneration that leads to dementia and other cognitive conditions.

"I think these results are consistent and encouraging – that a good diet consisting of a lot of fruits and vegetables can contribute to overall good health," said Gary Miller, associate professor in the Department of Health and Exercise Science and one of the senior investigators on the project. The university is currently looking into ways of marketing the beverage.

The University has very noble intentions! The Center for Translational Science; Fostering Independence in Aging focuses on the promotion and maintenance of functional health as people age. Center researchers study how diet and exercise can change cognitive and physical function. The center's team involves medical staff, behavioral scientists and other scientists who develop research-based interventions to help both physical and cognitive health in aging populations.





 

Monday, October 25, 2010

Risk of cancer due to radiation exposure in middle age higher

An interesting analysis of A-bomb survivor data published in the Journal of the National Cancer Institute (25 October 2010) revealed that, contrary to common assumptions, the risk of cancer associated with radiation exposure in middle age may not be lower than the risk associated with exposure at younger ages. Children are more sensitive than adults to the effects of radiation and have a greater risk of developing radiation-induced cancer. Some data also suggest that, in general, the older a person is when exposed to radiation, the lower their risk of developing a radiation-induced cancer.

Recent analysis of the statistical evidence from long-term studies of atomic bomb survivors in Japan indicates that for radiation exposure after about age 30, the risk of developing radiation-induced cancer does not continue to decrease with increasing age at exposure.

David J. Brenner, Ph.D., D.Sc., at Columbia University in New York, and colleagues reanalyzed the Japanese atomic bomb survivor data assuming two different pathways through which radiation exposure can ultimately lead to cancer. The first is initiation of gene mutations that convert normal stem cells to premalignant cells that could eventually lead to cancer. The second is radiation induced promotion, or expansion, of the number of existing premalignant cells in the body. The initiation effect is more likely to play a role in children than in adults, they reason, because cells initiated at an early age have a longer time available to expand in number and progress on the pathway to cancer. The promotion effect, on the other hand, is more likely to be important for radiation exposures in middle age, because the adult body already contains larger numbers of premalignant cells.
The researchers developed a model based on these biological effects and applied the model to the Japanese atomic bomb survivor data. They found that the model was able to reproduce the cancer risk patterns associated with age at radiation exposure observed in these survivors. They then applied the same model to predict cancer risks as a function of age in the U.S. population and found that the cancer risks predicted by the model were consistent with the data in the age range from about 30 to 60.

The authors conclude that cancer risk after exposure in middle age may increase for some tumor types contrary to conventional wisdom. They add that these findings could have practical implications regarding x-ray diagnostic tests, which are predominantly performed on middle aged adults, as well as for occupations that involve radiation exposures, again where most exposures are in middle age.

"Overall, the weight of the epidemiological evidence suggests that for adult exposures, radiation risks do not generally decrease with increasing age at exposure," they write, "and the mechanistic underpinning described here provides this conclusion with some biological plausibility."

In an accompanying editorial, John D. Boice, Sc.D., of the International Epidemiology Institute, Rockville, Md., and Vanderbilt University, Nashville, cautioned that there are uncertainties in generalizing the Japanese data to a U.S. population. He also notes that other data and other models contradict the results of this study. However, he concludes that this biology-based model "raises provocative hypotheses and conclusions that, although preliminary, draw attention to the continued importance of low-dose radiation exposures in our society."

Dr Brenner often comes up with papers which may appear controversial, but they are based on a deep understanding of the biological mechanisms of cancer induction. In 2005, along with some of the most internationally eminent radiobiologists and epidemiologists, he reviewed the status of low-dose radiation risk. The paper was stoutly criticized by many because of its over-dependence on the linear no-threshold (LNT) concept.

Though not ideal, the A-bomb survivor data continue to dominate the study of radiation risk.
###

The Journal of the National Cancer Institute is published by Oxford University Press and is not affiliated with the National Cancer Institute.


Saturday, October 9, 2010

Viral and fungal infections kill bees by the billions

Some people have attributed the collapse of bee colonies by the billions to mobile phone radiation. None of these observations was based on any scientific study. These "researchers" produced sensational reports and published them mostly in daily newspapers. Colony collapse disorder has been seen in many countries. Such collapses can have a devastating impact on food production, as bees are efficient pollinators.
 
The sudden death of bee colonies since late 2006 across North America has stumped scientists. But today, researchers may have a greater understanding of the mysterious colony collapse disorder, said a Texas Tech University biologist.

Shan Bilimoria, a professor and molecular virologist, said the bees may be taking a one-two punch from both an insect virus and a fungus, which may be causing bees to die off by the billions.

Bilimoria is part of a team of researchers searching for the cause of the collapse. Led by research professor Jerry Bromenshenk from the University of Montana in Missoula, the group also includes virologists and chemists from the U.S. Army Edgewood Chemical Biological Center and the Instituto de Ecologica AC in Mexico.

Their study was published this week in the peer-reviewed journal PLoS ONE.

"At this stage, the study is showing an association of death rates of the bees with the virus and fungus present," Bilimoria said. "Our contribution to this study confirms association. But even that doesn't prove cause and effect. Not just yet."

The mysterious colony deaths have caused major concern with scientists since much of agriculture depends on bees to pollinate crops.

To discover what might be attacking bee colonies, the team ground up dead bees that had succumbed to colony collapse disorder. Using analytical equipment, researchers discovered through spectroscopic analysis evidence of a moth virus called insect iridescent virus (IIV) 6 and a fungal parasite called Nosema.
The insect virus is closely related to another virus that wiped out bee populations 20 years ago in India, he said. Also, unlike previous research that found the deaths may be caused by a virus with RNA, the IIV 6 contains DNA.

"Our DNA discovery puts this field in a whole new direction," he said.

Bilimoria said Texas Tech supplied the virus material for the experiments, in which the virus and the fungus were tested on bees. Though an association between exposure and death was found, scientists don't yet know if the two pathogens cause CCD or whether CCD colonies are simply more likely to succumb to the two pathogens.
"To prove cause and effect, we will have to isolate the virus and fungus from the bee colony, and then reinfect with the same virus and fungus," Bilimoria said.
In the next part of the research project, Bilimoria will work to isolate the virus from infected bees.
"Once we isolate and identify the virus, we will have a way of monitoring it," he said. "It is easier to fight the problem if we know what the culprit is."
###

Friday, October 8, 2010

Transgenic corn helps all farmers; saves billions in the USA

In India, the introduction of transgenic crops is mired in controversy. Popular apprehensions, and now plagiarism, probably come in the way of exploiting the technology. The story coming from the USA will be of interest to the discerning Indian farmer. Last year, 63% of the corn cultivated in the USA was transgenic. Recently, non-Bt corn farmers too have received benefits.



[Image: Transgenic corn's suppression of the European corn borer has saved Midwest farmers billions of dollars in the past decade.]
Transgenic corn's suppression of the European corn borer has saved Midwest farmers billions of dollars in the past decade, reports a new study in the October 8 edition of Science.
Studies carried out at several Midwest universities show that suppression of this pest has saved $3.2 billion for corn growers in Illinois, Minnesota, and Wisconsin over the past 14 years, with more than $2.4 billion of this total benefiting non-Bt corn growers. Comparable estimates for Iowa and Nebraska are $3.6 billion in total, with $1.9 billion accruing for non-Bt corn growers.
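The split of the reported savings between Bt and non-Bt growers can be checked from the figures above (a quick arithmetic sketch using only the article's numbers):

```python
# Reported cumulative savings from areawide corn borer suppression.
regions = {
    "Illinois/Minnesota/Wisconsin": {"total": 3.2e9, "non_bt": 2.4e9},
    "Iowa/Nebraska":                {"total": 3.6e9, "non_bt": 1.9e9},
}

for name, s in regions.items():
    share = s["non_bt"] / s["total"]
    # Prints 75% for IL/MN/WI and 53% for IA/NE.
    print(f"{name}: {share:.0%} of savings went to non-Bt growers")
```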
Transgenic corn is engineered to express insecticidal proteins from the bacterium Bacillus thuringiensis (Bt). Bt corn has become widely adopted in U.S. agriculture since its commercialization in 1996.

The truly secular behaviour of  female corn borer moths hurts them incalculably. They can't distinguish between Bt and non-Bt corn, so females lay eggs in both types of fields. Once eggs hatch in Bt corn, young borer larvae feed and die within 24 to 48 hours.
The major benefit of planting Bt corn is reduced yield losses, and Bt acres received this benefit after the growers paid Bt corn technology fees. But as a result of areawide pest suppression, non-Bt acres also experienced yield savings free, without the cost of Bt technology fees, and thus received more than half of the benefits from growing Bt corn in the region.
"We've assumed for some time that economic benefits were accruing, even among producers who opted not to plant Bt hybrids," said co-author of the study Mike Gray, University of Illinois Extension entomologist and professor in the Department of Crop Sciences. "However, once quantified, the magnitude of this benefit was even more impressive."
Over the past several years, entomologists and corn producers have noticed very low densities of European corn borers in Illinois. In fact, Illinois densities have reached historic lows to the point where many are questioning its pest status, Gray said.
"Since the introduction of Bt corn, initially targeted primarily at the European corn borer, many entomologists and ecologists have wondered if population suppression over a large area would eventually occur," Gray said. "As this research shows, area wide suppression has occurred and dramatically reduced the estimated $1 billion in annual losses caused previously by the European corn borer."
This information also provides incentives for growers to plant non-Bt corn in addition to Bt corn.
"Sustained economic and environmental benefits of this technology will depend on continued stewardship by producers to maintain non-Bt maize refuges to minimize the risk of evolution of Bt resistance in crop pest species," Gray said.

Cheek swab may detect lung cancer


 
In a clinical trial, the technique appears to detect lung cancer far afield from a tumor
[Image: Nano-scale disturbances in cheek cells indicate the presence of lung cancer.]

Early detection is critical for improving cancer survival rates. Yet, lung cancer, one of the deadliest cancers in the United States, is notoriously difficult to detect in its early stages.
Now, US National Science Foundation-supported researchers have developed a method to detect lung cancer by merely shining diffuse light on cells swabbed from patients' cheeks.
Recently, in a new clinical study, the analysis technique--called partial wave spectroscopic (PWS) microscopy--was able to differentiate individuals with lung cancer from those without, even if the non-cancerous patients had been lifetime smokers or suffered from chronic obstructive pulmonary disease (COPD).
The findings – released by a team of engineers and physicians from NorthShore University Health System, Northwestern University and New York University – appear in print in the Oct. 15, 2010, issue of the journal Cancer Research.
"This study is important because it provides the proof of concept that a minimally intrusive, risk-stratification technique may allow us to tailor screening for lung cancer, the leading cause of cancer deaths in Americans," said physician and researcher Hemant Roy of NorthShore University HealthSystems and the University of Chicago, the lead author on the paper. "This represents a major step forward in translating biomedical optics breakthroughs for personalized screening for lung cancer."
The recent results are an extension of several successful trials involving the light-scattering analysis technique, including early detection successes with pancreatic cancer and colon cancer. NSF has supported the team's work since 2002, with an early grant to Roy's collaborator and co-author, bioengineer Vadim Backman of Northwestern University.
"Their work has now transitioned to a larger $2 million Emerging Frontiers in Research and Innovation award," said Leon Esterowitz, a biophotonics expert and program director at NSF who has long supported the research. "The results have even larger implications in that the techniques and the 'field effect' may be a general phenomena that could be applied to a multitude of epithelial cancers, the most common cancer type."
The continuing clinical and laboratory experiments involving the PWS light-scattering technique, and its predecessor technologies, four-dimensional elastic light scattering fingerprinting (4D-ELF) and low-coherence enhanced backscattering spectroscopy (LEBS), are revealing new information about the changes cells undergo when cancer emerges somewhere in the body.
Within affected cells, including otherwise healthy cells far from an actual tumor, the molecules in the nucleus and cellular skeleton appear to change. On the scale of roughly 200 nanometers or less, even to the scale of molecules, an affected cell's structure becomes so distorted that light scatters through the cell in a telling way.
The ability of cancer to cause changes in distant, healthy tissue is called the "field effect" or "field of injury" effect, and is the physical mechanism that allows cells in the cheek to reveal changes triggered by a tumor far off in a patient's lung.
"Microscopic histology and cytology have been a staple of clinical diagnostics detecting micro-scale alterations in cell structure," added Backman. "However, the resolution of conventional microscopy is limited. PWS-based nanocytology, on the other hand, detects cellular alterations at the nanoscale in otherwise microscopically normal-appearing cells."
"What is intriguing is that the very same nanoscale alterations seem to develop early in very different types of cancer including lung, colon and pancreatic cancers," Backman continued. "Not only does this suggest that nanocytology has the potential to become a general platform for cancer screening, but also that these nanoscale alterations are a ubiquitous event in early carcinogenesis with critical consequences for cell function. Elucidating the mechanisms of these alterations will help us understand the initial stages of carcinogenesis and improve screening."
###


 

Tuesday, October 5, 2010

X-rays linked to increased childhood leukemia risk

 

October 4, 2010

Berkeley – Diagnostic X-rays may increase the risk of developing childhood leukemia, according to a new study by researchers at the University of California, Berkeley's School of Public Health. Specifically, the researchers found that children with acute lymphoid leukemia (ALL) had almost twice the chance of having been exposed to three or more X-rays compared with children who did not have leukemia. For B-cell ALL, even one X-ray was enough to moderately increase the risk. The results differed slightly by the region of the body imaged, with a modest increase associated with chest X-rays.

The new findings, published in the October 2010 issue of the International Journal of Epidemiology, come from the Northern California Childhood Leukemia Study, a population-based case-control study that includes 35 counties in the northern and central regions of the state.
While the relationship between high doses of radiation and cancer is well known, significant debate still surrounds the health impacts from the low doses of radiation typical of conventional X-rays, or radiographs.

The dose of ionizing radiation from a single chest X-ray is roughly equivalent to the amount one would get from natural background radiation in 10 days, which is still considered low.
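The 10-day comparison follows directly from typical dose figures. The specific values below are common reference estimates, not numbers taken from the study, so this is only a back-of-envelope sketch:

```python
# Back-of-envelope check of the "10 days of background radiation" comparison.
# Assumed values (typical reference figures, not taken from the study):
chest_xray_dose_msv = 0.1     # effective dose of one chest X-ray, in millisieverts
annual_background_msv = 3.65  # natural background radiation per year, in millisieverts

daily_background_msv = annual_background_msv / 365
equivalent_days = chest_xray_dose_msv / daily_background_msv  # ≈ 10 days
```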

"The general clinical impression has been that the level of radiation a child would be exposed to today from a conventional X-ray would not confer an additional risk for cancer," said Patricia Buffler, UC Berkeley professor of epidemiology and principal investigator of the Northern California Childhood Leukemia Study. "The results of our study were not what we expected."

Leukemia is a cancer of the white blood cells, the soldiers in the body's immune system responsible for detecting and destroying disease-causing agents. According to the American Cancer Society, it is the most common childhood cancer, accounting for nearly a third of all cancers among children younger than 15 years old.
Nearly all cases of childhood leukemia are acute, with 80 percent being acute lymphoid leukemia (ALL), characterized by the overproduction of abnormal B- or T-cell lymphocytes, and 20 percent being acute myeloid leukemia (AML), in which granulocytes are overproduced.

The study included 827 children up to age 15 diagnosed with either ALL or AML. The children with leukemia were each compared with other children randomly selected from the California birth registry who were matched by factors such as age, gender, ethnicity and maternal race.

Interviews were conducted with mothers within four months of the diagnosis of leukemia, and the mothers were asked to report on the number of X-rays received by the child at least 12 months or more before the leukemia diagnosis. Mothers were also asked about their exposures to X-rays during pregnancy and the year prior to pregnancy.

The researchers noted that dental X-rays were not considered because they are so common and deliver such a low dose of radiation that exposure to those radiographs would not discriminate between individuals with high and low levels of radiation exposure.

The study found an increased risk from X-rays for ALL, but not for AML or T-cell leukemia, and there was no association with age at first exposure. Furthermore, there was no increased risk associated with prenatal exposure to X-rays or maternal X-rays occurring before pregnancy, although these exposures were uncommon in this study population.

The study authors emphasized that health care providers are already cautious in their use of X-rays in children, and use them only when necessary to diagnose potential problems such as respiratory illnesses, broken bones and fractures.

"X-rays are a valuable tool, and our findings indicate that their use should continue to be judicious," said Karen Bartley, doctoral student in epidemiology and first author of the study. "Of greater concern, perhaps, is the use of newer imaging technologies, which are becoming more common and which produce far higher doses of radiation."

Computed tomography (CT) scans, for instance, produce a 3-D image by compiling multiple "slices" of 2-D images taken as the scanner moves along. A 2009 study from the National Cancer Institute projected that the 72 million CT scans received by Americans in 2007 would lead to 29,000 excess cancers. The number of scans in the United States has increased over recent decades, from 3 million a year in 1980 to more than 70 million a year today.
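Dividing the projected excess cancers by the number of scans gives the average per-scan risk implied by the NCI projection (a rough population average; actual risk varies with scan type, dose and patient age):

```python
# Implied average per-scan risk from the NCI projection cited above.
ct_scans_2007 = 72_000_000        # CT scans received by Americans in 2007
projected_excess_cancers = 29_000  # excess cancers projected from those scans

risk_per_scan = projected_excess_cancers / ct_scans_2007
one_in_n = round(1 / risk_per_scan)  # about 1 excess cancer per ~2,500 scans
```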

"The findings about increased leukemia risk certainly warrant further investigation," said UC San Francisco radiologist Dr. Rebecca Smith-Bindman, who was not part of the X-ray study. "If even plain film X-rays are associated with an increased risk of leukemia, then one has to wonder about CT scans, some of which can generate 500 times the dose of radiation of an X-ray."

Dr. Smith-Bindman is in the process of characterizing the ionizing radiation exposure to children from CT scans as part of a study funded by the National Cancer Institute. She noted that two-thirds of the imaging procedures children undergo are conventional X-rays, accounting for about 20 percent of their exposure to radiation from medical imaging. In contrast, CT scans make up only 10 percent of the medical imaging tests children undergo, but they account for two-thirds of their ionizing radiation dose.

"The bottom line is we have to be very cautious about the use of any medical imaging techniques," said Dr. Smith-Bindman. "They can be enormously helpful for making accurate diagnoses, but tests that deliver ionizing radiation are associated with small – but real – risks of future complications related to the radiation exposure, and thus they should be used judiciously."

This study suffers from deficiencies such as recall bias; in effect, it is a repeat of the Oxford Survey of Childhood Cancers.
###
Other co-authors of the paper are Dr. Catherine Metayer and Steve Selvin from the UC Berkeley School of Public Health, and Dr. Jonathan Ducore from the UC Davis Department of Pediatrics.
This research is supported by the National Institute of Environmental Health Sciences.


Progress towards lead-free electronics

 

A step toward lead-free electronics

Lead is present everywhere, in many products including inkjet printers, digital cameras, ultrasound scanners and diesel fuel injectors. It was thought to be irreplaceable. Not any more: materials engineers from the University of Leeds may have paved the way to 100% lead-free electronics.

The work, carried out at the UK's synchrotron facility, Diamond Light Source, reveals the potential of a new manmade material to replace lead-based ceramics in countless electronic devices.

European regulations now bar the use of most lead-containing materials in electronic and electrical devices. Ceramic crystals known as 'piezoelectrics' are currently exempt from these regulations but this may change in the future, owing to growing concerns over the disposal of lead-based materials.

Piezoelectric materials generate an electrical field when pressure is applied, and vice-versa. In gas igniters on ovens and fires, for example, piezoelectric crystals produce a high voltage when they are hit with a spring-loaded hammer, generating a spark across a small gap that lights the fuel.
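The igniter example can be put in rough numbers using the basic piezoelectric relation Q = d·F. Every value below is an assumed, typical figure for a PZT element, not a measurement from the Leeds study:

```python
# Illustrative numbers for a piezoelectric gas igniter; all values are assumed,
# typical figures for PZT, not measurements from the Leeds study.
d33 = 500e-12           # piezoelectric charge coefficient, coulombs per newton
force_n = 1000.0        # impact force from the spring-loaded hammer, newtons
capacitance_f = 50e-12  # capacitance of the ceramic element, farads

charge_c = d33 * force_n              # generated charge: Q = d33 * F
voltage_v = charge_c / capacitance_f  # open-circuit voltage: V = Q / C (~10 kV)
```

A voltage on the order of 10 kV across a millimetre-scale gap is ample to produce the spark that lights the fuel.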

The most common piezoelectric material is a ceramic crystal called lead zirconium titanate, or PZT.
Using a high intensity X-ray beam at the Diamond Light Source, the University of Leeds researchers have now shown that a simple, lead-free ceramic could potentially do the same job as PZT.

"With the 'Extreme Conditions' beamline at Diamond we were able to probe the interior of the lead-free ceramic, potassium sodium bismuth titanate (KNBT), to learn more about its piezoelectric properties. We could see the changes in crystal structure actually happening while we applied the electric field," said Tim Comyn, lead investigator on the project.

"PZT is the best material for the job at the moment, because it has the greatest piezoelectric effect, good physical durability, and can be radically tailored to suit particular applications," said Adam Royles, PhD student on the project. "The lead-free ceramic that we have been studying is lightweight and can be used at room temperature. This could make it an ideal choice for many applications."

In the medical field, PZT is used in ultrasound transducers, where it generates sound waves and sends the echoes to a computer to convert into a picture. Piezoelectric ceramics also hold great potential for efficient energy harvesting, a possible solution for a clean sustainable energy source in the future.

The Leeds team will continue to work at Diamond to study the transformation induced by an electric field at high speed (1000 times per second) and under various conditions using state of the art detectors.
The results of the work are published online in the journal Applied Physics Letters.


                                                                            ********************






 

Sunday, October 3, 2010

Stop wasting food, achieve huge energy savings

WASHINGTON, Oct. 2, 2010

Scientists at the University of Texas at Austin have shown that the US can achieve huge energy savings simply by not wasting food. They have identified a way for the United States to immediately save the energy equivalent of about 350 million barrels of oil a year, without spending a penny or putting a ding in the quality of life: just stop wasting food.

Reporting in the American Chemical Society's (ACS's) semi-monthly journal Environmental Science & Technology, they found that it takes the equivalent of about 1.4 billion barrels of oil to produce, package, prepare, preserve and distribute a year's worth of food in the United States.
Michael Webber and Amanda Cuéllar note that food contains energy and requires energy to produce, process, and transport. Estimates indicate that between 8 and 16 percent of energy consumption in the United States went toward food production in 2007. Despite this large energy investment, the U.S. Department of Agriculture estimates that people in the U.S. waste about 27 percent of their food. The scientists realized that the waste might represent a largely unrecognized opportunity to conserve energy and help control global warming.
Their analysis of wasted food and the energy needed to ready it for consumption concluded that the U.S. wasted about 2030 trillion BTU of energy in 2007, or the equivalent of about 350 million barrels of oil. That represents about 2 percent of annual energy consumption in the U.S. "Consequently, the energy embedded in wasted food represents a substantial target for decreasing energy consumption in the U.S.," the article notes. "The wasted energy calculated here is a conservative estimate both because the food waste data are incomplete and outdated and the energy consumption data for food service and sales are incomplete."
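The article's three figures (2030 trillion BTU wasted, 350 million barrels of oil, 2 percent of consumption) are mutually consistent under standard conversion factors. The barrel energy content and the 2007 US consumption total below are conventional assumed values, not numbers taken from the paper:

```python
# Cross-checking the article's unit conversions with standard factors.
wasted_energy_btu = 2030e12    # 2030 trillion BTU wasted in food (2007), from the article
btu_per_barrel_oil = 5.8e6     # assumed energy content of one barrel of crude oil
us_annual_energy_btu = 101e15  # assumed approx. total US energy consumption, 2007

barrels_equivalent = wasted_energy_btu / btu_per_barrel_oil      # ≈ 350 million barrels
share_of_consumption = wasted_energy_btu / us_annual_energy_btu  # ≈ 2 percent
```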
Percentage of Various Foods Wasted in the U.S.
Fats and oils: 33%
Dairy: 32%
Grains: 32%
Eggs: 31%
Sugar and other caloric sweeteners: 31%
Vegetables: 25%
Fruit: 23%
Meat, poultry, fish: 16%
Dry beans, peas, lentils: 16%
Tree nuts and peanuts: 16%

###
DOWNLOAD FULL TEXT ARTICLE: http://pubs.acs.org/stoken/presspac/presspac/abs/10.1021/es100310d
The American Chemical Society is a non-profit organization chartered by the U.S. Congress. With more than 161,000 members, ACS is the world's largest scientific society and a global leader in providing access to chemistry-related research through its multiple databases, peer-reviewed journals and scientific conferences. Its main offices are in Washington, D.C., and Columbus, Ohio.


  

 

Tuesday, September 28, 2010

Medical imaging may detect unrelated diseases in research participants

  Public release date: 27-Sep-2010

JAMA and Archives Journals

In about 40 percent of research participants undergoing medical imaging, radiologists may detect a tumor or infection unrelated to the study but that may be meaningful to the individual's health, according to a report in the September 27 issue of Archives of Internal Medicine, one of the JAMA/Archives journals.
"An incidental finding in human subjects research is defined in a major consensus project as an observation 'concerning an individual research participant that has potential clinical importance and is discovered in the course of conducting research, but is beyond the aims of the study,'" the authors write as background information in the article. "Numerous reports have detailed how the detection of an incidental finding can result in the early beneficial diagnosis of an unsuspected malignant neoplasm or aneurysm. However, others describe harm and excessive cost resulting from treatment of radiographically suspicious incidental findings. Moreover, clinical experience dictates that many incidental findings are of indeterminate clinical significance and generate uncertainty among both research participants and their physicians."
Nicholas M. Orme, M.D., of Mayo Clinic, Rochester, Minn., and colleagues evaluated the medical records of 1,426 research participants who underwent an imaging procedure related to a study conducted in 2004. Each image was interpreted by a radiologist the day it was performed, and an expert panel reviewed all incidental findings that resulted in a clinical action during a three-year follow-up period.
Of the 1,426 research imaging examinations, an incidental finding occurred in 567 (39.8 percent). The risk of an incidental finding increased with age. More incidental findings were generated in patients undergoing computed tomography (CT) scans of the abdomen and pelvic area than in any other imaging procedure, followed by CT of the chest and magnetic resonance imaging (MRI) of the head.
Clinical action was taken for 35 (6.2 percent) of the individuals with an incidental finding; in most cases (26 of 567, or 4.6 percent) the medical benefit or burden of these actions was unclear. However, action resulted in clear medical benefit for six of the 567 patients (1.1 percent) and clear medical burden in three patients (0.5 percent).
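The reported percentages can be reproduced directly from the raw counts given in the study:

```python
# Reproducing the percentages reported in the Archives of Internal Medicine study
# from its raw counts (all counts are taken from the article above).
total_exams = 1426    # research imaging examinations
incidental = 567      # exams with an incidental finding
clinical_action = 35  # incidental findings that led to clinical action
unclear = 26          # actions with unclear medical benefit or burden
benefit = 6           # actions with clear medical benefit
burden = 3            # actions with clear medical burden

def pct(part, whole):
    """Percentage of `whole` represented by `part`, to one decimal place."""
    return round(100 * part / whole, 1)

assert pct(incidental, total_exams) == 39.8
assert pct(clinical_action, incidental) == 6.2
assert pct(unclear, incidental) == 4.6
assert pct(benefit, incidental) == 1.1
assert pct(burden, incidental) == 0.5
```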
"This study demonstrates that research imaging incidental findings are common in certain types of imaging examinations, potentially offering an early opportunity to diagnose asymptomatic life-threatening disease, as well as a potential invitation to invasive, costly and ultimately unnecessary interventions for benign processes," the authors write. "The majority of incidental findings seem to be of unclear significance. These instances represent a dilemma for researchers."
The findings should help researchers identify imaging studies at high risk for generating incidental findings and develop a plan for managing them, the authors conclude. "Timely, routine evaluation of research images by radiologists can result in identification of incidental findings in a substantial number of cases that can result in significant medical benefit to a small number of patients," they write.
(Arch Intern Med. 2010;170[17]:1525-1532. Available pre-embargo to the media at www.jamamedia.org.)
Editor's Note: Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.
Editorial: Researchers Must Plan for Incidental Findings

"In clinical research, investigators may learn information about a participant that is not pertinent to the research question but that may have important clinical implications for the participant," writes Bernard Lo, M.D., of the University of California, San Francisco, in an accompanying editorial. "Such incidental findings present ethical dilemmas. Although they may offer the possibility of substantial personal benefit to the participant, more commonly they are false-positive findings that lead to a cascade of testing that presents additional risks and burdens."
"When planning a study, researchers need to develop a comprehensive plan for how they will respond to incidental findings, including what findings they will offer to disclose to participants," Dr. Lo continues. "The possibility of incidental findings, and their ramifications, should be part of the informed consent process. The work by Orme et al helps us to start to quantify their impact."
(Arch Intern Med. 2010;170[17]:1522-1524.)

###



 

 

Saturday, September 25, 2010

Scientists release first cultivated ohelo berry for Hawaii

What the US Department of Agriculture and its university affiliates did may be duplicated elsewhere in other countries.

The first cultivar of 'ōhelo berry, a popular native Hawaiian fruit, has been released by U.S. Department of Agriculture (USDA) scientists and their university and industry cooperators.
'Ōhelo (Vaccinium reticulatum Smith) is a small, native Hawaiian shrub in the cranberry family, commonly found at high elevations on the islands of Maui and Hawaii. As people scour the landscape to harvest this delectable berry for use in jam, jelly and pie filling, they unfortunately disrupt the fragile habitats where this plant grows.
In an effort to reduce damage to the environment and meet consumer demands, horticulturist Francis T.P. Zee, with the USDA Agricultural Research Service (ARS) Pacific Basin Agricultural Research Center (PBARC) in Hilo, Hawaii, is evaluating 'ōhelo for small farm production and ornamental use. Zee collaborated with fellow ARS scientists and cooperators at the University of Hawaii at Manoa, Big Island Candies and the Big Island Association of Nurserymen. ARS is the principal intramural scientific research agency of USDA.
Zee and his team selected the offspring of seed-grown plants to create the new cultivar "Kilauea" for berry production. They found 'ōhelo's tiny seeds readily germinated under 20-30 percent shade in well-watered and well-drained potting mixture. Plant hardiness and vigor improved with age, and some seedlings flowered just 10 months after germination, much sooner than the 5 years reported in previous studies. The 16-month-old plants Zee successfully transplanted from the greenhouse to the field produced berries a year later.
Zee also used cuttings and tissue culture to propagate selected 'ōhelo of high ornamental potential. With proper care, young, growing shoots of 'ōhelo can be groomed into vibrant, colorful ornamental potted plants. Since the plant is not seasonal, its readiness for market can be scheduled by trimming and fertilizing. Older potted 'ōhelo plants can be trained into a bonsai and can readily adapt to the office environment.
Zee and PBARC scientists are currently examining the disease and insect problems associated with growing potted 'ōhelo. Full descriptions of Zee's 'ōhelo studies can be found on the University of Hawaii's College of Tropical Agriculture and Human Resources' (CTAHR) website.

                                            -------xxxx-------









 


Thursday, September 23, 2010

'Dry water' could make a big splash commercially


IMAGE: Powdered material called "dry water" could provide a new way to store carbon dioxide in an effort to fight global warming.

An unusual substance known as "dry water," which resembles powdered sugar, could provide a new way to absorb and store carbon dioxide, the major greenhouse gas that contributes to global warming, scientists reported at the 240th National Meeting of the American Chemical Society. The powder shows bright promise for a number of other uses:

*It may, for instance, be a greener, more energy-efficient way of jump-starting the chemical reactions used to make hundreds of consumer products.

*Dry water also could provide a safer way to store and transport potentially harmful industrial materials.

Ben Carter, Ph.D., explained that the substance became known as "dry water" because it consists of 95 percent water and yet is a dry powder. Dry water was discovered in 1968 and got attention for its potential use in cosmetics. Scientists at the University of Hull, U.K., rediscovered it in 2006 in order to study its structure, and Cooper's group at the University of Liverpool has since expanded its range of potential applications. One of the most recent involves using dry water as a storage material for gases, including carbon dioxide. In laboratory-scale research, Cooper and co-workers found that dry water absorbed over three times as much carbon dioxide as ordinary, uncombined water and silica in the same amount of time.

In another potential new application, the scientists also showed that dry water is a promising means to speed up catalyzed reactions between hydrogen gas and maleic acid to produce succinic acid, a feedstock or raw material widely used to make drugs, food ingredients, and other consumer products. "There's nothing else quite like it," said Ben Carter, Ph.D., researcher for study leader Professor Andrew Cooper. "Hopefully, we may see 'dry water' making waves in the future."