Saturday, September 10, 2011

Researchers identify genetic mutations associated with diseases of the esophagus

ScienceDaily (July 26, 2011) — Mutations in three genes have been identified that are more prevalent in patients with esophageal cancer and Barrett esophagus, a premalignant metaplasia (change in cells or tissue) caused by chronic gastroesophageal reflux disease (GERD), according to preliminary research reported in the July 27 issue of JAMA.

The incidence of esophageal adenocarcinoma (EAC) in the United States and Europe has increased 350 percent since 1970, with the cause uncertain. Esophageal adenocarcinoma is believed to be preceded by Barrett esophagus (BE), according to background information in the article. Barrett esophagus is common, estimated to occur in 1 percent to 10 percent of the general population. "Finding predisposition genes may improve premorbid risk assessment, genetic counseling, and management," the authors write.

Charis Eng, M.D., Ph.D., of the Cleveland Clinic, and colleagues conducted a study to identify a gene or genes associated with BE/EAC predisposition. The research included an analysis of 21 concordant (both-affected) sibling pairs with BE/EAC and 11 discordant sibling pairs (2005-2006). The study also included data from 176 white patients with BE/EAC and 200 ancestry-matched controls (2007-2010). Data from 19 BE/EAC tissues yielded 12 "priority" candidate genes for mutation analysis. Genes that showed mutations in cases but not in controls were further screened in 58 cases.

Analyses indicated that three major genes, MSR1, ASCC1, and CTHRC1, were associated with BE/EAC. Mutational analyses of the 12 priority candidate genes in BE/EAC cases found mutations in these three genes in 13 of 116 patients (11.2 percent), with the most frequently mutated being MSR1 (approximately 7 percent), followed by ASCC1 and CTHRC1.

How the modular structure of proteins permits evolution to move forward

ScienceDaily (July 26, 2011) — Changes in a short protein domain can alter a whole signaling network involved in organ development -- this is the key result of a comparative study of the development of the egg-laying organ in two species of nematodes. However, the outward appearance of the organ remains the same in both species. The study provides support for the theory of developmental systems drift -- a theory maintaining that, over the course of evolution, analogous organs of different species can retain the same shape and function while the regulative mechanisms underlying their development can change considerably.

The new results, published July 26 in the online, open-access journal PLoS Biology, raise the question of whether the modular structure of proteins creates space for evolutionary development, even in otherwise highly conserved structures of organs and signaling pathways.

The nematode Caenorhabditis elegans (C. elegans) is a model organism of genetics. The worm is only about one millimeter long, and its genome has been completely sequenced, so scientists can trace the fate of every one of its 959 cells. In research lasting more than a decade, Ralf Sommer, Director of the Department for Evolutionary Biology at the Max Planck Institute for Developmental Biology in Tübingen, Germany, has established a second nematode, Pristionchus pacificus (P. pacificus), as a comparative model organism. At first sight, this species resembles C. elegans, but it belongs to another family. The last common ancestor of the two species lived 250 to 420 million years ago, well before the zenith of the dinosaurs.

Scientists induce hibernation at will: Discovery puts scientists closer to human application

ScienceDaily (July 26, 2011) — Hibernation is an essential survival strategy for some animals and scientists have long thought it could also hold promise for human survival. But how hibernation works is largely unknown. Scientists at the University of Alaska Fairbanks have successfully induced hibernation at will, showing how the process is initiated. Their research is published in the July 26 issue of The Journal of Neuroscience.

A hibernating animal has a reduced heart rate and blood flow similar to a person in cardiac arrest, yet the hibernator doesn't suffer the brain damage that can occur in people. "Understanding the neuroprotective qualities of hibernating animals may lead to development of a drug or therapy to save people's lives after a stroke or heart attack," said Kelly Drew, senior author and UAF professor of chemistry and biochemistry in the Institute of Arctic Biology.

Hibernating animals survive by severely reducing their metabolism, a condition called torpor, in which oxygen consumption can fall to as low as one percent of resting metabolic rate and core body temperature can drop to near or below freezing.

Arctic ground squirrels, like all animals and people, produce a molecule called adenosine that slows nerve cell activity. "When a squirrel begins to hibernate and when you feel drowsy it's because adenosine molecules have attached themselves to receptors in your brain," said Tulasi Jinka, lead author and IAB post-doctoral fellow in Drew's lab.

The receptors can be regulated by a simple cup of coffee. A caffeine molecule is similar enough in structure to adenosine that it binds to the receptors and effectively stops or reverses the onset of drowsiness. Jinka and Drew wanted to know what substances trip the squirrels' switch to start to hibernate.

"We devised an experiment in which non-hibernating arctic ground squirrels were given a substance that stimulated adenosine receptors in their brains. We expected the substance to induce hibernation," Drew said. "We also gave a substance similar to caffeine to arouse hibernating ground squirrels."

The non-hibernating squirrels were tested three times during one year: during the summer when they were not hibernating, again early in their hibernation season, and a third time midway through the hibernation season. If animals were hibernating before the test, Jinka woke them up to see if the substance would cause them to go back into hibernation. To ensure that his expectations did not influence the results, he delivered a placebo in the same manner as the drug and did not know which solution contained the active substance when he conducted the experiments.

Torpor was induced in all six of the squirrels awoken during mid-hibernation season, but in only two of the six from the early hibernation season group and in none during the summer season. The caffeine-like substance reversed torpor in all of the hibernating squirrels.

"We show for the first time that activation of the adenosine receptors is sufficient to induce torpor in arctic ground squirrels during their hibernation season," Jinka said, who conducted this experiment while he was a graduate student.

What Jinka and Drew don't yet know is what seasonal changes cause the receptors to become increasingly sensitive to adenosine as the hibernation season progresses.

Jinka and Drew are expanding their adenosine research to rats, whose physiology more closely resembles that of humans. "Rats allow us to move toward being able to apply this research to humans," Jinka said.


Friday, September 9, 2011

To help doctors and patients, researchers are developing a 'vocabulary of pain'

ScienceDaily (July 26, 2011) — All over the world, patients with chronic pain struggle to express how they feel to the doctors and health-care providers who are trying to understand and treat them.

Now, a University at Buffalo psychiatrist is attempting to help patients suffering from chronic pain and their doctors by drawing on ontology, the branch of philosophy concerned with the nature of being or existence.

The research will be discussed during a tutorial he will give at the International Conference on Biomedical Ontology, sponsored by UB, which will be held in Buffalo July 26-30.

"Pain research is very difficult because nothing allows the physician to see the patient's pain directly," says Werner Ceusters, MD, professor of psychiatry in UB's School of Medicine and Biomedical Sciences, and principal investigator on a new National Institutes of Health grant, An Ontology for Pain and Related Disability, Mental Health and Quality of Life.

"The patient has to describe what he or she is feeling."

That is a serious shortcoming, Ceusters says, because each patient's subjective experience of pain is different. Descriptions of pain therefore lack the precision and specificity that are taken for granted with other disorders, where biomarkers or physiological indicators reveal what health-care providers need in order to assess the severity of a particular disorder.

"If we want to more effectively help people suffering from chronic pain, we need to study a population that is consistent, patients who have features in common," Ceusters says. "The problem with pain is, it's very hard to build up a group with the same sort of pain. People don't have the same vocabulary or linguistic capabilities or even the same cultural backgrounds. It's something pain researchers have struggled with for decades," Ceusters says. "We need to develop a vocabulary of pain."

That's where ontology comes in.

"The philosophical definition of ontology is the study of things that exist and how they relate to each other," says Ceusters, who also is director of the Ontology Research Group of UB's New York State Center of Excellence in Bioinformatics and Life Sciences. "I am a person and you are a person so we share something. Suppose I drop dead. What lies on the floor? Is that still a person? If it is no longer a person, is it still the very same thing that was sitting here as a person but now is a corpse?"

Ceusters says that in much the same way, definitions of pain and especially of chronic pain need to be much more precise; ontology provides methods of distinguishing among categories and describing data in uniform and formal ways.

While the philosophical approach to ontology naturally has its roots in ancient Greece, a computational approach to ontology began in the latter part of the 20th century, when computer scientists interested in artificial intelligence wanted to create software programs that perform reasoning the way humans do. To do so, they began to draw on ontology.

"Here at the University at Buffalo, we excel at combining the two approaches; we have a very strong foundation in the philosophical approach to ontology with Barry Smith, who is a pioneer in contemporary ontology, especially related to biomedical applications," says Ceusters, "while we also have a very strong presence in computational approaches, especially to biomedical ontology. These computational approaches allow us to devise systems of communication in which there is a consistent meaning for terms used in different language systems and conceptual frameworks."

With the $793,571 NIH grant, Ceusters and colleagues will study data gathered from thousands of patients in the U.S., the United Kingdom, Sweden, Israel and Germany who suffer from oral and facial pain, including temporomandibular disorder (TMD).

Ceusters will work with his colleagues, including Richard Ohrbach, DDS, PhD, associate professor of oral diagnostic sciences in the UB School of Dental Medicine, to develop an ontology that allows the data to be described in a much more uniform way.

"The goal is to integrate the data together so that we have a large pool of data that will allow us to obtain better insight into the complexity of pain disorders, specifically the assessment of pain disorders and how they impact mental health and a patients' quality of life," Ceusters says.

The grant will build on past work that Ceusters conducted with a grant from the Oishei Foundation related to improving the classification, diagnosis and treatment of psychiatric conditions.

Ceusters, who has degrees in knowledge engineering and information science as well as in neuropsychiatry, says that the current effort grew out of his work on that grant and also from a meeting with pain researchers that he attended in 2009.

"At that meeting, we discussed how we might build an ontology so that it could represent what pain is and how it relates to body parts and their activities and functions," he says. "Our goal is to create a software program that will allow all pain specialists to express themselves in crystal clear terms," he says, "We will create a symptom checklist that can be understood by computers. We have to define the terminology of pain. This can only be solved by the kind of ontology we are doing here at the University at Buffalo."


Eliminating protein in specific brain cells blocks nicotine reward

ScienceDaily (July 26, 2011) — Removing a protein from cells located in the brain's reward center blocks the anxiety-reducing and rewarding effects of nicotine, according to a new animal study in the July 27 issue of The Journal of Neuroscience. The findings may help researchers better understand how nicotine affects the brain.

Nicotine works by binding to proteins called nicotinic receptors on the surface of brain cells. In the new study, researchers led by Tresa McGranahan, Stephen Heinemann, PhD, and T. K. Booker, PhD, of the Salk Institute for Biological Studies, found that removing a specific type of nicotinic receptor from brain cells that produce dopamine -- a chemical released in response to reward -- makes mice less likely to seek out nicotine. The mice also did not show reductions in anxiety-like behaviors normally seen after nicotine treatment. Smokers commonly report anxiety relief as a key factor in continued smoking or relapse.

"These findings show that the rewarding and anxiety-reducing properties of nicotine, thought to play a key role in the development of tobacco addiction, are related to actions at a single set of brain cells," said Paul Kenny, PhD, an expert on drug addiction at Scripps Research Institute, who was unaffiliated with the study.

Previous studies showed that blocking the alpha4 nicotinic receptor within the ventral tegmental area (VTA) -- a brain region important in motivation, emotion, and addiction -- decreases the rewarding properties of nicotine. Because alpha4 receptors are present on several cell types in the VTA, it was unclear how nicotine produced pleasurable feelings.

To zero in on the circuit important in the brain's response to nicotine, researchers developed mice with a mutation that left them unable to produce the alpha4 receptor, but only on dopamine brain cells. Mice lacking alpha4 receptors in these cells spent less time looking to obtain nicotine compared with normal mice, suggesting the alpha4 receptors are required for the rewarding effects of nicotine. Nicotine also failed to reduce anxiety-like behaviors in the mutant mice, as it normally does in healthy mice.

"Identification of the type of nicotinic receptors necessary for two key features of nicotine addiction -- reward and anxiety -- may help us better understand the pathway that leads to nicotine dependence, and potential treatment for the one billion cigarette smokers worldwide," McGranahan said. Diseases from tobacco use remain a major killer throughout the world, causing more than 5 million deaths per year.

The findings could guide researchers to a better understanding of the mechanisms of tobacco addiction and assist in the development of new drugs to treat tobacco addiction and provide relief from anxiety disorders, Kenny added.

The research was supported by the National Institute of Neurological Disorders and Stroke, the National Institute on Alcohol Abuse and Alcoholism, and the National Institute on Drug Abuse.


Vitamin D relieves joint, muscle pain for breast cancer patients

ScienceDaily (July 26, 2011) — High-dose vitamin D relieves joint and muscle pain for many breast cancer patients taking estrogen-lowering drugs, according to a new study from Washington University School of Medicine in St. Louis.

The drugs, known as aromatase inhibitors, are commonly prescribed to shrink breast tumors fueled by the hormone estrogen and help prevent cancer recurrence. They are less toxic than chemotherapy, but for many patients, the drugs may cause severe musculoskeletal discomfort, including pain and stiffness in the hands, wrists, knees, hips, lower back, shoulders and feet.

"About half of patients can experience these symptoms," says Antonella L. Rastelli, MD, assistant professor of medicine and first author of the study published online in the journal Breast Cancer Research and Treatment. "We don't know exactly why the pain occurs, but it can be very debilitating -- to the point that patients decide to stop taking aromatase inhibitors."

Because the drugs reduce cancer recurrence, finding a way to help patients stay on them is important for long-term, relapse-free survival, according to Rastelli. Aromatase inhibitors are prescribed to post-menopausal women for at least five years and often longer after a breast cancer diagnosis. There is some evidence that patients who experience the drugs' side effects are less likely to see their cancer return, providing even more incentive to help these patients continue taking them.

It was Rastelli's colleague, Marie E. Taylor, MD, assistant professor of radiation oncology, who first noticed that patients on aromatase inhibitors who experienced this pain found some relief from high doses of vitamin D.

So Rastelli's group recruited 60 patients who reported pain and discomfort associated with anastrozole, one of three FDA-approved aromatase inhibitors. The patients they studied also had low vitamin D levels. Half the group was randomly assigned to receive the recommended daily dose of vitamin D (400 international units) plus a 50,000-unit vitamin D capsule once a week. The other half received the daily dose of 400 units of vitamin D plus a weekly placebo. All subjects received 1,000 milligrams of calcium daily throughout the study.

Patients in the study reported any pain they experienced through three different questionnaires. They were asked to quantify their pain intensity, as well as report how much the pain altered their mood, affected their work and interfered with relationships and daily activities. The results show that patients receiving high-dose vitamin D every week reported significantly less musculoskeletal pain and also were less likely to experience pain that interfered with daily living.

"High-dose vitamin D seems to be really effective in reducing the musculoskeletal pain caused by aromatase inhibitors," Rastelli says. "Patients who get the vitamin D weekly feel better because their pain is reduced and sometimes goes away completely. This makes the drugs much more tolerable. Millions of women worldwide take aromatase inhibitor therapy, and we may have another 'tool' to help them remain on it longer."

Like anastrozole used in this study, the other two FDA-approved aromatase inhibitors, letrozole and exemestane, also cause musculoskeletal pain. Given the similar side effects, Rastelli says patients on these drugs may also benefit from high-dose vitamin D.

The vitamin used in this study is a plant-derived type called vitamin D2. Rastelli says it achieves the best results when given weekly because the body metabolizes it within seven to 10 days. Rastelli and her colleagues did not use high-dose vitamin D3, which remains in the body longer.

"This was a very carefully conducted study, and the placebo control makes the findings quite compelling," says Matthew J. Ellis, MD, PhD, the study's senior author and director of the Breast Cancer Program at the Alvin J. Siteman Cancer Center at Barnes-Jewish Hospital and Washington University School of Medicine in St. Louis. "We should follow up these findings further to determine the most efficacious and safe approach to vitamin D supplementation in our breast cancer patients."

Since vitamin D helps the body absorb calcium, too much of it can cause high levels of calcium in the urine, which may increase the risk of kidney stones. Such possible side effects emphasize the importance of tracking urine calcium levels in patients taking high-dose vitamin D.

"It's important to monitor the patients, but overall it appears to be very safe," Rastelli says. "Because vitamin D2 is eliminated from the body so quickly, it's very hard to overdose."

In addition to relieving pain, the group wanted to examine whether vitamin D could protect against the bone loss often seen in patients taking aromatase inhibitors. The researchers measured each patient's bone density at the beginning of the study and again after six months.

Perhaps because of its role in calcium absorption, high-dose vitamin D did appear to help maintain bone density at the neck of the femur, the top of the thighbone near the hip joint. Although the effect did not reach statistical significance, Rastelli calls the result promising and worth further study.

"It's great that we have something as simple as vitamin D to help patients alleviate some of this pain," Rastelli says. "It's not toxic -- it doesn't cause major side effects. And if it is actually protecting against bone loss, that's even better."

The study was supported by AstraZeneca, which makes the aromatase inhibitor anastrozole under the brand name Arimidex.


Thursday, September 8, 2011

Seeing the wood for the trees: New study shows sheep in tree-ring records

ScienceDaily (July 26, 2011) — Nibbling by herbivores can have a greater impact on the width of tree rings than climate, new research has found. The study, published this week in the British Ecological Society's journal Functional Ecology, could help increase the accuracy of the tree ring record as a way of estimating past climatic conditions.

Many factors in addition to climate are known to affect the tree ring record, including attack from parasites and herbivores, but determining how important these other factors have been in the past is difficult.

Working high in the mountains of southern Norway, midway between Oslo and Bergen, a team from Norway and Scotland fenced off a large area of mountainside and divided it into sections, releasing a set density of domestic sheep into each section every summer.

After nine summers, cross sections of 206 birch trees were taken and tree ring widths were measured. Comparing these widths with local temperature and with the number of sheep at the location where each tree was growing allowed the team to disentangle the effects of temperature and of browsing by sheep on tree ring width.
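As a rough illustration of that disentangling step -- not the authors' actual statistical model -- ring width can be regressed on both temperature and sheep density at once, so each effect is estimated while holding the other constant. The data file and column names below are hypothetical stand-ins:

    # Illustrative only: separate temperature and browsing effects on ring width.
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical table: one row per tree, with its mean ring width, the
    # site's summer temperature, and the sheep density in its enclosure.
    rings = pd.read_csv("birch_rings.csv")
    model = smf.ols("ring_width_mm ~ summer_temp_c + sheep_per_km2",
                    data=rings).fit()
    print(model.summary())  # compare the size and significance of the two effects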

According to lead author Dr James Speed of the NTNU Museum of Natural History and Archaeology: "We found tree ring widths were more affected by sheep than by the ambient temperature at the site, although the temperature signal was still visible in the tree ring records. This shows that the density of herbivores affects the tree ring record, at least in places with slow-growing trees."

The impact of large herbivores on tree rings has, until now, been largely unknown, so these findings could help increase the accuracy of the tree ring record as a way of estimating past climatic conditions, says Dr Speed: "Our study highlights that other factors interact with climate to affect tree rings, and that to increase the accuracy of the tree ring record to estimate past climatic conditions, you need to take into account the history of wild and domestic herbivores. The good news is that past densities of herbivores can be estimated from historic records, and from the fossilised remains of spores from fungi that live on dung."

"This study does not mean that using tree rings to infer past climate is flawed as we can still see the effect of temperatures on the rings, and in lowland regions tree rings are less likely to have been affected by herbivores because they can grow out of reach faster," he explains.

Tree rings give us a window into the past, and have been widely used as climate recorders since the early 1900s. The growth rings are visible in tree trunk cross sections, and are formed in seasonal environments as the wood is laid down faster in summer than winter. In years with better growing conditions (in cool locations this usually means warmer) tree rings are wider, and because trees can be very long-lived and wood is easily preserved, for example in bogs and lakes, this allows very long time-series to be established, and climatic conditions to be estimated from the ring widths.

The study was funded by the Norwegian Research Council and the Norwegian Directorate for Nature Management.


Concern over intensive treatment for patients with Type 2 diabetes

ScienceDaily (July 26, 2011) — Doctors should be cautious about prescribing intensive glucose lowering treatment for patients with type 2 diabetes as a way of reducing heart complications, concludes a new study published online in the British Medical Journal.

French researchers found that intensive glucose lowering treatment, which is widely used for people with type 2 diabetes to reduce their heightened risk of cardiovascular disease, showed no benefit on all-cause or cardiovascular mortality.

Globally, there were an estimated 150 million adults with diabetes in 2000 and this is expected to rise to 366 million by 2030. People with type 2 diabetes are twice as likely to have cardiovascular disease as non-diabetics and are also more at risk of microvascular complications (damage to small blood vessels).

Glycaemic lowering therapies are commonly used to treat people with type 2 diabetes to prevent long term cardiovascular complications and renal and visual impairment, but previous studies have not shown clear and universal benefits of the treatment.

So a team, led by Catherine Cornu at the Louis Pradel Hospital in Bron, France, reviewed studies that looked at microvascular complications and cardiovascular events related to the intensity of glycaemic control and the quality of trials.

They analysed 13 studies involving 34,533 patients of whom 18,315 were given intensive glucose lowering treatment and 16,218 given standard treatment.

They found that intensive glucose treatment did not significantly affect all-cause mortality or cardiovascular death.

There was, however, a 15% reduction in the risk of non-fatal heart attacks following intensive treatment and a 10% reduction in microalbuminuria -- an indication of kidney problems and heart disease -- but a more than two-fold increase in the risk of severe hypoglycaemia (dangerously low blood glucose levels).

The researchers calculated that over a five-year treatment period, 117 to 150 patients would need to be treated to avoid one heart attack, 32 to 142 to avoid one case of microalbuminuria, and 15 to 52 to avoid one severe hypoglycaemic event.
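These "number needed to treat" figures are the reciprocal of the absolute risk reduction. A minimal sketch of the arithmetic, using a hypothetical baseline risk (the trials' actual event rates are not quoted here):

    # NNT = 1 / ARR, where ARR = control event rate - treated event rate.
    def nnt(control_risk: float, treated_risk: float) -> float:
        return 1.0 / (control_risk - treated_risk)

    baseline = 0.05                  # assumed 5-year risk of non-fatal heart attack
    treated = baseline * (1 - 0.15)  # the reported 15% relative risk reduction
    print(round(nnt(baseline, treated)))  # 133, inside the reported 117-150 range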

They conclude: "Intensive glucose lowering treatment of type 2 diabetes should be considered with caution and therapeutic escalation should be limited."

In an accompanying editorial, UK experts state that clinicians should consider the absolute risks and benefits of more intensive therapy carefully on an individual patient basis to determine the most sensible treatment strategy.


One in six fast-food customers cut calories after US food labeling system introduction

ScienceDaily (July 26, 2011) — Around a sixth of fast food customers used calorie information and, on average, bought food with fewer calories after the introduction of a labelling system in the US, says a new study published on the British Medical Journal website.

US researchers found there has been a small but positive impact from a law introduced in 2008 in New York requiring chain restaurants with 15 or more branches nationally to provide calorie information on menus and menu boards in the city.

Obesity rates in the US are at an all-time high in both adults and children: currently a third of adults and 17% of children and teenagers are obese. Several studies support an association between fast food consumption and excessive energy intake, but customers often underestimate the number of calories in restaurant meals, and before 2007, nutrition information was seldom available at the point of purchase.

So a team of researchers decided to assess the impact of the calorie labelling regulation on the energy content of individual purchases at fast food restaurants in New York City. High street chains in England are about to embark on a similar, though voluntary, scheme as part of the government's Public Health Responsibility Deal.

Surveys were carried out during lunchtime hours in spring 2007 (one year before the regulation) and in spring 2009 (nine months after its implementation) at 168 randomly selected locations of the top 11 fast food chains in the city.

Adult customers provided register receipts and answered survey questions. Data from 7,309 customers in 2007 and 8,489 customers in 2009 were analysed.

Overall, there was no decline in calories purchased across the full sample. However, three major chains saw significant reductions.

For example, at McDonald's, average energy per purchase fell by 5.3%; at Au Bon Pain, it fell by 14.4%; and at KFC, it dropped by 6.4%. Together, these three chains represented 42% of all customers in the study.

However, average energy content increased by 17.8% at one chain -- Subway -- where large portions were heavily promoted.

Analysis also showed that 15% of customers reported using the calorie information and, on average, these customers purchased 106 fewer kilocalories than customers who did not see or use the calorie information.
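A minimal sketch of that comparison -- not the authors' analysis; the file and column names are hypothetical:

    # Compare mean calories purchased by customers who reported using the
    # posted information against those who did not. Illustrative only.
    import pandas as pd
    from scipy import stats

    receipts = pd.read_csv("receipts_2009.csv")  # hypothetical: one row per customer
    users = receipts.loc[receipts["used_calorie_info"], "kcal"]
    nonusers = receipts.loc[~receipts["used_calorie_info"], "kcal"]
    print(users.mean() - nonusers.mean())                     # reported gap: about -106 kcal
    print(stats.ttest_ind(users, nonusers, equal_var=False))  # Welch's t-test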

The researchers say that calorie labelling is only one part of a framework to address the obesity epidemic and call for additional strategies to reduce energy intake on a population basis. "Special attention should be focused on educating customers on how to interpret and use nutrition information," they conclude.

In an accompanying editorial, Dr Susan Jebb from the MRC Human Nutrition Research Centre in Cambridge believes that labelling is a step forward, but changes in food supply must follow. She writes: "Calorie labelling will help consumers make an informed choice about what they eat, but sustained improvements in the nation's diet will require a transformation of the food supply too."


Wednesday, September 7, 2011

Changes in attention and visual perception are correlated with aging: Older people find it harder to see the wood for the trees

ScienceDaily (July 25, 2011) — When looking at a picture of many trees, young people will tend to say: "This is a forest." However, the older we get, the more likely we are to notice a single tree before seeing the forest. This suggests that the speed at which the brain processes the bigger picture is slower in older people. In a new study published in the July-August issue of Elsevier's Cortex, researchers have found that these age-related changes are correlated with a specific aspect of visual perception, known as Gestalt perception.

Markus Staudinger, together with Gereon R. Fink, Clare E. Mackey, and Silke Lux, investigated the brain's ability to focus on the local and global aspects of visual stimuli, in a group of young and elderly healthy subjects. They also studied how this ability is related to Gestalt perception, which is the mind's tendency to perceive many similar smaller objects as being part of a bigger entity. As expected, older people found it more difficult to concentrate on the global picture, but they also had trouble with the Gestalt principle of Good Continuation -- the mind's preference for continuous shapes.

Participants in the study were shown groups of letters arranged in a pattern so that they formed a larger letter (see below), and were asked whether a given letter appeared on the local or global level. Importantly, the number of small letters forming the pattern was then varied. Usually, the smaller the letters in the pattern, the easier it is to perceive the larger letter, and this was indeed true for the younger participants in the study. However, varying the number of letters did not help the older people, who remained slower to notice the global figure.

These findings provide the first evidence that changes in attention -- meaning, the ability to concentrate on one thing while ignoring others -- and in Gestalt perception are correlated with healthy aging. More generally, they show that there may be age-related changes in different cognitive domains which interact. Furthermore, the results help us understand which specific aspects of visual perception become impaired in healthy aging.

Example stimulus: small letter E's arranged to form a large letter F:

E E E E E
E
E
E E E E
E
E
E
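Stimuli of this kind are straightforward to generate programmatically. A small sketch (the F-shaped grid mirrors the figure above; it is an assumption, not the study's actual stimulus set):

    # Render a Navon-style figure: a large "global" letter built from small
    # "local" letters. Changing the grid changes how many local letters
    # compose the global form, as varied in the experiment.
    F_SHAPE = [  # 1 = draw the local letter, 0 = leave blank
        "11111",
        "10000",
        "10000",
        "11110",
        "10000",
        "10000",
        "10000",
    ]

    def navon(local_letter: str, shape=F_SHAPE) -> str:
        return "\n".join(
            " ".join(local_letter if c == "1" else " " for c in row)
            for row in shape
        )

    print(navon("E"))  # prints a large F made of E's, like the figure above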


Beetles play an important role in reducing weeds

ScienceDaily (July 25, 2011) — Researchers funded by the UK Biotechnology and Biological Sciences Research Council and the French Institut National de la Recherche Agronomique (INRA) have found that ground beetles reduce the amount of weed seeds in the soil. Weeds reduce crop yields, and these findings support the need to conserve farmland biodiversity, which plays an important role alongside herbicides in controlling weeds and improving food security.

This research confirms a long-held belief by scientists that ground beetles play a role in weed control. Dr David Bohan of Rothamsted Research, who led the research, said: "Seed predation by naturally occurring beetles in farmland does have a beneficial effect, reducing weed numbers in fields and potentially improving agricultural productivity."

The study, to be published in the August edition of the Journal of Applied Ecology, used data from 257 conventionally managed fields throughout the UK to determine the effect that ground beetles have on the number of weed seeds in the soil of sugar beet, maize, and spring and winter oilseed rape fields.

The researchers found that grass weeds were reduced more than other weeds, which is important because many UK farms have severe grass weed problems. Some of these species are increasingly resistant to herbicides and have a major impact on productivity as they compete with the crop for resources, leading to lower yields. Policy-driven reduction in herbicide use could lead to higher numbers of weeds in fields, so alternatives to herbicides have the potential for significant impact.

Ground beetles appear to eat a significant proportion of the weed seeds that would otherwise go into the soil. With the right management, ground beetles could be used to replace some herbicide applications and significantly reduce weed populations. 'Beetle banks', which involve leaving an area of a field as a wildlife habitat, are already supported under the Environmental Stewardship schemes available to farmers.

Professor Douglas Kell, Chief Executive of BBSRC, said: "We have a challenge to feed 9 billion people by 2050 and to do so we must engage in research now that will underpin improvements in yield and sustainability of farming in the future. By studying whole biological systems such as farm ecosystems we can spot the various contributions made by different aspects of a system, including these beetles. This project shows that the balance of farm ecosystems can be vital to ensuring sustainability in farming in the future. It also makes the link between biodiversity and food security very clear."


Heavy metal: Titanium implant safety under scrutiny

ScienceDaily (July 25, 2011) — A new strategy to quantify the levels of titanium in the blood of patients fitted with titanium orthopaedic implants is presented in Analytical and Bioanalytical Chemistry, a Springer journal. Yoana Nuevo-Ordóñez and colleagues of the Sanz-Medel research group from the University of Oviedo in Spain have developed a highly sensitive method to determine the levels of titanium in human blood, establishing a baseline for natural levels of titanium in untreated individuals as well as measuring levels in patients with surgical implants.

Titanium implants are routinely used for bone fractures as well as dental work. It has recently been shown that titanium-based implants both corrode and degrade, generating metallic debris. There is some concern over the increased concentrations of circulating metal-degradation products derived from these implants, and their potential harmful biological effects over a period of time, including hepatic injury and renal lesions. In order to assess the implications of these 'leaks', it is essential to accurately measure the basal, normal levels of titanium in the bloodstream, as well as quantify how much higher levels are in patients with implants.

Nuevo-Ordóñez and team collected blood from 40 healthy individuals and 37 patients with titanium implants -- 15 had tibia implants, eight had femur implants, and 14 had humerus implants (eight internal and six external fixation implants). They used their new method, based on isotope dilution analysis and mass spectrometry, or IDA-ICP-MS, to analyze the blood samples.
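In outline, isotope dilution works by adding a known amount of an isotopically enriched spike to the sample and measuring the isotope ratio of the resulting blend. A simplified sketch of the core mass balance (a published method like this one would add mass-bias and abundance corrections):

    # R is the (reference isotope)/(spike isotope) amount ratio in each
    # material: high in the natural sample, low in the enriched spike,
    # intermediate in the measured blend.
    def spike_isotope_from_sample(n_spike: float, r_sample: float,
                                  r_spike: float, r_blend: float) -> float:
        """Amount of the spike isotope contributed by the sample:
        n_sample = n_spike * (R_spike - R_blend) / (R_blend - R_sample),
        from a mass balance on both isotopes in the blend. Total titanium
        then follows from the sample's natural isotopic abundances."""
        return n_spike * (r_spike - r_blend) / (r_blend - r_sample)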

They found that control individuals had very low levels of titanium in the blood whereas titanium concentrations were significantly higher for all the patients with implants. The sensitivity of the method was such that the researchers were also able to show significant differences in titanium levels for different types of bone fixation devices. The more invasive implants shed more metallic debris into the blood than the external, superficial designs. The work also identified how the titanium from the implants is transported in the bloodstream and potentially distributed and accumulated.

The authors conclude: "The simplicity of the methodology based on isotope dilution analysis and the accuracy and precision of the obtained results should encourage the use of the proposed strategy on a routine basis."


Tuesday, September 6, 2011

Shuttle service in cells

ScienceDaily (July 25, 2011) — Research scientists at the Ruhr University Bochum have discovered a new enzyme, which gives decisive insights into protein import in specific cellular organelles (peroxisomes). In the Journal of Biological Chemistry, the team of Prof. Erdmann (Medical Faculty, Department of Systemic Biochemistry) reports that the enzyme Ubp15p collaborates with two other proteins to convert the protein transport machinery back into its initial condition after work has been completed.

The enzyme detaches a specific signal from a protein, a step that is important for the transport and recycling of that protein. A new round of protein transport can then begin. "With Ubp15p we could unravel a further mystery concerning the transport of proteins into peroxisomes," explains Prof. Erdmann. "The comprehension of these organelles at a molecular level is a decisive prerequisite for the development of new diagnostic and therapeutic approaches for patients with peroxisomal disorders, who only seldom survive the first year of their life."

Shuttling to the peroxisome

Peroxisomes are multifunctional "tools." They are involved, for example, in the catabolism of fatty acids, and detoxify poisonous hydrogen peroxide. A malfunction of these organelles, as is the case in Zellweger Syndrome disorders, can have disastrous influences on the functioning of the liver, kidneys and brain. To be able to function correctly, peroxisomes need specific proteins, but they cannot produce these themselves. Thus, a shuttle system consisting of several receptors has to import them from the cytosol. The receptors recognize the proteins specified for the peroxisomes within the cytosol and escort them to their destination. Here they bond with the membrane of the peroxisome and form part of the "gate" through which the proteins are transported into the interior. An export signal (ubiquitin) is attached to the receptors, which ensures that they are released from the peroxisome membrane and available for transport yet again. What subsequently happens to the ubiquitin signal remains to be clarified.

New export components discovered

In an earlier publication in Nature Cell Biology, Prof. Erdmann's team had already described two motor proteins that withdraw the ubiquitin-marked receptor Pex5p from the membrane and transport it back into the cytosol. In a further paper (Nature Reviews Molecular Cell Biology), they postulated that this export of the receptor is mechanistically linked to the import of the peroxisomal protein. To date, it has however not been possible to detect the ubiquitin together with Pex5p in the cytosol. "We thus assumed that the ubiquitin is removed from the receptor during or shortly after export," states Prof. Erdmann. His team, funded by the collaborative research center 642 of the German Research Foundation (Sonderforschungsbereich 642 der Deutschen Forschungsgemeinschaft), has now established that the enzyme Ubp15p disconnects the export signal and collaborates with the two motor proteins to remove the receptor from the membrane of the peroxisome.

Enzyme could be important for recycling

The scientists managed to locate Ubp15p in living yeast cells and to prove that the enzyme comes into direct contact with one of the motor proteins to reach the peroxisomes. When Prof. Erdmann's team deactivated the Ubp15p in the cells, the amount of ubiquitinated Pex5p increased. This result confirms the role of Ubp15p in cleaving the ubiquitin signal. The enzyme seems to have an important function in the import of proteins into the peroxisomes, particularly under stress conditions. "Ubp15p appears to play a vital role in the recycling of the receptor," points out Prof. Erdmann.


Newly discovered gene sheds light on the evolution of life on Earth

ScienceDaily (July 25, 2011) — A chance discovery of a genetic mutation in wild barley that grows in Israel's Judean Desert, made in the course of a doctoral study at the University of Haifa, has led to an international study deciphering the evolution of life on land. The study has been published in the journal PNAS.

"Life on Earth began in the water, and in order for plants to rise above water to live on land, they had to develop a cuticle membrane that would protect them from uncontrolled evaporation and dehydration. "In our study we discovered a completely new gene that along with other genes contributes to the formation of this cuticle," said Prof. Eviatar Nevo of the Institute of Evolution of the University of Haifa, who took part in the study.

In the course of doctoral research carried out by Guoxiong Chen, which began at the University of Haifa in 2000 under the supervision of Prof. Nevo, the Chinese doctoral student found a wild barley mutant in the Judean Desert that was significantly smaller than regular wild barley. This mutation causes an abnormal increase in water loss by disrupting the production of the plant's cutin, which is secreted from the epidermal cells and is a component of the plant's cuticle that reduces water loss and prevents dehydration.

Guoxiong Chen has since returned to China and attained a full professorship while continuing his study of the Judean Desert's wild barley, for which he enlisted an international team of scholars from China, Japan, Switzerland and Israel. After about eight years of research, this team discovered a new gene that contributes to the production of cutin, which is found in all land plants but is either nonexistent or present in tiny amounts in aquatic plants. Chen named this new gene Eibi1, in honor of his supervisor, Prof. Nevo.

"This is one of the genes that contributed to the actual eventuality of life on land as we know it today. It is a key element in the adaptation process that aquatic plants underwent in order to live on land," explained Prof. Nevo. Besides the evolutionary importance of this new gene, it is also of value in the future enhancement of cereals. According to Prof. Nevo, once we can fully understand the mechanism behind the production of cutin and discover genetic variants of the Eibi1 gene, we will have the ability to enhance the cuticle formation of wheat and barley species so as to make them more resistant to water loss and more durable in the dryer conditions on land. "Genetic enhancement of cultivated plants to make them durable in dry and saline conditions can increase food production around the world," the researcher concluded.


Diabetes mortality rates in status Aboriginal adults in Alberta, Canada, concerning

ScienceDaily (July 25, 2011) — Diabetes rate increases in status Aboriginal adults in Alberta appear to be slowing compared with the general population, although diabetes is more common in status Aboriginals and death rates for this group are significantly higher than in the general population, states an article in CMAJ (Canadian Medical Association Journal). Death rates have in fact remained unchanged for status Aboriginals who do not have diabetes.

Diabetes is increasing in virtually all populations world-wide. It is common in Aboriginals in Canada, with estimated rates of type 2 diabetes and its complications two to five times higher than in the general population. Most information available regarding diabetes in Aboriginals concerns status Aboriginals (First Nations) as opposed to Métis or Inuit peoples, about whom much less is known, and most of the information is static. There is little data on the long-term diabetes trends in Aboriginal populations.

Aboriginal and non-Aboriginal researchers at the Universities of Alberta and Calgary sought to compare the incidence and prevalence of diagnosed diabetes in adult status Aboriginals and the general adult population in Alberta between 1995 and 2007. They also looked at mortality from any cause in those with and without diabetes. In 2007, there were 2,506,420 adults living in Alberta, of whom 72,725 were status Aboriginals; there were 161,268 cases of diabetes among all adults and 7,055 among status Aboriginals (crude prevalences of roughly 6.4 and 9.7 percent, respectively). Diabetes rates were higher for status Aboriginals than for the general population, but over time the increase in diabetes prevalence for status Aboriginals was less than that of the general population.

"Increases in the prevalence and incidence of diabetes from 1995 to 2007 were less pronounced in the Aboriginal population than in the general population," writes PhD student Richard Oster, University of Alberta, with coauthors. However, Oster does note that among status Aboriginals diabetes was increasing more quickly in men than in women.

Over the same period, mortality rates in all people with diabetes were decreasing, and the rates of decrease did not differ between status Aboriginals and the general population. However, mortality rates for status Aboriginals were 1.5 to 2 times higher than those of the general population, and in those without diabetes the mortality gap was widening.

"The decreases in mortality observed among status Aboriginal adults with diabetes over the study period are consistent with findings from a recent study of ours showing improved diabetes-related health among status Aboriginal adults in Alberta," says Dr. Ellen Toth, the lead investigator. She adds, "the growing divergence in mortality observed between status Aboriginals and the general population without diabetes is sadly in contrast to national trends from 1980 to 2001, which showed an improvement in life expectancy among registered Indians, from 60.9 to 70.4 years among men and from 68.0 to 75.5 years among women."

The study also noted that the difference in diabetes rates between status Aboriginals and the general population was smaller in Alberta than in Saskatchewan, Manitoba and Ontario, although the study does not explain these differences.

The researchers suggest more research is necessary to understand why the difference in diabetes rates between status Aboriginals and the general population is smaller in Alberta than in other provinces, as well as the unchanged or increasing mortality rates over 12 years in status Aboriginals without diabetes.


Monday, September 5, 2011

Comprehensive immigrant and refugee health guidelines new resource for Canadian physicians

ScienceDaily (July 25, 2011) — The largest, most comprehensive evidence-based guidelines to immigrant health -- designed to help Canadian physicians meet the unique needs of this group -- are being published in CMAJ (Canadian Medical Association Journal).

Immigrant and refugee health needs may differ significantly from those of Canadian-born people, as immigrants and refugees may have been exposed to different diseases, environments and living conditions, and may differ in genetic factors.

The guidelines, based on evidence from around the world, are focused on helping primary care physicians provide for the often complex health needs of immigrants and refugees. Created by the Canadian Collaboration for Immigrant and Refugee Health, the project involved more than 150 investigators, including 43 family doctors, 34 researchers, staff and nurse practitioners as well as other authors.

The size of Canada's immigrant population is growing but there is a lack of evidence-based information on approaches to immigrant health. Worldwide, there are more than 200 million international migrants whose movement across borders has significant health impacts for many countries. While health task forces in Canada and the US have developed clinical prevention recommendations, they are not directly tailored toward the unique backgrounds and needs of immigrants and refugees.

"Use of evidence-based methods has yet to substantially affect the field of migration medicine," writes Dr. Kevin Pottie, University of Ottawa, with coauthors. "Our evidence reviews synthesized data from around the world, and our recommendations focus on immigrants, refugees and refugee claimants, with special attention given to refugees, women and the challenges of integrating recommendations into primary care," he states.

"Our recommendations differ from other guidelines because of our insistence on finding evidence for clear benefits before recommending routine interventions," state the authors. For example, in the case of possible intestinal parasites but no symptoms, the guidelines recommend blood testing for certain parasites and forgoing traditional stool testing, marking a shift in practice.

The package includes a summary document, clinical guidelines to immigrant health, online case studies and detailed evidence and methodologies. Content focuses on four areas: infectious diseases; mental health and physical and emotional maltreatment; chronic and noncommunicable diseases; and women's health. Detailed indexes on specific illnesses and conditions including post-traumatic stress, mental health, pediatric issues and more make it easy for physicians to find information.

The first few sections of the guidelines were published in CMAJ online in June 2010. This is now the comprehensive package of the full guidelines.

"More work must be done to improve immigrants' access to health services," conclude the authors. "We hope this evidence-based initiative will provide a foundation for improved preventive health care for immigrant populations."


Artificial lung mimics real organ's design and efficiency: Small device works with air, pure oxygen not needed

ScienceDaily (July 25, 2011) — An artificial lung built by Cleveland researchers has reached efficiencies akin to the genuine organ, using air -- not pure oxygen, as current human-made lungs require -- as the source of the essential element.

Use in humans is still years away, but for the 200 million lung disease sufferers worldwide, the device is a major step toward creating an easily portable and implantable artificial lung, said Joe Potkay, a research assistant professor in electrical engineering and computer science at Case Western Reserve University. Potkay is the lead author of the paper describing the device and research, in the journal Lab on a Chip.

The scientists built the prototype device by following the natural lung's design and tiny dimensions. The artificial lung is filled with breathable silicone rubber versions of blood vessels that branch down to a diameter less than one-fourth that of a human hair.

"Based on current device performance, we estimate that a unit that could be used in humans would be about 6 inches by 6 inches by 4 inches tall, or about the volume of the human lung. In addition, the device could be driven by the heart and would not require a mechanical pump," Potkay said.

Current artificial lung systems require heavy tanks of oxygen, limiting their portability. Due to their inefficient oxygen exchange, they can be used only on patients at rest, and not while active. And, the lifetime of the system is measured in days.

The Cleveland researchers focused first on improving efficiency and portability.

Potkay, who specializes in micro- and nano-technology, worked with Brian Cmolik, MD, an assistant clinical professor at Case Western Reserve School of Medicine and researcher at the Advanced Platform Technology Center and the Cardiothoracic Surgery department at the Louis Stokes Cleveland VA Medical Center. Michael Magnetta and Abigail Vinson, biomedical engineers and third-year students at Case Western Reserve University School of Medicine, joined the team and helped develop the prototype during the past two years.

The researchers first built a mould with miniature features and then layered on a liquid silicone rubber that solidified into artificial capillaries and alveoli, and separated the air and blood channels with a gas diffusion membrane.

By making the parts on the same scale as the natural lung, the team was able to create a very large surface-area-to-volume ratio and shrink the distances for gas diffusion compared with the current state of the art. Tests using pig blood show oxygen exchange efficiency three to five times better, which enables the device to use plain air instead of pure oxygen as the ventilating gas.
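
To see why the miniaturization matters, consider Fick's law of diffusion, under which gas flux grows with exchange area and shrinks with diffusion distance. The sketch below is a generic illustration with made-up numbers, not the team's model:

```python
# Generic Fick's-law illustration (hypothetical numbers, not the team's model):
# gas flux is proportional to exchange area divided by diffusion distance.

def relative_flux(area: float, distance: float) -> float:
    """Relative gas flux through a membrane: flux ~ area / diffusion distance."""
    return area / distance

# Shrinking feature size raises surface area per unit volume and shortens the
# diffusion path, so flux per unit device volume rises on both counts.
baseline = relative_flux(area=1.0, distance=1.0)
miniaturized = relative_flux(area=2.0, distance=0.5)  # 2x area, half the path
print(miniaturized / baseline)  # -> 4.0
```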

Potkay's team is now collaborating with researchers from Case Western Reserve's departments of biomedical engineering and chemical engineering to develop a coating to prevent clogging in the narrow artificial capillaries and on construction techniques needed to build a durable artificial lung large enough to test in rodent models of lung disease.

Within a decade, the group expects to have human-scale artificial lungs in use in clinical trials.

They envision that patients would tap into the devices while allowing their own diseased lungs to heal, or have one implanted as a bridge while awaiting a lung transplant -- a wait that lasts, on average, more than a year.

Pacific Northwest trees struggle for water while standing in it

ScienceDaily (July 25, 2011) — Contrary to expectations, researchers have discovered that the conifers of the Pacific Northwest, some of the tallest trees in the world, face their greatest water stress during the region's eternally wet winters, not the dog days of August when weeks can pass without rain.

Due to freeze-thaw cycles in winter, water flow is disrupted when air bubbles form in the conductive xylem of the trees. Because of that, some of these tall conifers are seriously stressed for water when they are practically standing in a lake of it, scientists from Oregon State University and the U.S. Forest Service concluded in a recent study.

It's not "drought stress" in a traditional sense, the researchers said, but the end result is similar. Trees such as Douglas-fir actually do better dealing with water issues during summer when they simply close down their stomata, conserve water and reduce their photosynthesis and growth rate.

"Everyone thinks that summer is the most stressful season for these trees, but in terms of water, winter can be even more stressful," said Katherine McCulloh, a research assistant professor in the OSU Department of Forest Ecosystems and Society.

"We've seen trees in standing water, at a site that gets more than two meters of rain a year, yet the xylem in the small branches at the tops of these trees can't transport as much water as during the summer," McCulloh said.

The ease with which water moves through wood is measured as the "hydraulic conductivity," and researchers generally had believed this conductivity would be the lowest during a conventional drought in the middle of summer. They found that wasn't the case.
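
As a point of reference (a standard plant-hydraulics definition, not taken from this study), the hydraulic conductivity of a stem segment relates water flow to the pressure gradient driving it; embolized, air-blocked conduits reduce the measured value:

```latex
% Standard definition (textbook convention, not from the paper):
% K_h : hydraulic conductivity of the segment
% Q   : volumetric water flow rate through it
% \Delta P / L : pressure drop per unit length
K_h = \frac{Q}{\Delta P / L}
```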

"We thought if there was a serious decline in conductivity it would have been from drought," said Rick Meinzer, a researcher with the Pacific Northwest Research Station of the USDA Forest Service, as well as OSU. "It was known that air bubbles could form as increased tension is needed in the xylem to pull water higher and higher. But it turns out that freezing and thawing caused the most problems for water transport."

Studies such as this are important, the scientists said, to better understand how forests might respond to a warmer or drier climate of the future. And although this might imply that these conifers could be more resistant to drought than had been anticipated, the researchers said it's not that simple.

"If the climate warms, we might actually get more of these winter cycles of freezing and thawing," McCulloh said. "There's a lot of variability in the effects of climate we still don't understand.

"One of the most amazing things these trees can do is recover from these declines in conductivity by replacing the air bubbles with water," she said. "We don't understand how they do that at the significant tensions that exist at those heights. We're talking about negative pressures or tensions roughly three times the magnitude of what you put in your car tires."

When the field research on this study was done in 2009, the area actually experienced a historic heat wave during August when temperatures in the Willamette Valley hit 108 degrees. During such extreme heat, trees experienced some loss of hydraulic conductivity but largely recovered even before rains came in September. By contrast, greater loss of hydraulic conductivity was observed in the middle of winter.

The study was done at the Wind River Canopy Crane Research Facility, and published in the American Journal of Botany. The research was supported by the National Science Foundation.

"The commonly held view is that the summer months of the Pacific Northwest are extremely stressful to plants," the researchers wrote in their conclusion.

"Yet, our results indicated that the winter months are more stressful in terms of hydraulic function, and suggest that perhaps an inability to recover from increase in native embolism rates over the winter may cause greater branch dieback in old-growth trees than shifts in summer climate."

Sunday, September 4, 2011

Predictors of dying suddenly versus surviving heart attack identified

ScienceDaily (July 25, 2011) — Is it possible to predict whether someone is likely to survive or die suddenly from a heart attack? A new study by researchers at Wake Forest Baptist Medical Center has addressed just that question.

"For some people, the first heart attack is more likely to be their last," said Elsayed Z. Soliman, M.D., M.Sc., M.S., director of the Epidemiological Cardiology Research Center (EPICARE) at Wake Forest Baptist and lead author of the study. "For these people especially, it is important that we find ways to prevent that first heart attack from ever happening because their chances of living through it are not as good."

While many traits are common among heart attack patients -- both those who survive the event and those who die suddenly -- the researchers found that some traits, such as hypertension, race/ethnicity, body mass index (BMI), heart rate, and additional markers that can be identified by an electrocardiogram (ECG), can differentiate between dying suddenly and living through a heart attack, Soliman said.

The study, published by the journal Heart, is now available online.

Somewhere between 230,000 and 325,000 people in the U.S. succumb to sudden cardiac death every year, Soliman said. Most of these sudden deaths are caused by coronary heart disease.

"Since sudden cardiac death usually occurs before patients ever make it to the hospital, there is very little that can be done to save them," Soliman said. "Identifying specific predictors that separate the risk of sudden cardiac death from that of non-fatal or not immediately fatal heart attacks would be the first step to address this problem, which was the basis for our study."

Researchers analyzed data from two of the largest U.S. cardiovascular studies -- the ARIC (Atherosclerosis Risk in Communities) and the CHS (Cardiovascular Health Study) -- containing records for more than 18,000 participants. After taking into account common risk factors for coronary heart disease and the competing risk of sudden cardiac death with coronary heart disease, they found that:

- Black race/ethnicity (compared with non-black) predicted a higher risk of sudden cardiac death but a lower risk of coronary heart disease.
- Hypertension and increased heart rate were stronger predictors of sudden cardiac death than of coronary heart disease.
- Extremely high or low body mass index predicted an increased risk of sudden cardiac death but not of coronary heart disease.
- Several more technical ECG markers could help a doctor evaluate a patient's risk of sudden cardiac death: prolongation of QTc and an abnormally inverted T wave were stronger predictors of sudden cardiac death, whereas elevated ST height in lead V2 predicted coronary heart disease but not sudden cardiac death.

If the results are validated and confirmed in other studies, Soliman predicts that doctors will have a way to identify patients who are at greater risk of dying suddenly if they experience a heart attack and, therefore, a group of patients for whom early intervention, including risk factor modification, may be a life-saving option.

"Our next step in this path of research is to see if we can come up with a risk stratification score that can be applied to the general population, as well as to look at interventions that reverse the effect that these traits are having on susceptibility to sudden cardiac death," Soliman said. "We need to know if lowering hypertension, BMI or resting heart rate would reduce the risk of dying suddenly."

The study was funded by the Donald W. Reynolds Cardiovascular Clinical Research Center at the Johns Hopkins University School of Medicine. The ARIC and CHS studies are supported by the National Heart, Lung, and Blood Institute.

Grey Platelet Syndrome: Elusive gene that makes platelets grey identified

ScienceDaily (July 25, 2011) — Researchers have identified an elusive gene responsible for Grey Platelet Syndrome, an extremely rare blood disorder of which only about 50 cases have been reported. As a result, it is hoped that future cases will be easier to diagnose with a DNA test.

The findings come from a collaborative study by Professor Willem Ouwehand and Dr Cornelis Albers, both based at the Wellcome Trust Sanger Institute and the University of Cambridge, and Dr Paquita Nurden, of the Rare Platelet Disorders laboratory in Bordeaux.

Platelets are the second most abundant cell in the blood. Their main task is to survey the blood vessel wall for damage and to orchestrate its repair where required. On the flip side, platelets also play a "darker" role after vessel wall damage and cause blood clots that may lead to heart attacks or stroke.

Some people are born with platelets that do not function well and these rare conditions are thought to be inherited. Grey Platelet Syndrome poses a risk of bleeds, some of which can be severe and life threatening, e.g. if they occur in the brain. Grey Platelet Syndrome was first identified in the 1970s and is named for the greyish appearance of these platelets when viewed with a microscope.

Identifying the cause of increased bleeding in young patients has been a painstaking process. An important step in translating human genetics research into improved patient care is the development of simple, rapid DNA-based diagnostic tests. To achieve this, researchers first needed to discover the genes responsible for these rare platelet bleeding disorders.

In the past it was a major challenge to discover which genes caused rare disorders because DNA samples from numerous large families affected by the same disorder had to be identified and genetically analysed to pinpoint the region harbouring the causative gene.

To achieve their latest findings, the researchers used a simpler approach, deciphering about 40 million letters of genetic code covering the entire coding fraction of the genome in four unrelated French patients.

They identified NBEAL2, a member of a family of genes that all contain a unique domain called the BEACH domain, as the gene that does not function well in Grey Platelet Syndrome. The team showed that the protein encoded by this gene is altered at a different position in each of the four unrelated cases, and that the affected patients have inherited two non-functioning copies of the gene, one from each parent.

"It is really great to see how the use of modern genomics technologies is going to be of direct benefit for patient care. It is exciting that we have shown that the genetic basis of a rare bleeding disorder can be discovered with relative ease," said Professor Willem Ouwehand, who heads a NHS Blood and Transplant research team on platelet biology at both the Wellcome Trust Sanger Institute and the University of Cambridge. "This study is one such example and it gives us confidence to achieve the same for a large number of other rare inherited platelet bleeding disorders. It is now important that we use this discovery to improve patient care in the NHS and beyond."

The team's identification of the NBEAL2 gene was confirmed by functional studies in zebrafish. Fish have platelet-like cells called thrombocytes, and switching off the NBEAL2 gene caused a complete absence of these cells, with nearly half of the fish suffering spontaneous bleeds similar to those seen in patients with the disorder.

It is hoped that this gene identification will make it simpler to diagnose future cases of Grey Platelet Syndrome with a simple DNA test. This new test is now being developed with researchers at the NHS Blood and Transplant Centre at the Addenbrooke's Biomedical campus in Cambridge as part of the international ThromboGenomics initiative.

The scientists also observed that other members of the same family of BEACH proteins are implicated in other rare inherited disorders. Their findings showed that the LYST protein does not function well in Chediak-Higashi syndrome, another rare but severe disorder that paralyses the immune system and also causes a mild platelet bleeding disorder. A picture is thus emerging that BEACH proteins are essential to how granules in blood cells and brain cells are formed or retained; in platelets, they are needed for both alpha and dense granules.

"Our discovery that another member of the family of BEACH proteins is underlying a rare but severe granule disorder in platelets firmly nails down the important role of this class of proteins in granule biology," said Cornelis Albers, a British Heart Foundation research fellow at the Sanger Institute and the University of Cambridge. "The reasons why the platelets of patients with Grey Platelet Syndrome are grey is because they lack alpha granules. The alpha granules carry the cargo of proteins that induce vessel wall repair and also form the platelet plug.

"A better understanding of how these granules are formed and how their timely release by the platelet is coordinated at the molecular level may one day underpin the development of a new class of safer anti-platelet drugs for use in patients with heart attacks and stroke. It has been a fascinating journey to identify a new and important pathway by combining the rapid advances in sequencing technology with computational analysis."

The French collaboration leader, Dr Paquita Nurden, set up the Network for Rare Platelet Disorders at the Laboratoire d'Hématologie, Hopital Xavier Arnozan, near Bordeaux. Her team made the Herculean effort to find the French families affected by this rare disorder.

"We have worked for years to identify the families across France that suffer from rare platelet disorders and my group of scientists have used powerful microscopes to determine what was wrong with the platelets from patients with Grey Platelet Syndrome. Researchers across the world discovered in the 1980s that something was wrong with the alpha granules because they were lacking in most of the cases," said Dr Nurden, an international expert in platelet biology. "The gene, however, remained elusive for another 30 years, and it is great how our joint working has discovered the causative gene very quickly."

Harmful effects of hypothyroidism on maternal and fetal health drive new guidelines for managing thyroid disease in pregnancy

ScienceDaily (July 25, 2011) — Emerging data clarifying the effects of insufficient thyroid activity during pregnancy on the health of the mother and fetus, and on the future intellectual development of the child, have led to new clinical guidelines for diagnosing and managing thyroid disease during this critical period. The guidelines, developed by an American Thyroid Association (ATA) expert task force, are presented in Thyroid, a peer-reviewed journal published by Mary Ann Liebert, Inc.

Clinical studies are producing critical data demonstrating the harmful effects on pregnancy not only of overt hypothyroidism and hyperthyroidism, but also of subclinical thyroid disease, on maternal and fetal health. Ongoing research is clarifying the risk of miscarriage and preterm delivery in women with normal thyroid function who are thyroid peroxidase antibody positive. Studies are also uncovering the long-term effects of postpartum thyroiditis.

"Pregnancy has a profound impact on the thyroid gland and thyroid function…. In essence, pregnancy is a stress test for the thyroid, resulting in hypothyroidism in women with limited thyroidal reserve or iodine deficiency," state Alex Stagnaro-Green, George Washington University School of Medicine and Health Sciences (Washington, DC), and coauthors representing the ATA task force.

Among the many specific recommendations detailed in the guidelines are the following: women with overt hypothyroidism, or with subclinical hypothyroidism who are TPO antibody positive, should be treated with oral levothyroxine; use of other thyroid preparations, such as triiodothyronine or desiccated thyroid, to treat maternal hypothyroidism is strongly discouraged; and women with subclinical hypothyroidism in pregnancy who are not initially treated should be monitored for progression to overt hypothyroidism with serum TSH and free T4 measurements about every 4 weeks until 16-20 weeks gestation, and at least once between 26 and 32 weeks gestation.

The new clinical guidelines focus on several key areas in the diagnosis and management of thyroid disease during pregnancy and postpartum: thyroid function tests, hypothyroidism, thyrotoxicosis, iodine, thyroid antibodies and miscarriage/preterm delivery, thyroid nodules and cancer, postpartum thyroiditis, recommendations on screening for thyroid disease during pregnancy, and areas for future research.

"These important guidelines were developed by a panel of international experts representing the disciplines of endocrinology, obstetrics and gynecology, and nurse midwives. This broad representation of providers that care for pregnant women will significantly increase the impact of these guidelines and translation of findings from the most recent research to clinical practice," says Gregory A. Brent, MD, Professor of Medicine and Physiology, David Geffen School of Medicine at the University of California Los Angeles and President of the ATA.

"Thyroid disease in pregnancy is common, clinically important, and time-sensitive, and our knowledge about it is rapidly changing. This ATA Guideline will disseminate this new information both widely and rapidly to improve patient care, establish what we believe is optimal care for the pregnant woman and her unborn child, and drive future research to further improve our understanding and patient outcomes," says Richard T. Kloos, MD, Professor, The Ohio State University and Secretary/Chief Operating Officer of the ATA.

Saturday, September 3, 2011

Turtles next to lizards on family tree, discovery based on microRNAs shows

ScienceDaily (July 25, 2011) — Famous for their sluggishness, turtles have been slow to give up the secrets of their evolution and place on the evolutionary tree. For decades, paleontologists and molecular biologists have disagreed about whether turtles are more closely related to birds and crocodiles or to lizards. Now, two scientists from the Mount Desert Island Biological Laboratory in Bar Harbor, Maine, and their colleagues from Dartmouth College and Harvard and Yale Universities have developed a new technique using microRNAs for classifying animals, and the secret is out. Turtles are closer kin to lizards than to crocodiles.

To reach their conclusion, published in Biology Letters, the research team looked at a newly discovered class of molecules called microRNA. Most of the genetic material or DNA that scientists study provides the code for building proteins, large molecules that form an essential part of every organism. But microRNAs are much smaller molecules that can switch genes on and off and regulate protein production. They are also remarkably similar within related animal groups and provide important clues for identification.

"Different microRNAs develop fairly rapidly in different animal species over time, but once developed, they then remain virtually unchanged," said Kevin Peterson, a paleobiologist at MDIBL and Dartmouth College. "They provide a kind of molecular map that allows us to trace a species' evolution."

Peterson worked with Ben King, a bioinformatician at MDIBL. "My role in the study was to enhance our software so we could find new and unique microRNAs in the lizard genome," King said. "We identified 77 new microRNA families, and four of these turned out to also be expressed in the painted turtle. So we had the evidence we needed to say that turtles are a sister group to lizards and not crocodiles."
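
The inference rests on simple set logic: newly evolved microRNA families shared between two genomes point to a common ancestor that already carried them. Below is a toy sketch of that reasoning, with invented family names standing in for the real data:

```python
# Toy version of the shared-microRNA reasoning (family names are invented).
lizard_new_families = {f"mir-L{i}" for i in range(1, 78)}  # 77 new families

# Hypothetical expression data: which of those families turn up elsewhere?
turtle_families = {"mir-L3", "mir-L12", "mir-L40", "mir-L55"}
crocodile_families = set()  # none of the new families detected

# Shared derived families imply shared ancestry, grouping turtles with lizards.
print(len(lizard_new_families & turtle_families))     # 4
print(len(lizard_new_families & crocodile_families))  # 0
```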

Though few creatures have been as puzzling as the turtle, the research team plans to use its microRNA analysis on other animals to help determine their origins and relationships as well. It is also developing a web-based platform to share the software with other researchers around the world.

In addition to King and Peterson, the research team included Tyler Lyson and Jacques Gauthier from Yale University, Eric Sperling from Harvard University, and Alysha Heimberg from Dartmouth College.

Underwater video camera opens window into the behavior of jellyfish

ScienceDaily (July 25, 2011) — MBL Whitman Center researchers are testing a new underwater video camera system that will allow scientists to study the propulsion and behavior of jellies in their natural habitat.

In the lab, you can get a sense for how a jelly swims and captures its prey, explains Sean Colin, a marine ecologist at Roger Williams University, who helped to develop the system, called a self-contained underwater velocimetry apparatus, or SCUVA for short. But what's missing is the influence that ocean currents, nearly impossible to accurately mimic in an artificial setting, have on these processes. "That's what we need this system to answer," says Colin. "How important is this natural background flow to determining who they feed on and how much they're able to ingest?"

This summer, Colin, along with colleague John Costello, a biology professor at Providence College, will use the SCUVA to observe the warty comb jelly, a native species in plentiful supply in the waters off the coast of Woods Hole, Massachusetts. "Primarily what we're interested in is understanding how jellyfish and ctenophores (like these comb jellies) interact with their surrounding fluid, because that influences how they swim, who they eat and, ultimately, their impact on the ecosystem," says Colin. "The SCUVA is allowing us to do that."

Specialized regulatory T cell stifles antibody production centers: Discovery has potential implications for cancer, autoimmune disease

ScienceDaily (July 25, 2011) — A regulatory T cell that expresses three specific genes shuts down the mass production of antibodies launched by the immune system to attack invaders, a team led by scientists at The University of Texas MD Anderson Cancer Center reported online in the journal Nature Medicine.

"Regulatory T cells prevent unwanted or exaggerated immune system responses, but the mechanism by which they accomplish this has been unclear," said paper senior author Chen Dong, Ph.D., professor in MD Anderson's Department of Immunology and director of the Center for Inflammation and Cancer.

"We've identified a molecular pathway that creates a specialized regulatory T cell, which suppresses the reaction of structures called germinal centers. This is where immune system T cells and B cells interact to swiftly produce large quantities of antibodies," Dong said.

The discovery of the germinal center off-switch, which comes two years after Dong and colleagues identified the mechanisms underlying a helper T cell that activates the centers, has potential implications for cancer and autoimmune diseases.

"In some types of cancer, the presence of many regulatory T cells is associated with poor prognosis," Dong said. "The theory is those cells suppress an immune system response in the tumor's microenvironment that otherwise might have attacked the cancer."

However, in B cell lymphomas, overproliferation and mutation of B cells are the problems, Dong said. Hitting the regulatory T cell off-switch might help against lymphomas and autoimmune diseases, while blocking it could permit an immune response against other cancers.

Antibody production central

Germinal centers are found in the lymph nodes and the spleen. They serve as gathering points for B and T cell lymphocytes, infection-fighting white blood cells.

When the adaptive immune system detects an invading bacterium or virus, B cells present a piece of the invader, an antigen, to T cells. The antigen converts a naïve T cell to a helper T cell that secretes cytokines, which help the B cells expand and differentiate into cells that produce specialized antibodies to destroy the intruder.

"Germinal centers have mostly B cells with a few helper T cells to regulate them. The B cells mutate to make high-affinity antibodies and memory B cells for long-term immunity. The cell population in the germinal center structures replicates in an average of several hours, one of the fastest rates of cell replication known in mammals," Dong said.

Tracking down specialized T cell

In the Nature Medicine paper, Dong and colleagues found that a subgroup of regulatory T cells that expresses two genes, Bcl-6 and CXCR5, moves into germinal centers in both mice and humans, where they have access to B cells.

(Bcl-6 produces a protein called a transcription factor, which moves into the cell nucleus to regulate other genes. CXCR5 is a receptor protein for a signaling molecule called CXCL13.)

They also found that the Bcl-6/CXCR5 T cells aren't produced in the thymus like other T cells, but are generated from regulatory T cell precursors that express Foxp3, another transcription factor.

Knocking out the regulatory T cells that express all three proteins in mice resulted in increased germinal center production of antibodies. They named this key T cell the T follicular regulatory cell, or Tfr.

In a 2009 paper in the journal Science, the researchers found that naïve T cells that expressed Bcl-6 and CXCR5 also gathered in the B cell zone of germinal centers. Expression of Bcl-6 converted the T cell into a T follicular helper (Tfh) cell that launches antibody production in the germinal centers.

With Tfr turning germinal centers off and Tfh turning them on, we could potentially regulate antibody production, Dong noted. Increasing Tfr production could be a new approach to treating autoimmune inflammatory disorders, such as lupus and rheumatoid arthritis.

The team's research was funded by grants from the National Institutes of Health, the Leukemia and Lymphoma Society, MD Anderson, the American Heart Association, Doris Duke Charitable Foundation Clinical Scientist Development Award and the China Ministry of Science and Technology Protein Science Key Research Project.

Co-authors with Dong are first author Yeonseok Chung, Ph.D., Shinya Tanaka, Ph.D., Roza Nurieva, Ph.D., Gustavo Martinez, Yi-Hong Wang and Joseph Reynolds, Ph.D., of MD Anderson's Department of Immunology and the Center for Cancer Immunology; Chung also is with The University of Texas Health Science Center at Houston Institute of Molecular Medicine; Seema Rawal and Sattva Neelapu, M.D., of MD Anderson's Department of Lymphoma and Myeloma, also of the Center for Cancer Immunology; and Ziao-hui Zhou, M.D., Hui-min Fan, M.D., and Zhong-ming Liu, M.D., of Shanghai Dong Fang Hospital, Shanghai, China.

Friday, September 2, 2011

New software protects water utilities from terrorist attacks and contaminants

ScienceDaily (July 25, 2011) — Americans are used to drinking from the kitchen tap without fear of harm, even though water utilities might be vulnerable to terrorist attacks or natural contaminants.

Now, thanks to CANARY Event Detection Software -- an open-source software developed by Sandia National Laboratories in partnership with the Environmental Protection Agency (EPA) -- public water systems can be protected through enhanced detection of such threats.

"People are excited about it because it's free and because we've shown that it works really well. We would love to have more utilities using it," said Regan Murray, acting associate division director of the EPA's Water Infrastructure Protection Division at the National Homeland Security Research Center.

The software tells utility operators within minutes whether something is wrong with their water, giving them time to warn and protect the public. And it's improving water quality by giving utility managers more comprehensive real-time data about changes in their water.

CANARY is being used in Cincinnati and Singapore, and Philadelphia is testing the software system. A number of other U.S. utilities also are evaluating CANARY for future use.

Sean McKenna, the Sandia researcher who led the team that developed CANARY, said people began to pay attention to the security of the nation's water systems after 9/11.

McKenna and Murray said CANARY could have lessened the impact of the nation's largest public water contamination event. In 1993, a cryptosporidiosis outbreak in Milwaukee hastened the deaths of dozens of residents, made more than 400,000 people ill and cost more than $96 million in medical expenses and lost productivity, according to reports about the tragedy.

"If you don't have a detection system, the way you find out about these things is when people get sick," Murray said.

Sandia, a national security laboratory, had worked on water security before the 9/11 attacks. So when the EPA was looking for help early in the last decade to better monitor water utilities, they contacted Sandia.

A Sandia-developed, risk-assessment methodology for water focused on physical security of the utility infrastructure, but did not address detection and assessment of the impact of contamination within the water itself. CANARY was designed to meet that need for better assessment, McKenna said.

CANARY, which runs on a desktop computer, can be customized for individual water utilities, working with existing sensors and software, McKenna said.

While some utilities monitor their water using real-time sensors, many still send operators out once a week to take samples, said David Hart, the lead Sandia software developer for CANARY.

Compared to weekly samples, CANARY works at lightning speed.

"From the start of an event -- when a contaminant reaches the first sensor -- to an event alarm would be 20-40 minutes, depending on how the utility has CANARY configured," McKenna said.

The challenge for any contamination detection system is reducing the number of false alarms and making data meaningful amidst a "noisy" background of information caused by the environment and the utility infrastructure itself.

CANARY researchers used specially designed numerical algorithms to analyze data coming from multiple sensors and differentiate between natural variability and unusual patterns that indicate a problem. For example, the Multivariate-Nearest Neighbor algorithm groups data into clusters based on time and distance, explained Kate Klise, a numerical analyst at Sandia. When new data is received, CANARY decides whether it's close enough to a known cluster to be considered normal or whether it's far enough away to be deemed anomalous. In the latter case, CANARY alerts the utility operator, Klise said.
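
Here is a minimal sketch of that nearest-neighbor idea, with hypothetical sensor values and a hypothetical threshold; CANARY's actual open-source implementation is more sophisticated:

```python
import math

# Minimal nearest-neighbor anomaly check (hypothetical values and threshold).
# A new multi-sensor reading is "normal" if it lies close to any cluster of
# past readings; otherwise it is flagged as anomalous.

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_anomalous(reading, history, threshold=1.0):
    return all(distance(reading, past) > threshold for past in history)

# Each reading: (pH, free chlorine, scaled conductivity) -- illustrative only.
history = [(7.2, 0.50, 1.20), (7.1, 0.60, 1.18), (7.3, 0.50, 1.21)]
print(is_anomalous((7.2, 0.55, 1.19), history))  # False: near known data
print(is_anomalous((6.1, 0.05, 3.00), history))  # True: far from all past data
```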

The computer program uses a moving 1.5- to two-day window of past data to detect abnormal events by comparing predicted water characteristics with current observations. But a single outlier won't trigger the alarm, which helps to avoid costly and inefficient false alarms. CANARY aggregates information over multiple 2- to 5-minute time steps to build evidence that water quality has undergone a significant change, McKenna said.
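
A sketch of that evidence-accumulation step, under the same illustrative assumptions: a single outlier passes silently, and only a persistent run of outliers within the moving window trips the alarm. The window length and vote count below are made up, not CANARY's defaults:

```python
from collections import deque

# Illustrative evidence accumulation: alarm only when enough recent time steps
# are outliers, so a single spike never fires a costly false alarm.
WINDOW = 10        # e.g. ten 2- to 5-minute time steps (illustrative)
MIN_OUTLIERS = 6   # votes needed inside the window to raise an alarm

recent = deque(maxlen=WINDOW)

def update(is_outlier):
    """Record one time step; return True if the alarm should fire."""
    recent.append(is_outlier)
    return sum(recent) >= MIN_OUTLIERS

# The lone spike at step 0 never fires; the sustained run eventually does.
steps = [True, False, False, True, True, True, True, True, True, False]
print([update(s) for s in steps])
```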

"We've taken techniques from different fields and put those together in a way they haven't been put together before; certainly the application of those techniques to water quality monitoring hasn't been done before," McKenna said.

CANARY also provides information about gradual changes in the water, McKenna said.

One unintended benefit of the software is that when utility operators better understood the data being sent by their sensors, they could make changes to the management of the water systems to improve its overall quality, McKenna said.

"What we found from utilities we work with is that a better managed system is more secure, and a more secure system is better managed," McKenna said.

Harry Seah, director of the Technology and Water Quality Office at the Public Utilities Board (PUB), Singapore's national water authority, wrote in a letter supporting CANARY that the software provided a "quantum leap" in the utility's practice.

In the past, Seah wrote, the utility depended on preset limits of three water characteristics to determine water quality.

"With the implementation of CANARY, relative changes in the patterns of these three parameters can be used to uncover water quality events, even if each individual parameter lies within the alarm limits," Seah wrote. "This dramatically improves PUB's ability to respond to water quality changes, and allows PUB to arrest poor quality water before

Making biological images sharper, deeper and faster

ScienceDaily (July 25, 2011) — For modern biologists, the ability to capture high-quality, three-dimensional (3D) images of living tissues or organisms over time is necessary to answer questions in areas ranging from genomics to neurobiology and developmental biology. The better the image, the more detailed the information that can be drawn from it. Looking to improve upon current methods of imaging, researchers from the California Institute of Technology (Caltech) have developed a novel approach that could redefine optical imaging of live biological samples by simultaneously achieving high resolution, high penetration depth (for seeing deep inside 3D samples), and high imaging speed.

The imaging technique is explained in a paper in the advance online publication of the journal Nature Methods, released on July 14. It will also appear in an upcoming print version of the journal.

"Before our work, the state-of-the-art imaging techniques typically excelled in only one of three key parameters: resolution, depth, or speed. With our technique, it's possible to do well in all three and, critically, without killing, damaging, or adversely affecting the live biological samples," says biologist Scott Fraser, director of the Biological Imaging Center at Caltech's Beckman Institute and senior author of the study.

The research team achieved this imaging hat trick by first employing an unconventional imaging method called light-sheet microscopy, in which a thin, flat sheet of light illuminates a biological sample from the side, creating a single illuminated optical section through the sample. The light given off by the sample is then captured with a camera oriented perpendicular to the light sheet, harvesting data from the entire illuminated plane at once. This allows millions of image pixels to be captured simultaneously, reducing the light intensity that needs to be used for each pixel. This not only enables fast imaging speed but also decreases light-induced damage to the living samples, which the team demonstrated using fruit fly and zebrafish embryos.
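
A back-of-the-envelope comparison shows where the speed advantage comes from; the numbers below are illustrative, not taken from the paper:

```python
# Illustrative timing comparison (numbers are ours, not the paper's).
pixels_per_plane = 1_000_000   # one optical section
dwell_time_s = 2e-6            # per-pixel dwell for a point-scanning system
frame_time_s = 0.02            # one camera frame captures the whole plane

point_scan_s = pixels_per_plane * dwell_time_s  # 2.0 s per plane
light_sheet_s = frame_time_s                    # 0.02 s per plane
print(point_scan_s / light_sheet_s)  # -> 100.0; and because all pixels share
                                     # one exposure, the per-pixel light dose
                                     # can drop accordingly
```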

To achieve sharper image resolution with light-sheet microscopy deep inside the biological samples, the team used a process called two-photon excitation for the illumination. This process has been used previously to allow deeper imaging of biological samples; however, it usually is used to collect the image one pixel at a time by focusing the exciting light to a single small spot.

"The conceptual leap for us was to realize that two-photon excitation could also be carried out in sheet-illumination mode," says Thai Truong, a postdoctoral fellow in Fraser's laboratory and first author of the paper. This novel side-illumination with a two-photon illumination is the topic of a pending patent.

"With this approach, we believe that we can make a contribution to advancing biological imaging in a meaningful way," continues Truong, who did his Ph.D. training in physics. "We did not want to develop a fanciful optical imaging technique that excels only in one niche area, or that places constraints on the sample so severe that the applications will be limited. With a balanced high performance in resolution, depth, and speed, all achieved without photo-damage, two-photon light-sheet microscopy should be applicable to a wide variety of in vivo imaging applications." He credits this emphasis on wide applicability to the interdisciplinary nature of the team, which includes two biologists, two physicists, and one electrical engineer.

"We believe the performance of this imaging technique will open up many applications in life sciences and biomedical research -- wherever it is useful to observe, non-invasively, dynamic biological process in 3D and with cellular or subcellular resolution," says Willy Supatto, co-author of the paper and a former postdoctoral fellow in Fraser's laboratory (now at the Centre National de la Recherche Scientifique, in France).

One example of such an application would be to construct 3D movies of the entire embryonic development of an organism, covering the entire embryo in space and time. These movies could capture what individual cells are doing, as well as important genes' spatiotemporal expression patterns -- elucidating the activation of those genes within specific tissues at specific times during development.

"The goal is to create 'digital embryos,' providing insights into how embryos are built, which is critical not only for basic understanding of how biology works but also for future medical applications such as robotic surgery, tissue engineering, or stem-cell therapy," says Fraser. The team's first attempt at this can be seen in the accompanying movie, in which the cell divisions and movements that built the entire fruit fly embryo were captured without perturbing its development: http://www.youtube.com/watch?v

Global bioterrorism threat analyzed for world animal health office

ScienceDaily (July 25, 2011) — Around the globe, many nations are realizing that the potential for bioterrorism isn't just about the U.S., officials say.

And because an intentional introduction of bacteria, a virus or a toxin could happen anywhere, the World Organization for Animal Health is issuing a paper aimed at prevention.

"Any emerging country that is beginning to think about maintaining international trade needs to be aware of the potential for bioterrorism," said Dr. Neville Clarke, special assistant to the Texas A&M University System's vice chancellor of agriculture.

Clarke is lead author of "Bioterrorism: intentional introduction of animal disease," which appears in the animal health organization's journal Scientific and Technical Review this month.

Preventing bioterrorism worldwide

First off, bioterrorism is not new.

The intentional introduction of animal disease dates to the Middle Ages when "diseased carcasses and bodies were catapulted over enemy walls in attempts to induce sickness in humans or animals," Clarke wrote with co-author Jennifer L. Rinderknecht, Texas AgriLife Research assistant.

Similar practices continued through the centuries until 1975, when more than 160 countries at the Biological and Toxic Weapons Convention agreed to prohibit biological warfare programs, the article noted.

But, the authors say, evidence around the world indicates that the "development of biological agents continues in some countries."

Clarke said that the nations farthest from being prepared are developing countries, such as those in Sub-Saharan Africa and Indonesia. He said the article would be helpful for nations that want to protect their markets as they grow globally.

The article discusses potential perpetrators and their methods, priority diseases, modern biology, trade and regulatory restraints as listed by the World Organization for Animal Health, which is headquartered in Paris and known as OIE for Office International des Epizooties.

Clarke pointed to the live animal and fresh meat restrictions on imports from Brazil that are in place because there are still pockets of Foot-and-Mouth Disease in that South American country.

"That impairs their ability to export to the U.S.," he said. "Trade restriction is one of the most important underlying issues that face countries. That makes bioterrorism everyone's business."

While the article deals specifically with intentional introductions, Clarke said the "clean up and control is the same" for either type of event.

"The only difference is in attribution," he said. "If an act is intentional, then the focus goes to finding out who did it."

Thursday, September 1, 2011

Antibiotic appears more effective than cranberry capsules for preventing urinary tract infections

ScienceDaily (July 25, 2011) — In premenopausal women who have repeated urinary tract infections (UTIs), the antibiotic trimethoprim-sulfamethoxazole (TMP-SMX) appeared more effective than cranberry capsules for preventing recurrent infections, though at the risk of contributing to antibiotic resistance, according to a report in the July 25 issue of Archives of Internal Medicine, one of the JAMA/Archives journals.

Urinary tract infections are common in women, affecting nearly half at some point in their lives, according to background information in the article. The authors note that up to 30 percent of women develop recurrent UTIs (rUTIs), a condition for which a low-dose antibiotic is frequently used as a preventive measure. "However, this may lead to drug resistance not only of the causative microorganisms but also of the indigenous flora," write the authors. Studies of cranberries and cranberry products have shown some effectiveness in preventing rUTIs, but these trials have not compared those interventions directly with TMP-SMX, the standard antibiotic used in these cases.

Mariëlle A.J. Beerepoot, M.D., from the Academic Medical Center, Amsterdam, and colleagues conducted a double-blind noninferiority trial of cranberry capsules and TMP-SMX. The 221 participants were premenopausal adult women who had reported at least three symptomatic UTIs in the previous year. They were randomized to take either TMP-SMX (480 mg at night, plus one placebo capsule twice daily) or cranberry capsules (500 mg twice daily, plus one placebo tablet at night) for 12 months. Researchers assessed participants' clinical status once a month (and for three months after stopping the study medication) via urine and feces samples and a questionnaire; participants also submitted urine samples when they experienced UTI-like symptoms.

At 12 months, the average number of clinical recurrences was 1.8 in the TMP-SMX group and 4.0 in the cranberry capsules group. Recurrence occurred, on average, after eight months in the drug group and after four months in the cranberry capsules group. Antibiotic resistance rates tripled in the pathogens found in patients in the TMP-SMX group, although three months after the drug was discontinued, resistance rates returned to the levels they had been at when the study began.

The antibiotic used in this study appeared to be more effective at preventing rUTIs than cranberry capsules, but the researchers noted that achieving this result also seemed to increase the rate of antibiotic resistance. "From clinical practice and during the recruitment phase of this study, we learned that many women are afraid of contracting drug-resistant bacteria using long-term antibiotic prophylaxis and preferred either no or nonantibiotic prophylaxis," they report. "In those women, cranberry prophylaxis may be a useful alternative despite its lower effectiveness."

This study was supported by a grant from the Netherlands Organization for Health Research and Development. The authors received the cranberry and placebo capsules from Springfield Nutraceuticals BV, Oud Beijerland, the Netherlands.

Commentary: Cranberries as Antibiotics?

An accompanying commentary by Bill J. Gurley, Ph.D., from the University of Arkansas for Medical Sciences, Little Rock, evaluates the results obtained by Beerepoot and colleagues in the context of nonpharmacologic remedies. Botanical dietary supplements are not intended to treat, cure or prevent disease, he writes, but "most U.S. consumers, however, have expectations of health benefits from the dietary supplements they consume." Nevertheless, supplements such as cranberry capsules may not demonstrate optimal efficacy, owing to the poor water solubility of their active compounds and the way those compounds are metabolized.

Dr. Gurley notes that the report by Beerepoot and colleagues has two important features. Given that one month into the study, antibiotic resistance for Escherichia coli was higher than 85 percent in the TMP-SMX group but less than 30 percent in the cranberry capsule group, "such a marked reduction in antibiotic resistance certainly favors the therapeutic potential of cranberry as a natural UTI preventative." Further, Gurley points out that TMP-SMX showed superior efficacy to cranberry capsules, but that the low bioavailability of the bacteria-fighting compounds at the cranberry dose used may have affected the study's results. "Because optimal doses have not been established for many botanicals, clinical efficacy trials have often yielded negative or inconclusive results," Gurley points out. He mentions an ongoing dose-ranging study for cranberry that may provide more information on this supplement's effectiveness.
