Wednesday, August 31, 2011

What is war good for? Sparking civilization, suggest archaeology findings from Peru

ScienceDaily (July 25, 2011) — Warfare, triggered by political conflict between the fifth century B.C. and the first century A.D., likely shaped the development of the first settlement that would classify as a civilization in the Titicaca basin of southern Peru, a new UCLA study suggests.

Charles Stanish, director of UCLA's Cotsen Institute of Archaeology, and Abigail Levine, a UCLA graduate student in anthropology, used archaeological evidence from the basin, home to a number of thriving and complex early societies during the first millennium B.C., to trace the evolution of two larger, dominant states in the region: Taraco, along the Ramis River, and Pukara, in the grassland pampas.

"This study is part of a larger, worldwide comparative research effort to define the factors that gave rise to the first societies that developed public buildings, widespread religions and regional political systems -- or basically characteristics associated with ancient states or what is colloquially known as 'civilization,'" said Stanish, who is also a professor of anthropology at UCLA. "War, regional trade and specialized labor are the three factors that keep coming up as predecessors to civilization."

The findings appear online in the latest edition of Proceedings of the National Academy of Sciences.

Conducted between 2004 and 2006, the authors' excavations in Taraco unearthed signs of a massive fire that raged sometime during the first century A.D., reducing much of the state to ash and architectural rubble. The authors compared artifacts dating from before and after the fire and concluded that agriculture, pottery and the obsidian industry, all of which had flourished in the state, greatly declined after the fire.

Based on the range and extent of the destruction and the lack of evidence supporting reconstruction efforts, the authors suggest that the fire was a result of war, not of an accident or a ritual.

Iconographic evidence of conflict in regional stone-work, textiles and pottery suggests that the destruction of Taraco had been preceded by several centuries of raids. This includes depictions of trophy heads and people dressed in feline pelts cutting off heads, among other evidence.

Because the downfall of Taraco, which was home to roughly 5,000 people, coincided with the rise of neighboring Pukara as a dominant political force in the region, the authors suggest that warfare between the states may have led to the raids, shaping the early political landscape of the northern Titicaca basin.

Inhabited between 500 B.C. and 200 A.D., Pukara was the first regional population center in the Andes highlands. During its peak, it covered more than 2 square kilometers and housed approximately 10,000 residents, including bureaucrats, priests, artisans, farmers, herders and possibly warriors.

The civilization's ruins include impressive monolithic sculptures with a variety of geometric, zoomorphic and anthropomorphic images, plus intricate, multi-colored pottery in a variety of ritual and domestic forms.

War appears to have played a similar civilizing role in Mesoamerica, as well as Mesopotamia, Stanish said. To further test his theories on the origins of civilization, Stanish will begin a new project next year at a Neolithic site in Armenia.

Minority rules: Scientists discover tipping point for the spread of ideas

ScienceDaily (July 25, 2011) — Scientists at Rensselaer Polytechnic Institute have found that when just 10 percent of the population holds an unshakable belief, their belief will always be adopted by the majority of the society. The scientists, who are members of the Social Cognitive Networks Academic Research Center (SCNARC) at Rensselaer, used computational and analytical methods to discover the tipping point where a minority belief becomes the majority opinion. The finding has implications for the study and influence of societal interactions ranging from the spread of innovations to the movement of political ideals.

"When the number of committed opinion holders is below 10 percent, there is no visible progress in the spread of ideas. It would literally take the amount of time comparable to the age of the universe for this size group to reach the majority," said SCNARC Director Boleslaw Szymanski, the Claire and Roland Schmitt Distinguished Professor at Rensselaer. "Once that number grows above 10 percent, the idea spreads like flame."

As an example, the ongoing events in Tunisia and Egypt appear to exhibit a similar process, according to Szymanski. "In those countries, dictators who were in power for decades were suddenly overthrown in just a few weeks."

The findings were published in the July 22, 2011, early online edition of the journal Physical Review E in an article titled "Social consensus through the influence of committed minorities."

An important aspect of the finding is that the percent of committed opinion holders required to shift majority opinion does not change significantly regardless of the type of network in which the opinion holders are working. In other words, the percentage of committed opinion holders required to influence a society remains at approximately 10 percent, regardless of how or where that opinion starts and spreads in the society.

To reach their conclusion, the scientists developed computer models of various types of social networks. One of the networks had each person connect to every other person in the network. The second model included certain individuals who were connected to a large number of people, making them opinion hubs or leaders. The final model gave every person in the model roughly the same number of connections. The initial state of each of the models was a sea of traditional-view holders. Each of these individuals held a view but was also, importantly, open-minded to other views.

Once the networks were built, the scientists then "sprinkled" in some true believers throughout each of the networks. These people were completely set in their views and unwavering in those beliefs. As the true believers began to converse with those who held the traditional belief system, the tide gradually -- and then very abruptly -- began to shift.

"In general, people do not like to have an unpopular opinion and are always seeking to try locally to come to consensus. We set up this dynamic in each of our models," said SCNARC Research Associate and corresponding paper author Sameet Sreenivasan. To accomplish this, each of the individuals in the models "talked" to each other about their opinion. If the listener held the same opinions as the speaker, it reinforced the listener's belief. If the opinion was different, the listener considered it and moved on to talk to another person. If that person also held this new belief, the listener then adopted that belief.

"As agents of change start to convince more and more people, the situation begins to change," Sreenivasan said. "People begin to question their own views at first and then completely adopt the new view to spread it even further. If the true believers just influenced their neighbors, that wouldn't change anything within the larger system, as we saw with percentages less than 10."

The research has broad implications for understanding how opinion spreads. "There are clearly situations in which it helps to know how to efficiently spread some opinion or how to suppress a developing opinion," said Associate Professor of Physics and co-author of the paper Gyorgy Korniss. "Some examples might be the need to quickly convince a town to move before a hurricane or spread new information on the prevention of disease in a rural village."

The researchers are now looking for partners within the social sciences and other fields to compare their computational models to historical examples. They also plan to study how the threshold might change in a polarized society: instead of holding a single traditional view, the society would hold two opposing viewpoints, such as Democrat versus Republican.

The research was funded by the Army Research Laboratory (ARL) through SCNARC, part of the Network Science Collaborative Technology Alliance (NS-CTA), the Army Research Office (ARO), and the Office of Naval Research (ONR).

The research is part of a much larger body of work taking place under SCNARC at Rensselaer. The center joins researchers from a broad spectrum of fields -- including sociology, physics, computer science, and engineering -- in exploring social cognitive networks. The center studies the fundamentals of network structures and how those structures are altered by technology. The goal of the center is to develop a deeper understanding of networks and a firm scientific basis for the newly arising field of network science. More information on the launch of SCNARC can be found at http://news.rpi.edu/update.do?artcenterkey

Butterfly study sheds light on convergent evolution: Single gene controls mimicry across different species

ScienceDaily (July 22, 2011) — For 150 years scientists have been trying to explain convergent evolution. One of the best-known examples of this is how poisonous butterflies from different species evolve to mimic each other's color patterns -- in effect joining forces to warn predators, "Don't eat us," while spreading the cost of this lesson.

Now an international team of researchers led by Robert Reed, UC Irvine assistant professor of ecology & evolutionary biology, has solved part of the mystery by identifying a single gene called optix responsible for red wing color patterns in a wide variety of passion vine butterfly species. The result of 10 years of work, the finding is detailed in a paper that appears online July 21 in the journal Science.

"This is our first peek into how mimicry and convergent evolution happen at a genetic level," Reed said. "We discovered that the same gene controls the evolution of red color patterns across remotely related butterflies.

"This is in line with emerging evidence from various animal species that evolution generally is governed by a relatively small number of genes. Out of the tens of thousands in a typical genome, it seems that only a handful tend to drive major evolutionary change over and over again."

The scientists spent several years crossbreeding and raising the delicate butterflies in large netted enclosures in the tropics so they could map the genes controlling color pattern. UCI postdoctoral researcher Riccardo Papa (now an assistant professor at the University of Puerto Rico, Rio Piedras) then perfected a way to analyze the genome map by looking at gene expression in microdissected butterfly wings.

Finding a strong correlation between red color patterns and gene expression in one small region of the genome was the breakthrough that led to discovery of the gene. Population genetics studies in hybrid zones, where different color types of the same species naturally interbreed, confirmed it.

"Biologists have been asking themselves, 'Are there really so few genes that govern evolution?'" Reed said. "This is a beautiful example of how a single gene can control the evolution of complex patterns in nature. Now we want to understand why: What is it about this one gene in particular that makes it so good at driving rapid evolution?"

Papa was co-author on the study. Arnaud Martin, a UCI graduate student in ecology & evolutionary biology, also contributed.

Tuesday, August 30, 2011

Ohio Supercomputer Center lifts land speed racer toward 400-mph goal

Reaching 400 miles per hour requires innovative components, corporate partnerships, hours of diligent preparation and a powerful supercomputer.

A team of engineering students at The Ohio State University's (OSU) Center for Automotive Research (CAR) recently began running aerodynamics simulations at the Ohio Supercomputer Center (OSC), one of the first steps in the long and careful process of designing, building and racing the fourth iteration of their record-breaking, alternative-fuel streamliner.

"The third generation electric land speed record vehicle to be designed and built by OSU students, the Buckeye Bullet 3, will be an entirely new car designed and built from the ground up," noted Giorgio Rizzoni, Ph.D., professor of mechanical and aerospace engineering and director of CAR. "Driven by two custom-made electric motors designed and developed by Venturi, and powered by prismatic A123 batteries, the goal of the new vehicle will be to surpass all previous electric vehicle records."

In 2004, the team achieved distinction on the speedway at Bonneville Salt Flats in Wendover, Utah, by setting the U.S. electric land speed record at just over 314 mph with the original Buckeye Bullet, a nickel-metal hydride battery-powered vehicle.

Several years later, the team returned with the Buckeye Bullet 2, a completely new vehicle powered by hydrogen fuel cells, and set the international land speed record for that class at nearly 303 mph. The team then replaced the power source, once again, using the same frame and body with a new generation of lithium-ion batteries and set an international electric vehicle record in partnership with Venturi Automobiles and A123 Systems at just over 307 mph.

"OSC has been a partner of the Buckeye Bullet team throughout the project's history," said Ashok Krishnamurthy, OSC interim co-executive director. "We've been extremely pleased to provide computational resources and technical assistance to the student engineers as they learn valuable computational science skills that they can easily transfer to careers in the automotive industry and elsewhere."

This spring, the Buckeye Bullet team, again in partnership with Venturi and A123 Systems, began the development process for a completely re-engineered vehicle designed to break the 400-mph mark. In consideration of that blistering speed, one of the first critical aspects the team had to consider was the aerodynamic design of the vehicle.

"This goal places the team in direct competition with many of the fastest internal combustion cars in the world," said Cary Bork, chief engineer for the team and an OSU graduate student in mechanical engineering. "What sets the new design apart from the previous Buckeye Bullet vehicles is that at these higher speeds it is possible to produce shock waves under the vehicle. Such shock waves under the vehicle negatively affect the vehicle drag and can produce lift. Lift is undesirable in this application. Minimizing or eliminating these shock waves is critical to ensuring the safety and stability of the vehicle.."

For both versions of the Buckeye Bullet 2, student engineers ran aerodynamics simulations on OSC computer systems to complement studies of physical models tested in wind tunnels. However, the current team quickly found that wind tunnels with a "rolling-road" component required to test land-bound vehicles at the target speeds don't exist. Rizzoni and Bork, therefore, leveraged the flagship IBM 1350 Opteron Cluster at OSC to run extensive simulations, initially focusing on validating performance of Buckeye Bullet 2 and eventually giving shape to the lean, new streamliner.

"We're using computational fluid dynamics (CFD) to design and optimize the vehicle shape," said Cary Bork, a graduate student and chief engineer for the project. "The simulations are needed to accurately predict the aerodynamic forces on the vehicle at these speeds and can only be run on large computing clusters. Various mesh sizes have been used from 1 million to 50 million cells. Most of the simulations use 25 million cells."

In addition to an overall optimization of the body and fin shape, the new aerodynamic design features several additional areas of improvement over its predecessors. The vehicle will incorporate a layout where the driver is placed forward of the front tires to improve volume utilization, reduce overall vehicle length, decrease the vehicle drag by five percent and improve the vehicle balance. Also, the team is studying the addition of wind deflectors beneath the vehicle and in front of the tires to decrease the amount of air that enters the wheel well, thereby reducing drag as much as 14.9 percent. Finally, the team is studying simulations of new air-brake equipment to determine the system's effect on vehicle stability.

Most of the CFD design is being completed using OpenFOAM, a free, open-source software package, and meshing is done using OpenFOAM's automated utility snappyHexMesh. The vehicle geometry is being generated using Catia solid modeling and surfacing software, while post-processing is performed using Paraview. Some additional CFD testing also will be completed using Fluent software for comparative purposes. The team has brought on the Dublin, Ohio, firm TotalSim LLC, a frequent collaborative partner of OSC, as a technical partner and CFD consultant.

The Buckeye Bullet team plans to complete the design process by the end of the summer and spend the upcoming academic year constructing and testing the vehicle. Then, in Fall 2012, the students intend to return to the Bonneville Speedway to unveil yet another record-setting Buckeye Bullet.

Vascular changes linked to dementia, experts say

ScienceDaily (July 22, 2011) — The same artery-clogging process (atherosclerosis) that causes heart disease can also result in age-related vascular cognitive impairments (VCI), according to a new American Heart Association/American Stroke Association scientific statement published online in Stroke: Journal of the American Heart Association.

Cognitive impairment, also known as dementia, includes difficulty with thinking, reasoning and memory, and can be caused by vascular disease, Alzheimer's disease, a combination of both and other causes.

Atherosclerosis is a build-up of plaque in the arteries associated with elevated blood pressure, cholesterol, smoking and other risk factors. When it restricts or blocks blood flow to the brain, it is called cerebrovascular disease, which can result in vascular cognitive impairment.

Alzheimer's disease is a progressive brain disorder that damages and destroys brain cells. "We have learned that cerebrovascular disease and Alzheimer's disease may work together to cause cognitive impairment and the mixed disorder may be the most common type of dementia in older persons," said Philip B. Gorelick, M.D., M.P.H., co-chair of the writing group for the statement and director of the Center for Stroke Research at the University of Illinois College of Medicine at Chicago.

The prevalence of dementia increases with advancing age and affects about 30 percent of people over 80 years of age, costing more than $40,000 per patient annually in the United States, according to the statement authors.

Treating risk factors for heart disease and stroke with lifestyle changes and medical management may prevent or slow the development of dementia in some people, Gorelick said. Physical activity, healthy diet, healthy body weight, tobacco avoidance as well as blood pressure and cholesterol management could significantly help many people maintain their mental abilities as they age.

"Generally speaking, what is good for the heart is good for the brain," Gorelick said. "Although it is not definitely proven yet, treatment or prevention of major risk factors for stroke and heart disease may prove to also preserve cognitive function with age."

Understanding of the common causes of late-life cognitive impairment and dementia has advanced, and many of the traditional risk factors for stroke are also risk markers for Alzheimer's disease and vascular cognitive impairment. For example:

• Reducing high blood pressure is recommended to reduce the risk of vascular cognitive impairment. High blood pressure in mid-life may be an important risk factor for cognitive decline later in life.
• Controlling high cholesterol and abnormal blood sugar may also help reduce the risk of vascular cognitive impairment, although more study is needed to confirm the role of these interventions.
• Smoking cessation could lessen the risk of vascular cognitive impairment.
• Increasing physical exercise, consuming a moderate level of alcohol (i.e., up to 2 drinks for men and 1 drink for non-pregnant women) for those who currently consume alcohol, and maintaining a healthy weight may also lessen the risk of VCI, but more study is needed to confirm usefulness.
• Taking B vitamins or anti-oxidant supplements, however, does not prevent vascular cognitive impairment, heart disease or stroke.

Identifying people at risk for cognitive impairment is a promising strategy for preventing or postponing dementia and for public health cost savings, the writers said. "We encourage clinicians to use screening tools to detect cognitive impairment in their older patients and continue to treat vascular risks according to nationally or regionally accepted guidelines." Vascular cognitive impairment is most obvious after a stroke, but there could be cognitive repercussions from small strokes, microbleeds or areas of diminished blood flow in the brain that cause no obvious neurological symptoms, according to the statement.

In many cases, the risk factors for vascular cognitive impairment are the same as for stroke, including high blood pressure, high cholesterol, abnormalities in heart rhythm and diabetes. The American Academy of Neurology and the Alzheimer's Association have endorsed the statement.

Other members of the writing group include: Angelo Scuteri, co-chair, M.D., Ph.D.; David Bennett, M.D.; Sandra E. Black, M.D.; Charles DeCarli, M.D.; Helena C. Chui, M.D.; Steven M. Greenberg, M.D., Ph.D.; Randall T. Higashida, M.D.; Costantino Iadecola, M.D.; Lenore J. Launer, M.D.; Stephane Laurent, M.D.; Oscar L. Lopez, M.D.; David Nyenhuis, Ph.D.; Ronald C. Petersen, M.D., Ph.D.; Julie A. Schneider, M.D.; Christophe Tzourio, M.D., Ph.D.; Donna K. Arnett, Ph.D.; Ruth Lindquist, Ph.D., R.N.; Peter M. Nilsson, M.D., Ph.D.; Gustavo C. Roman, M.D.; Frank W. Sellke, M.D.; and Sudha Seshadri, M.D. Author disclosures are on the manuscript.

Optimism associated with lower risk of having stroke

ScienceDaily (July 22, 2011) — A positive outlook on life might lower your risk of having a stroke, according to new research reported in Stroke: Journal of the American Heart Association.

In an observational study, a nationally representative group of 6,044 adults over age 50 rated their optimism levels on a 16-point scale. Each point increase in optimism corresponded to a 9 percent decrease in acute stroke risk over a two-year follow-up period.

"Our work suggests that people who expect the best things in life actively take steps to promote health," said Eric Kim, study lead author and a clinical psychology doctoral student at the University of Michigan.

Optimism is the expectation that more good things, rather than bad, will happen.

Previous research has shown that an optimistic attitude is associated with better heart health outcomes and enhanced immune-system functioning, among other positive effects.

The study is the first to discover a correlation between optimism and stroke. Previous research has shown that low pessimism and temporary positive emotions are linked to lower stroke risk. Researchers analyzed self-reported stroke and psychological data from the ongoing Health and Retirement Study, collected between 2006 and 2008. Participants were stroke-free at the beginning of the study.

Researchers measured optimism levels with the modified Life Orientation Test-Revised, a widely used assessment tool in which participants rank their responses on a numeric scale.

The team used logistic regression analysis to establish the association between optimism and stroke and adjusted for factors that might affect stroke risk, including chronic illness, self-reported health and sociodemographic, behavioral, biological and psychological conditions.
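
For readers unfamiliar with how a per-point effect such as the reported 9 percent comes out of this kind of analysis, the sketch below fits a logistic regression to synthetic data that has a built-in odds ratio of roughly 0.91 per optimism point. Everything here is invented for illustration (the scale range, the single age covariate, the event rate); it is not the Health and Retirement Study data or the authors' code, and it assumes the pandas and statsmodels packages are available.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic illustration of a per-point odds ratio from logistic
# regression; none of these numbers come from the actual study.
rng = np.random.default_rng(0)
n = 6000
optimism = rng.integers(3, 19, size=n)        # a 16-point scale (assumed range 3-18)
age = rng.normal(65, 8, size=n)               # one stand-in covariate

# Build outcomes with a true odds ratio of ~0.91 per optimism point
log_odds = -2.0 + np.log(0.91) * optimism + 0.03 * (age - 65)
stroke = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

df = pd.DataFrame({"stroke": stroke, "optimism": optimism, "age": age})
fit = smf.logit("stroke ~ optimism + age", data=df).fit(disp=False)

or_per_point = np.exp(fit.params["optimism"])
print(f"Estimated odds ratio per optimism point: {or_per_point:.2f}")   # close to 0.91
print(f"Roughly a {(1 - or_per_point) * 100:.0f}% lower odds per point")
```

An odds ratio near 0.91 per point is what a statement like "each point increase corresponded to a 9 percent decrease in stroke risk" typically summarizes, after adjustment for the other covariates in the model.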

"Optimism seems to have a swift impact on stroke," said Kim, noting that researchers followed participants for only two years. The protective effect of optimism may primarily be due to behavioral choices that people make, such as taking vitamins, eating a healthy diet and exercising, researchers said. However, some evidence suggests positive thinking might have a strictly biological impact as well.

Stroke is the No. 3 killer in the United States, behind heart disease and cancer, and a leading cause of disability.

Co-authors of the study are Nansook Park, Ph.D., and Christopher Peterson, Ph.D. Author disclosures are on the manuscript.

The Robert Wood Johnson Foundation's Pioneer Portfolio funded a part of the study through the Positive Psychology Center of the University of Pennsylvania.

Monday, August 29, 2011

Adolescent boys among those most affected by Washington state parental military deployment, study suggests

ScienceDaily (July 21, 2011) — In 2007, nearly two million children in the United States had at least one parent serving in the military. Military families and children, in particular, suffer from mental health problems related to long deployments.

A new study from researchers at the University of Washington (UW) concludes that parental military deployment is associated with impaired well-being among adolescents, especially adolescent boys. The study, "Adolescent well-being in Washington state military families," was published online in the American Journal of Public Health.

Lead author Sarah C. Reed, who has a master's degree from the UW School of Public Health, said the findings show that it is time to focus more on the children that are left behind in times of war. "There is a lot of research about veterans and active-duty soldiers, and how they cope or struggle when they return from a deployment," said Reed. "Those studies hit the tip of the iceberg of how families are coping and how their children are doing."

Adolescents are uniquely vulnerable to adverse health effects from parental military deployment. Healthy development, including identifying a sense of self and separation from family, can be interrupted during parents' active military service.

Media exposure and the developmental ability to understand the consequences of war may further disrupt adolescents' adjustment and coping. Teens may also have additional responsibilities at home after a parent's deployment, researchers said.

UW researchers used data from the Washington state 2008 Healthy Youth Survey, administered to more than 10,000 adolescents in 8th-, 10th- and 12th-grade classrooms. Female 8th graders with parents deployed to combat appear to be at risk of depression and thoughts of suicide, while male counterparts in all grades are at increased risk of impaired well-being in all of the areas examined (low quality of life, binge drinking, drug use and low academic achievement).

National research organizations, including RAND Health and the RAND National Security Research Division, have studied what's known as the "invisible wounds" of war. But Reed and her team said existing research is not enough. "We have to figure out more of what's going on within families and with children, and what's going to be helpful to mitigate the difficult things -- including risky behaviors by adolescents -- that are happening in families," she said.

Reed said that implementing or strengthening school-based programs that target affected adolescents would be a good starting point. Research and support programs also need to be beefed up, based on the research team's analysis. "There seem to be a lot of programs available but they are scattered and hard to navigate," said Reed. "In Washington state, schools have support programs, but they appear to be disconnected. There's a lot of energy in terms of people who would like to help, but a more cohesive effort in reaching out to adolescents and providing services is important."

Reed and her team are working on a follow-up study, analyzing parental military service and adolescent behaviors of school-based physical fighting, weapon carrying and gang membership.

Funding for the study was supported by a grant from the Maternal and Child Health Bureau, Health Resources and Services Administration, U.S. Department of Health & Human Services. Co-authors on the study include Janice Bell, UW assistant professor of health services, and Todd Edwards, UW research assistant professor of health services, both in the UW School of Public Health.

Blue collar workers work longer and in worse health than their white collar bosses, study finds

ScienceDaily (July 21, 2011) — While more Americans are working past age 65 by choice, a growing segment of the population must continue to work well into their sixties out of financial necessity. Research conducted by Columbia University's Mailman School of Public Health and the University of Miami Miller School of Medicine looked at aging, social class and labor force participation rates to illustrate the challenges that lower income workers face in the global marketplace.

The study used the burden of arthritis to examine these connections because 49 million U.S. adults have arthritis, and 21 million suffer activity limitations as a result. The condition is also relatively disabling and painful but not fatal. The researchers found that blue collar workers are much more likely to work past 65 than white collar workers and are much more likely to suffer from conditions like arthritis, reducing their quality of life and work productivity.

The study findings are reported online in the American Journal of Public Health.

The investigators calculated estimates and compared age- and occupation-specific data for workers with and without arthritis, merging data from the U.S. National Health Interview Survey (NHIS), Medical Expenditure Panel Survey (MEPS) and National Death Index. The analysis included 17,967 of the 38,473 MEPS participants.

"Arthritis serves as a powerful lens for looking at these convergent phenomena," said Alberto J. Caban-Martinez, DO, PhD, MPH, Department of Epidemiology and Public Health at the University of Miami Miller School of Medicine and first author. "We found that blue-collar workers with arthritis are in much worse health than are all other workers, suggesting that they are struggling to stay in the workforce despite their health condition."

At all ages, blue-collar workers in the workforce are in worse health than white-collar workers. By age 65, 19% of white-collar workers with arthritis remain in the workforce compared with 22% of blue-collar workers. But employed blue-collar workers have more severe disease than employed white-collar workers and can expect fewer years of healthy life -- approximately 11 for blue-collar workers and 14 for white-collar workers.

The investigators reported that lower-income workers of older age in the service and farming sectors -- two job types that are unlikely to come with pension plans -- are more likely to have arthritis than not, with 58% of service workers and 67% of farm workers continuing to work despite struggling with the painful condition. Sixteen percent of all blue-collar workers are over 65, and 47% report they have arthritis. By contrast, 14% of white-collar workers work beyond the age of 65, with 51% of these workers reporting arthritis. Overall, approximately 15% of all workers remain in the workforce at or past retirement age, and 44% have arthritis.

"The increasing age of the U.S. workforce presents new challenges for government, employers and working families," observes Peter Muennig, MD, MPH, associate professor of Health Policy and Management and senior author. "It is estimated that by the year 2030 approximately 67 million adults aged 18 years and older will have arthritis. Because the 'graying' workforce will be disproportionately represented by people from middle and lower occupational classes that also suffer from a higher prevalence of arthritis and a shorter life expectancy than wealthier Americans, Dr. Muennig points out that additional enhancements to federal programs such as better disability, health and unemployment insurance will be needed to maintain a higher quality of life for all workers, particularly for those with chronic conditions such as arthritis. "As the population ages in the face of expanding budget deficits, we face politically difficult choices if the U.S. is to prevent significant declines in its standard of living."

This study was funded in part by the National Institute of Arthritis and Musculoskeletal and Skin Diseases and the National Institute for Occupational Safety and Health.

Multiple sclerosis research: Myelin influences how brain cells send signals

ScienceDaily (July 22, 2011) — The development of a new cell-culture system that mimics how specific nerve cell fibers in the brain become coated with protective myelin opens up new avenues of research about multiple sclerosis. Initial findings suggest that myelin regulates a key protein involved in sending long-distance signals.

Multiple sclerosis (MS) is an autoimmune disease characterized by damage to the myelin sheath surrounding nerve fibers. Its cause remains unknown, and the chronic illness, which affects the central nervous system, has no cure.

MS has long been considered a disease of white matter, a reference to the white-colored bundles of myelin-coated axons that project from the main body of a brain cell. But researchers have discovered that the condition also affects myelinated axons scattered in gray matter, which contains the main bodies of brain cells, and specifically in the hippocampus, a region important for learning and memory.

Up to half of MS patients suffer cognitive deficits in addition to physical symptoms. Researchers suspect that cognitive problems are caused by abnormal electrical activities of the demyelinated axons extending from hippocampal cells, but until now have not been able to test myelin's role in this part of the brain.

Ohio State University researchers have created a system in which two types of cells interact in a dish as they do in nature: neurons from the hippocampus and other brain cells, called oligodendrocytes, whose role is to wrap myelin around the axons.

Now that the researchers can study how myelination is switched on and off for hippocampal neurons, they also can see how myelin does more than provide insulation -- it also has a role in controlling nerve impulses traveling between distant parts of the nervous system. Identifying this mechanism when myelin is present will help improve understanding of what happens when axons in this critical area of the brain lose myelin as a result of MS, researchers say.

So far, the scientists have used the system to show that myelin regulates the placement and activity of a key protein, called a Kv1.2 voltage-gated potassium channel, that is needed to maintain ideal conditions for the effective transmission of electrical signals along these hippocampal axons.

"This channel is important because it is what leads to electrical activity and how neurons communicate with each other downstream," said Chen Gu, assistant professor of neuroscience at Ohio State and lead author of the study. "If that process is disrupted by demyelination, disease symptoms may occur."

The study appears in the current (July 22, 2011) issue of the Journal of Biological Chemistry.

To create the cell culture system, the researchers began with hippocampus neurons from a rodent brain -- a cell type that Gu has worked with for years. In culture, these cells can grow and develop dendrites -- other branch-like projections off of neurons -- and axons as well as generate electrical activity and synaptic connections, the same events that occur in the brain.

The researchers then added oligodendrocytes, along with some of their precursor cells, to the same dish as the neurons. And eventually, after maturing, these oligodendrocytes began to wrap myelin around the axons of the hippocampal neurons.

This system takes about five weeks to create, but the trickiest part, Gu said, was developing the proper solution for this culture so that both kinds of cells would behave as nature intended.

"In the end, the composition of the culture medium is basically half from a solution that supports the neurons and half from a medium in which the oligodendrocytes function well. We know that all the cells were happy because we got myelin," said Gu, also an investigator in Ohio State's Center for Molecular Neurobiology.

With the system established, they then turned to experimentation to test the effects of the myelin's presence on these specific brain cells.

Nerve cells send their signals encoded in electrical impulses over long distances. Concerted actions of various ion channels are required to properly generate these nerve impulses. Potassium channels act in the late phase of an impulse; their role is to return a nerve cell to a resting state after the impulse has passed through it, so the cell can gear up for the next one. The Kv1.2 ion channel helps ensure that this process works smoothly.

By experimentally manipulating signal conditions with the new co-culture system, Gu and his colleague were able to establish part of the sequence of events required for myelinated hippocampal neurons to effectively get their signals to their targets. Starting with a protein known to be produced by myelin and axons, called TAG-1, a cell adhesion molecule, they traced a series of chemical reactions indicating that myelin on the hippocampal axons was controlling the placement and activity of the Kv1.2 ion channel.

"The analysis allowed us to see the signaling pathways involving myelin's regulation of the Kv1.2 channel's placement along the axon as well as fine-tuning of the channel's activity," Gu said.

When MS demyelinates these axons, the affected nerve cells don't get the message to rest, and subsequently can't prepare adequately to receive and transmit the next signal that comes along.

"This means a nerve impulse will have a hard time traveling through the demyelinated region," Gu said. "This shows that the ion channel is probably involved in the downstream disease progression of MS."

Gu envisions many additional uses for the new co-culture system, including additional studies of how myelin affects the behavior of other channels, proteins and molecules that function within axons, as well as to screen the effects of experimental drugs on these myelinated cells.

This work was supported by a Career Transition Fellowship Award from the National Multiple Sclerosis Society and a grant from the National Institute of Neurological Disorders and Stroke.

Gu conducted this study with Yuanzheng Gu, a research associate in the Department of Neuroscience at Ohio State.

Sunday, August 28, 2011

Smartphone making your eyes tired? Images placed in front of the screen increase visual discomfort

ScienceDaily (July 22, 2011) — Several reports indicate that prolonged viewing of mobile devices and other stereo 3D devices leads to visual discomfort, fatigue and even headaches. According to a new Journal of Vision study, the root cause may be the demand on our eyes to focus on the screen and simultaneously adjust to the distance of the content.

Scientifically referred to as the vergence-accommodation conflict, this mismatch and its effect on viewers of stereo 3D displays are detailed in a recent Journal of Vision article, "The Zone of Comfort: Predicting Visual Discomfort with Stereo Displays."

"When watching stereo 3D displays, the eyes must focus -- that is, accommodate -- to the distance of the screen because that's where the light comes from. At the same time, the eyes must converge to the distance of the stereo content, which may be in front of or behind the screen," explains author Martin S. Banks, professor of optometry and vision science, University of California, Berkeley.

Through a series of experiments on 24 adults, the research team examined how viewing distance and the direction of the conflict interact -- that is, whether placing the content in front of or behind the screen affects viewer discomfort. The results showed that for devices viewed at short distances, such as mobile phones and desktop displays, stereo content placed in front of the screen -- appearing closer to the viewer, in the space of the viewer's room -- was less comfortable than content placed behind the screen. Conversely, at longer viewing distances such as a movie theater screen, stereo content placed behind the screen -- appearing as though the viewer is looking through a window at a scene behind the screen -- was less comfortable.
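
One way to see why viewing distance matters is to express both demands in diopters (1 divided by the distance in meters), a standard convention in vision science. The sketch below uses assumed viewing distances for illustration; they are not the stimuli or numbers from the study.

```python
# Illustrative only: assumed viewing distances, not the study's stimuli.
# Accommodation is driven by the screen distance, vergence by the
# apparent distance of the stereo content; the mismatch is the conflict.

def conflict_diopters(screen_m, content_m):
    accommodation = 1.0 / screen_m      # focus demand set by the screen
    vergence = 1.0 / content_m          # convergence demand set by the content
    return vergence - accommodation     # positive: content appears in front of the screen

cases = [
    ("phone: content 10 cm in front of a 40 cm screen", 0.40, 0.30),
    ("phone: content 10 cm behind a 40 cm screen",      0.40, 0.50),
    ("cinema: content 5 m in front of a 15 m screen",   15.0, 10.0),
    ("cinema: content far behind a 15 m screen",        15.0, 60.0),
]
for label, screen_m, content_m in cases:
    print(f"{label}: conflict = {conflict_diopters(screen_m, content_m):+.2f} D")
```

The same few centimeters of apparent depth produce a far larger dioptric mismatch on a handheld screen than a much larger offset does in a theater, and the sign of the mismatch flips with the direction of the content, which is the asymmetry the experiments probed.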

"Discomfort associated with viewing Stereo 3D is a major problem that may limit the use of technology," says Banks. "We hope that our findings will inspire more research in this area."

The team of investigators suggests future studies focus on a larger sample in order to develop population-based statistics that include children. With the explosion of stereo 3D imagery in entertainment, communication and medical technology, the authors also propose guidelines be established for the range of disparities presented on such displays and the positioning of viewers relative to the display.

"This is an area of research where basic science meets application and we hope that the science can proceed quickly enough to keep up with the increasingly widespread use of the technology," adds Banks.

Computer simulations aid understanding of bacterial resistance against commonly used antibiotics

ScienceDaily (July 22, 2011) — A recent study into the interactions between aminoglycoside antibiotics and their target site in bacteria used computer simulations to elucidate the resistance mechanism and thereby suggest drug modifications.

In the article, which will be published on July 21st in the open-access journal PLoS Computational Biology, researchers from the University of Warsaw, Poland, and the University of California, San Diego, describe their study of the physical basis of one bacterial resistance mechanism -- mutations of the antibiotic's target site, namely the RNA of the bacterial ribosome. They performed simulations and observed changes in the interaction between the antibiotic and the target site when different mutations were introduced.

In hospitals throughout the world, aminoglycoside antibiotics are used to combat even the most severe bacterial infections and are especially effective against tuberculosis and plague. However, the continuous emergence of resistant bacteria has created an urgent need to improve these antibiotics. Previous experiments on bacteria have shown that specific point mutations in the bacterial ribosomal RNA confer high resistance against aminoglycosides. However, the physico-chemical mechanism underlying this effect has not been known. Using computer simulations, the researchers explained how various mutations in this specific RNA fragment influence its dynamics and lead to resistance.

Bacteria have developed other ways of gaining resistance, not just through mutations, and further studies are underway. The authors are now investigating the resistance mechanism by which bacterial enzymes actively modify and neutralize aminoglycosidic antibiotics. These molecular modeling studies together with experiments could help to design even better aminoglycoside derivatives in the future.

Chromosome number changes in yeast

ScienceDaily (July 21, 2011) — Researchers from Trinity College Dublin have uncovered the evolutionary mechanisms that have caused increases or decreases in the numbers of chromosomes in a group of yeast species during the last 100-150 million years. The study, to be published on July 21st in the open-access journal PLoS Genetics, offers an unprecedented view of chromosome complement (chromosome number) changes in a large group of related species.

A few specific cases of chromosome number changes have been studied in plants and animals, for example the fusion of two great ape chromosomes that gave rise to chromosome 2 in humans, giving humans a chromosome count of 23 pairs compared to 24 pairs in great apes. The family of yeasts studied in this new research spans a similar evolutionary distance to that of vertebrates. The availability of completely sequenced genomes facilitated the reconstruction of ancestral genome structures at different evolutionary time points. Tracing the positions of essential parts of chromosomes (centromeres and telomeres) through time allowed for the identification of specific genome rearrangement events that resulted in chromosome complement changes.

The addition of large numbers of genes is not often tolerated by cells, and neither are deletions of large numbers of genes. This restricts the types of possible changes in chromosome complement to rearrangements of genes on chromosomes that maintain the same number of genes.

The researchers show that, in yeasts, chromosome complement has decreased over time, with one notable exception: a whole-genome duplication that doubled the complement of an ancestor of several of the species from 8 chromosomes to 16. The decreases in chromosome number occurred mostly by the fusion of whole chromosomes, similar to the fusion that gave rise to chromosome 2 in humans. One exception to this mechanism was the breakage of a chromosome and the subsequent fusion of the two broken ends to the ends of two different chromosomes.

Although some aspects of the research are specific to yeast, many of the mechanisms of chromosome number change in yeast are similar to those found in other organisms and therefore shed light on how chromosome complements evolve.

Saturday, August 27, 2011

Chance favors the concentration of wealth, study shows; New model isolates the effects of chance in an investment-based economy

ScienceDaily (July 21, 2011) — Most of our society's wealth is invested in businesses or other ventures that may or may not pan out. Thus, chance plays a role in where the wealth of a society will end up.

But does chance favor the concentration of wealth in the hands of a few, or does it tend to level the playing field? Three University of Minnesota researchers have built a simplified model that isolates the effects of chance and found that it consistently pushes wealth into the hands of a few, ever-richer people.

The study, "Entrepreneurs, chance, and the deterministic concentration of wealth," is published in the July 20 issue of the journal PLoS ONE.

The researchers simulated the performance of a large number of investors who started out with equal amounts of capital and who realized returns annually over a number of years. But wealth did not remain equal, because each year an entrepreneur's return was a random draw taken from a pool of possible return rates. Thus, a high return did not guarantee continuing high returns, nor did early low returns mean continuing bad luck.

Even though all investors had an equal chance of success, the simulations consistently resulted in dramatic concentration of wealth over time. The reason: With compounding capital returns, some individuals will have a string of high returns and, given enough time, will accumulate an overwhelming share of the wealth.
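
A minimal sketch of this kind of simulation is shown below. The distribution of returns, the number of investors and the time horizon are assumptions chosen for illustration; the published model drew returns from its own pool of rates rather than the normal distribution used here.

```python
import numpy as np

# Illustrative sketch, not the published model: identical investors,
# independent random annual returns, compounded over time. All parameter
# values below are assumptions.

def top_share(n_investors=10_000, years=100, mean_return=0.05,
              return_sd=0.20, top_fraction=0.01, seed=0):
    rng = np.random.default_rng(seed)
    wealth = np.ones(n_investors)                      # everyone starts with equal capital
    for _ in range(years):
        returns = rng.normal(mean_return, return_sd, size=n_investors)
        wealth *= np.maximum(1.0 + returns, 0.0)       # compound; a total loss floors at zero
    wealth.sort()
    top = wealth[-int(n_investors * top_fraction):]    # the richest 1 percent
    return top.sum() / wealth.sum()

if __name__ == "__main__":
    for sd in (0.05, 0.20, 0.40):
        print(f"return std. dev. {sd:.2f}: top 1% share after 100 years = {top_share(return_sd=sd):.2f}")
```

Even though every investor faces identical odds each year, the top 1 percent's share of total wealth climbs far above its initial 1 percent, and it climbs faster as the spread of possible returns widens, which is the chance-driven concentration the authors describe.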

This appears to be a fundamental feature of economies where wealth is primarily generated from returns on investment (for example, through business ownership and growth), the researchers said.

"Predictions from this model about how wealth is distributed were more accurate than predictions from classic economic models," said first author Joseph Fargione, an adjunct professor of ecology, evolution and behavior in the university's College of Biological Sciences.

The model predicts that the rate at which wealth concentrates depends on the variation among individual return rates. For example, when variation is high, it would take only 100 years for the top 1 percent to increase their share of total wealth from 40 percent -- a recent level in the United States -- to 90 percent.

Healthy economies support diverse entrepreneurial efforts, leading to high economic growth. But concentration of wealth reduces diversity, and with it the most likely growth rate for a country's economy, according to the researchers.

"The implication is that nations with diverse economies should tend to outcompete on the world stage those with large concentrations of wealth, such as monarchies, or established democracies that have allowed their wealth to concentrate," said author Clarence Lehman, associate dean for research in the College of Biological Sciences.

But while the rate of wealth concentration was increased by high variation among individual investors' returns, it bore no relation to the average economic growth.

"This leads to the surprising finding that wealth will concentrate due to chance alone in growing, stagnant or shrinking economies," said author Steve Polasky, professor of applied economics in the College of Food, Agricultural and Natural Resource Sciences.

The simulation results showed wealth concentrating regardless of economic cycles of growth and recession and regardless of whether wealth is split between two offspring every generation. As wealth concentrates with a few individuals, the growth of the economy will depend more and more on the returns of those few, making the economy less resilient to disruptions in their investments, the researchers said.

"The irony is that the economic diversity that helps ensure the presence of some successful enterprises and spurs economic growth could be lost if the success of these enterprises undermines economic diversity," said Fargione. "To retain the benefits of a diverse capitalist economy, we need economic policies that counter what seems to be the innate tendency for economies to concentrate wealth and become less diverse."

The simulations showed that a tax (or other mandatory donation to the public good) on the largest inherited fortunes would short-circuit the over-concentration of wealth. But the researchers stress that their point is to advocate not a particular policy, but a policy that accomplishes the goal of protecting long-term economic stability.

Scientists complete first mapping of molecule found in human embryonic stem cells

ScienceDaily (July 21, 2011) — Stem cell researchers at UCLA have generated the first genome-wide mapping of a DNA modification called 5-hydroxymethylcytosine (5hmC) in embryonic stem cells, and discovered that it is predominantly found in genes that are turned on, or active.

The finding by researchers with the Eli and Edythe Broad Center of Regenerative Medicine and Stem Cell Research at UCLA may prove to be important in controlling diseases like cancer, where the regulation of certain genes plays a role in disease development.

"Any way you can control genes will be hugely important for human disease and cancer," said Steven E. Jacobsen, a professor of molecular, cell and developmental biology in the Life Sciences and a Howard Hughes Medical Institute investigator. "Cancer is generally a problem of genes being inappropriately turned off or mutated, like tumor suppressors genes, or genes that should be off getting switched on."

The study appears in the July issue of the journal Genome Biology.

5hmC is formed from the DNA base cytosine by the addition of a methyl group and then a hydroxyl group. The molecule is important in epigenetics -- the study of changes in gene expression caused by mechanisms other than changes in the DNA sequence -- because the newly formed hydroxymethyl group on the cytosine can potentially switch a gene on and off, Jacobsen said.

The molecule 5hmC was only recently discovered, and its function has not been clearly understood, Jacobsen said. Until now, researchers didn't know where 5hmC was located within the genome.

"That is important to know because it helps you to understand how it is functioning and what it's being used for," said Jacobsen, who also is a researcher with UCLA's Jonsson Comprehensive Cancer Center. "We had known that DNA could be modified by 5hmC, but it wasn't clear where on the genome this was occurring."

Jacobsen, whose lab studies the molecular genetics and genomics of DNA methylation patterning, and his team used genomic methods to define where 5hmC is present in human embryonic stem cells. They used human embryonic stem cells because it had been shown previously that the molecule is abundant in those cells, as well as in brain cells, Jacobsen said.

In the study, Jacobsen found that 5hmC was associated with genes and tended to be found on genes that were active. The study also revealed that 5hmC was present on a type of DNA regulatory element, called enhancers, which help control gene expression. In particular, 5hmC was present on enhancers that are crucial for defining the nature of the human embryonic stem cells.

The results suggest that 5hmC plays a role in the activation of genes. This is the opposite of the role of the better-studied 5mC (DNA methylation), which is involved in silencing genes. The relationship is in line with the view that 5hmC is created directly from 5mC.

"If we can understand the function of 5hmC, that will lead to greater understanding of how genes are turned on and off and that could lead to the development of methods for controlling gene regulation," Jacobsen said.

Moving forward, Jacobsen and his team will seek to uncover the mechanism by which 5hmC is created from DNA methylation and how it becomes localized to particular areas of the genome, such as the enhancers.

The two-year study was funded by the Howard Hughes Medical Institute, a Fred Eiserling and Judith Lengyel Graduate Doctoral Fellowship, the Leukemia & Lymphoma Society, the National Institutes of Health and by an Innovation Award from the Eli and Edythe Broad Center of Regenerative Medicine & Stem Cell Research at UCLA.


Signaling molecule identified as essential for maintaining a balanced immune response

ScienceDaily (July 22, 2011) — St. Jude Children's Research Hospital investigators have identified a signaling molecule that functions like a factory supervisor to ensure that the right mix of specialized T cells is available to fight infections and guard against autoimmune disease.

The research also showed the molecule, phosphatase MKP-1, is an important regulator of immune balance. Working in laboratory cell lines and mice with specially engineered immune systems, scientists demonstrated that MKP-1 serves as a bridge between the innate immune response that is the body's first line of defense against infection and the more specialized adaptive immune response that follows. The results are published in the July 22 print edition of the scientific journal Immunity.

The results raise hopes that the MKP-1 pathway will lead to new tools for shaping the immune response, said Hongbo Chi, Ph.D., assistant member of the St. Jude Department of Immunology and the study's senior author. The co-first authors are Gonghua Huang, Ph.D., and Yanyan Wang, Ph.D., both postdoctoral fellows in Chi's laboratory.

The findings provide new details about how dendritic cells regulate the fate of naïve or undifferentiated T cells. Dendritic cells are the sentinels of the innate immune response, patrolling the body and ready to respond at the first sign of infection.

Investigators were surprised that a single molecule regulated production of three out of the four major subsets of T cells, which each play different roles. MKP-1 is a negative regulator of the enzyme p38, which is part of the MAP kinase family of enzymes that control pathways involved in cell proliferation, differentiation and death.

Chi and his colleagues demonstrated that MKP-1 works in dendritic cells by altering production of protein messengers known as cytokines. Those cytokines determine which subset of specialized T cells the undifferentiated T cells are fated to become. In this study, scientists showed that MKP-1 controls production of the cytokines that yield T helper 1 (Th1), T helper 17 (Th17) and regulatory T (Treg) cells. Th1 cells combat intracellular bacterial and viral infections. Th17 cells fight extracellular bacterial infections and fungi. Treg cells help with immune suppression, protecting against autoimmune diseases.

The study showed that suppression of p38 by MKP-1 promotes production of interleukin 12 (IL-12), which leads to an increase in Th1 cells. Rising IL-12 coincides with a drop in interleukin 6 (IL-6) and a corresponding dip in production of Th17. MKP-1 also inhibited the generation of Treg cells by down-regulating production of a third cytokine, TGF-beta.
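
The relationships described in the preceding two paragraphs can be summarized schematically. The snippet below is simply a restatement of those reported effects as a small data structure, not a quantitative model from the paper.

    # Schematic restatement of the reported MKP-1 effects (not a model):
    # when MKP-1 suppresses p38 in dendritic cells, cytokine output shifts
    # and T-cell fate shifts with it.
    mkp1_effects = {
        # cytokine: (direction of change, affected T-cell subset, net outcome)
        "IL-12":    ("up",   "Th1",  "more Th1 cells"),
        "IL-6":     ("down", "Th17", "fewer Th17 cells"),
        "TGF-beta": ("down", "Treg", "fewer Treg cells"),
    }

    for cytokine, (direction, subset, outcome) in mkp1_effects.items():
        print(f"MKP-1 suppresses p38 -> {cytokine} {direction} -> {subset}: {outcome}")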

Knocking out MKP-1 in mice disrupted production of IL-12 and IL-6 in dendritic cells as well as the anti-bacterial and anti-fungal immune response, researchers reported. MKP-1 deficiency also promoted T-cell driven inflammation in a mouse model of colitis, an inflammatory disease.

"MKP-1 is the first signaling molecule found in dendritic cells to program differentiation of these diverse T- cell subsets," Chi said.

Previous work by other scientists focused on T cell differentiation in response to stimulation by cytokines. "This research fills a gap in our understanding of dendritic cell-mediated control of T-cell lineage choices," Chi said. "T cells do not recognize pathogens directly, but dendritic cells do. T cells need dendritic cells to tell them what to do. In this study, we show that MKP-1 signaling in dendritic cells bridges the innate and adaptive immune responses by regulating cytokine production."

Other authors are Lewis Shi and Thirumala-Devi Kanneganti, both of St. Jude.

The research was supported in part by the National Institutes of Health, the National Multiple Sclerosis Society, the Cancer Research Institute, The Hartwell Foundation and ALSAC.


Friday, August 26, 2011

Endangered river turtle's genes reveal ancient influence of Maya Indians

ScienceDaily (July 22, 2011) — A genetic study focusing on the Central American river turtle (Dermatemys mawii) recently turned up surprising results for a team of Smithsonian scientists involved in the conservation of this critically endangered species. Small tissue samples collected from 238 wild turtles at 15 different locations across their range in Southern Mexico, Belize and Guatemala revealed a "surprising lack" of genetic structure, the scientists write in a recent paper in the journal Conservation Genetics.

The turtles, which are entirely aquatic, represent populations from three different river basins that are geographically isolated by significant distance and high mountain chains.

"We were expecting to find a different genetic lineage in each drainage basin," explains the paper's main author Gracia González-Porter of the Center for Conservation and Evolutionary Genetics at the Smithsonian Conservation Biology Institute. "Instead, we found the mixing of lineages. It was all over the place." Despite appearing isolated, the genetic data showed the different turtle populations had been in close contact for years.

"But how?" the researchers wondered.

The best possible explanation, González-Porter and her colleagues say, is that for centuries humans have been bringing them together. The turtles have been used as food, in trade and in rituals for millennia, widely transported and customarily kept in holding ponds till they were needed.

"For centuries, this species has been part of the diet of the Mayans and other indigenous people who lived in its historic distribution range," the scientists point out in their paper. "D. mawii was a very important source of animal protein for the ancient Mayans of the Peten (Preclassic period 800-400 B.C.)…. And it is possible that these turtles were part of the diet of the Olmec culture more than 3,000 years ago."

One specimen of D. mawii was found in an ancient Teotihuacan burial site in Mexico, a spot located more than 186 miles from the known range of this turtle, the researchers say. An ancient sculpture of a Central American river turtle at the National Museum of Anthropology in Mexico City was found in the Basin of Mexico, more than 217 miles from the turtle's range.

"The Central American River turtle is tame and resilient," González-Porter explains, "which makes it easy to transport. Their shells give them lots of protection. People don't have refrigeration so they put the turtles in ponds in their back yards."

During the rainy season in the tropics, the water flows are huge, she says. Rivers and ponds flood, captive turtles escape and mix with the local turtles.

This ancient practice still persists today. In Guatemala, Central American river turtles are kept in medium-sized ponds where they can be easily captured when needed. Similarly, in the State of Tabasco, Mexico, captured turtles are kept in rustic ponds and raised until they are either consumed or sold.

The genetic analysis of the Central American River turtle was initiated because these animals are critically endangered, González-Porter says.

They are the last surviving species of the giant river turtles of the family Dermatemydidae. D. mawii is currently the most endangered turtle species in Central America. A recent increase in the commercial demand for its meat has pushed it to the brink of extinction -- 2.2 pounds of their meat can fetch $100. Most local populations have disappeared and this turtle is now largely restricted to remote areas that are inaccessible to humans.


A novel and potent antioxidant found in tomato plants, initial results suggest

ScienceDaily (July 22, 2011) — A team of researchers from the Institute of Molecular and Cell Biology (IBMCP), a joint centre of the Universitat Politècnica de València and CSIC (the Spanish National Research Council), has identified a novel and potent natural antioxidant occurring in tomato plants. It is a phenolic substance that is synthesised by the tomato plant when it is subjected to biotic stress. Until now, it was completely unknown.

The UPV and CSIC have registered the national and international patents of the new antioxidant and the laboratory procedures used to isolate and synthesise it chemically.

The finding was recently published in the journal Environmental and Experimental Botany.

IBMCP researchers point out that the antioxidant power of the new compound is much higher (14 times higher, to be precise) than that of, for example, resveratrol, a well-known antioxidant found in red wine that can delay cellular aging. In addition, it is 4.5 times more potent than vitamin E and 10 times more potent than vitamin C.

This substance could have multiple applications. For example, in the food industry it could be used as a preservative in food for human consumption and in animal fodder, because of its action as a retarder of lipid oxidation. This powerful antioxidant would prevent changes such as fats and oils becoming rancid, which strongly diminishes food quality. It could also be used as a supplement in certain products after careful processing.

Getting a grip on grasping

ScienceDaily (July 22, 2011) — Quickly grabbing a cup of coffee is an everyday action for most of us. For people with severe paralysis, however, this task is unfeasible -- yet not "unthinkable." Because of this, interfaces between the brain and a computer can in principle detect these "thoughts" and transform them into steering commands. Scientists from Freiburg have now found a way to distinguish between different types of grasping on the basis of the accompanying brain activity.

In the current issue of the journal "NeuroImage," Tobias Pistohl and colleagues from the Bernstein Center Freiburg and the University Medical Centre describe how they succeeded in differentiating the brain activity associated with a precise grip and a grip of the whole hand. Ultimately, the scientists aim to develop a neuroprosthesis: a device that receives commands directly from the brain, and which can be used by paralysed people to control the arm of a robot -- or even their own limbs.

One big problem concerning arm movements had so far remained unresolved. In our daily lives, it is important to handle different objects in different ways, for example a feather and a brick. The researchers from Freiburg have now found aspects of the brain's activity that distinguish a precise grip from a grip with the whole hand.

To this end, Pistohl and his collaborators made use of signals that are measured on the surface of the brain. The big advantage of this approach is that no electrodes have to be implanted directly into this delicate organ. At the same time, the obtained signals are much more precise than those that can be measured on the skull's surface.

The scientists conducted a simple experiment with patients who were not paralysed but had electrodes implanted into their skull for medical reasons. The task was to grab a cup, either with a precise grip formed by the thumb and the index finger, or with the whole hand. At the same time, a computer recorded the electrical changes at the electrodes. The scientists were indeed able to find signals in the brain's activity that differed depending on the type of grasp, and a computer was able to attribute these signals to the different hand positions with great reliability. Now, the next challenge will be to identify these kinds of signals in paralysed patients as well -- with the aim of eventually putting a more independent life back within their reach.
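
For readers curious what "attributing signals to hand positions" looks like in practice, here is a minimal, purely illustrative decoding sketch on synthetic data. It trains an off-the-shelf classifier to separate two grasp types from simulated per-channel signal features; the channel counts, feature values and classifier choice are assumptions made for the example, not the Freiburg group's actual method.

    # Toy grasp decoder on synthetic data (illustrative only, not the
    # published analysis): classify "precision grip" vs "whole-hand grip"
    # from simulated per-electrode signal features.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_trials, n_channels = 200, 16

    X = rng.normal(size=(n_trials, n_channels))       # simulated trial features
    y = rng.integers(0, 2, size=n_trials)             # 0 = precision grip, 1 = whole hand
    X[y == 1, :4] += 0.8                              # injected class difference on 4 channels

    clf = LogisticRegression(max_iter=1000)
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"cross-validated decoding accuracy: {scores.mean():.2f}")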


Thursday, August 25, 2011

Fault in immune memory causes atopic eczema and psoriasis

ScienceDaily (July 22, 2011) — Scientists from the Centre for Allergy and Environment in Munich (ZAUM), the Helmholtz Zentrum München and the Technische Universität München believe they have discovered the causes of atopic eczema and psoriasis. The results of the studies have been published in the New England Journal of Medicine.

The findings of a research study conducted by Stefanie and Kilian Eyerich show that both diseases are caused by an impaired immunological memory.

The couple, who are engaged in research at the Helmholtz Zentrum München and the Department of Dermatology and Allergology Biederstein, Technische Universität München (TUM), based their study on a rare group of patients who suffer from both diseases. As their results show, the T-cells of the immune system in the skin activate an inflammatory programme that causes either atopic eczema or psoriasis. Professor Ring, co-author and director of the Department of Dermatology and Allergology Biederstein, believes that "this study highlights the critical role of T-cells in psoriasis."

The scientists now aim to find out which T-cell molecules are responsible for triggering these diseases. "Clearly, future therapy strategies should focus on the impairment of the immunological memory," says Professor Carsten Schmidt-Weber, Director of ZAUM.

T-cells together with the B-cells form the body's immunological memory. They initiate an immune response when they recognise substances that are foreign to the body. In the case of atopic eczema / neurodermatitis, the T-cells recognise substances that trigger an immune response: these include components of pollen, house-dust mites and also bacteria. In the case of psoriasis, it remains unclear which molecules are responsible for the response.

The origin of comet material formed at high temperatures

ScienceDaily (July 22, 2011) — Comets are icy bodies, yet they are made of materials formed at very high temperatures. Where do these materials come from? Researchers from the Institut UTINAM(1) (CNRS/Université de Besançon) have now provided the physical explanation behind this phenomenon. They have demonstrated how these materials migrated from the hottest parts of the solar system to its outer regions before entering the composition of comets.

Their results are published in the July 2011 issue of the journal Astronomy & Astrophysics.

On 15 January 2006, after an eight-year voyage, NASA's Stardust Mission (Discovery program) brought dust from Comet Wild 2 back to Earth. Comets are formed at very low temperatures (around 50 Kelvin, i.e. -223°C). However, analyses have revealed that Comet Wild 2 is made of crystalline silicates and CAIs (Calcium-Aluminium-rich Inclusions). Considering that the synthesis of these minerals requires very high temperatures (above 1,000 Kelvin, or 727°C), how can this composition be explained?

A team from the Institut UTINAM(1) (CNRS/Université de Besançon), in collaboration with researchers from the Institut de Physique de Rennes (CNRS/Université de Rennes), the University of Duisburg-Essen (Germany) and the Laboratoire Astrophysique, Instrumentation et Modélisation (CNRS/CEA/Université Paris Diderot), has provided the answer on the basis of a physical phenomenon called photophoresis. This force depends on two parameters: the intensity of solar radiation and gas pressure. At the birth of the solar system, the comets were formed from the protoplanetary disk(2). Inside this disk, a mixture of solid grains ranging in size from a few microns to several centimeters was bathed in a dilute gas that let sunlight through.

According to the researchers, photophoresis drove the particles towards the outer regions of the disk. Under the effect of solar radiation, one face of the grains was "hotter" than the other, and the behavior of gas molecules on the surface of these grains was modified: on the "sunny" side, the gas molecules were more agitated and moved about more rapidly than on the "cold" side. By causing a pressure difference, this imbalance moved the grains away from the Sun. Through numerical simulations, the researchers have borne out this photophoresis phenomenon. They demonstrated that the grains of crystalline silicates formed in the inner, hot region of the protoplanetary disk near the Sun migrated to its outer, cold region before playing a part in the formation of the comets. This novel physical explanation could account for the position of certain dust rings observed in protoplanetary disks and thus shed light on the conditions of planet formation.

(1) Institut "Univers, Transport, Interfaces Nanostructures, Atmosphère et Environnement, Molécules" (CNRS/Université de Besançon).
(2) The protoplanetary disk of a young star (for example, the Sun) is the disk of gas and dust that surrounds it, and in which planets are likely to form.
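
As a back-of-the-envelope illustration of why the grains get pushed away from the Sun, the sketch below uses basic kinetic theory with made-up face temperatures: gas molecules rebounding from the warmer, sun-facing side of a grain leave faster, on average, than those leaving the cooler side, so the grain recoils toward its cold side. The numbers and the hydrogen-gas assumption are purely illustrative and are not taken from the paper.

    # Toy kinetic-theory illustration of the photophoretic push (hypothetical
    # numbers, not from the study): molecules leave the warm, sun-facing face
    # of a grain faster than the cool face, so the grain recoils away from the Sun.
    import math

    k_B = 1.380649e-23      # Boltzmann constant, J/K
    m_H2 = 3.35e-27         # mass of a hydrogen molecule, kg (assumed gas)

    def mean_thermal_speed(T):
        """Mean speed of a Maxwell-Boltzmann gas at temperature T, in m/s."""
        return math.sqrt(8 * k_B * T / (math.pi * m_H2))

    T_hot, T_cold = 320.0, 300.0        # assumed grain face temperatures (K)
    v_hot = mean_thermal_speed(T_hot)
    v_cold = mean_thermal_speed(T_cold)

    # Recoil momentum per rebounding molecule scales with its departure speed,
    # so the warmer face gets the bigger kick and the grain drifts outward.
    print(f"hot-side mean speed : {v_hot:6.0f} m/s")
    print(f"cold-side mean speed: {v_cold:6.0f} m/s")
    print(f"relative imbalance  : {(v_hot - v_cold) / v_cold:.1%}")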


Powerful fluorescence tool lights the way to new insights into RNA of living cells

ScienceDaily (July 30, 2011) — The ability to tag proteins with a green fluorescent light to watch how they behave inside cells so revolutionized the understanding of protein biology that it earned the scientific teams who developed the technique Nobel Prizes in 2008. Now, researchers at Weill Cornell Medical College have developed a similar fluorescent tool that can track the mysterious workings of the various forms of cellular RNA.

In the July 29 issue of Science, the Weill Cornell investigators report how they developed an RNA mimic of green fluorescent protein (GFP) -- which they dubbed Spinach -- and describe how it will help unlock the secrets of the complex ways that RNA sustains human life as well as contributes to disease.

"These fluorescent RNAs offer us a tool that will be critical for understanding the diverse roles that RNA plays in human biology," says the study's senior author, Dr. Samie Jaffrey, an associate professor of pharmacology at Weill Cornell Medical College.

In recent years, the many roles played by RNA have become clearer. "Scientists used to think that RNA's function was limited to making proteins and that these proteins alone dictated everything that happened in cells," he says. "But now we are understanding that cells contain many different forms of RNA -- and some RNAs influence cell signaling and gene expression without ever being used for synthesizing proteins."

The list of known types of RNA has grown rapidly over the past several years -- from messenger RNA that codes for proteins, to diverse "non-coding" RNAs that affect translation and gene expression, and in some cases bind to proteins and regulate their function -- yet little is known about how these RNAs work, the researchers say.

The study's first author, Dr. Jeremy Paige, who conducted the research as a graduate student in pharmacology at Weill Cornell Medical College, adds that the new technology may provide insights into the development of common disorders. "More and more diseases are being linked to misregulation of RNA, but without being able to see the RNA, we can't understand how these processes lead to disease.

"We hope our RNA mimics of GFP open up the road to discovery," he says.

The RNAs developed by the Jaffrey group function like GFP, a natural protein expressed in jellyfish that exhibits a green fluorescence. GFP has enabled scientists to watch how proteins move in cells, providing powerful new insights into their roles in cell function. The DNA that encodes GFP is placed next to a gene that encodes a protein, resulting in the expression of a protein fused to GFP, which can be observed by specialized forms of microscopy.

To make an RNA that functions like GFP, the Weill Cornell investigators took advantage of the ability of RNA to fold into complex three-dimensional shapes. Their goal was to create two new entities: a synthetic RNA sequence that would adopt a specific shape, and a small molecule that would bind to the new RNA and begin to fluoresce. "These were two huge challenges," says Dr. Jaffrey. "One challenge was to come up with an RNA sequence that could 'switch on' a small molecule. The other big hurdle was to find a small molecule that would fluoresce only when we wanted it to and would not be toxic to cells."

They tried a number of molecules, most of which stuck to oily lipids in the cell membrane and started fluorescing, or they would kill the cell. Finally, the team realized that GFP itself had a molecule, a fluorophore, within it that switched its light on when it was bound in a certain way within the protein. They created chemical molecules based on the shape of this fluorophore and then developed an artificial RNA sequence, or "aptamer," that held the fluorophore in exactly the same way that GFP held its fluorophore. They named this RNA "Spinach" for its bright green fluorescence.

The researchers went even further. They also developed several other RNA-fluorophore pairs, in addition to Spinach, that each emit a different fluorescent color, just as GFP has been evolved to exhibit a palette of colors that helps researchers track many proteins at once. Whereas GFP derivatives are often named after fruits, the Weill Cornell researchers named their RNA mimics of GFP after vegetables -- Spinach, Carrot and Radish.

The Weill Cornell investigators have already begun to use Spinach to track non-coding RNAs in cells. "Our laboratory has been very interested in understanding why defects in RNA trafficking and translocation lead to developmental disorders in children, such as mental retardation," says Dr. Jaffrey. Using Spinach, they were able to watch as a non-coding RNA, fluorescing green, rapidly clustered in response to cellular stress. "We expect that Spinach will provide new insights into RNA trafficking in cells, and how this is affected in medical disorders," he says.

"There is still a lot of mystery surrounding RNA in biology. Fluorescent labeling and imaging has proved to be a powerful tool for scientists in the past, and we are hoping that Spinach too will be a tool that helps accelerate scientific discovery," says Dr. Paige.

Dr. Karen Wu of the Department of Pharmacology is a co-author on the study.

The work was supported by the McKnight Neuroscience Technology Innovation Award and the National Institute of Neurological Disorders and Stroke.


Wednesday, August 24, 2011

Increasing potency of HIV-battling proteins

ScienceDaily (July 28, 2011) — If one is good, two can sometimes be better. Researchers at the California Institute of Technology (Caltech) have certainly found this to be the case when it comes to a small HIV-fighting protein.

The protein, called cyanovirin-N (CV-N), is produced by a type of blue-green algae and has gained attention for its ability to ward off several diseases caused by viruses, including HIV and influenza. Now Caltech researchers have found that a relatively simple engineering technique can boost the protein's battling prowess.

"By linking two cyanovirins, we were able to make significantly more potent HIV-fighting molecules," says Jennifer Keeffe, a staff scientist at Caltech and first author of a new paper describing the study in the Proceedings of the National Academy of Sciences (PNAS). "One of our linked molecules was 18 times more effective at preventing infection than the naturally occurring, single protein."

The team's linked pairs, or dimers, were able to neutralize all 33 subtypes of HIV against which they were tested. The researchers also found the most successful dimer to be as potent as, or more potent than, seven well-studied anti-HIV antibodies that are known to be broadly neutralizing.

CV-N binds well to certain carbohydrates, such as those found in high quantities on the proteins of the envelope that surrounds HIV. Once attached, CV-N prevents the virus from infecting cells, although the mechanism by which it accomplishes this is not well understood.

What is known is that each CV-N protein has two binding sites where it can bind to a carbohydrate and that both sites are needed to neutralize HIV.

Once the Caltech researchers had linked two CV-Ns together, they wanted to know if the enhanced ability of their engineered dimers to ward off HIV was related to the availability of additional binding sites. So they engineered another version of the dimers -- this time with one or more of the binding sites knocked out -- and tested their ability to neutralize HIV.

It turns out that the dimers' infection-fighting potency increased with each additional binding site -- three sites are better than two, and four are better than three. The advantages seemed to stop at four sites, however; the researchers did not see additional improvements when they linked three or four CV-N molecules together to create molecules with six to eight binding sites.

Although CV-N has a naturally occurring dimeric form, it isn't stable at physiological temperatures, and thus mainly exists in single-copy form. To create dimers that would be stable under such conditions, the researchers covalently bound together two CV-N molecules in a head-to-tail fashion, using flexible polypeptide linkers of varying lengths.

Interestingly, by stabilizing the dimers and locking them into a particular configuration, it seems that the group created proteins with distances between binding sites that are very similar to those between the carbohydrate binding sites in a broadly neutralizing anti-HIV antibody.

"It is possible that we have created a dimer that has its carbohydrate binding sites optimally positioned to block infection," says Stephen Mayo, Bren Professor of Biology and Chemistry, chair of the Division of Biology, and corresponding author of the new paper.

Because it is active against multiple disease-causing viruses, including multiple strains of HIV, CV-N holds unique promise for development as a drug therapy. Other research groups have already started investigating its potential application in prophylactic gels and suppositories.

"Our hope is that those who are working to make prophylactic treatments using cyanovirin will see our results and will use CVN2L0 instead of naturally occurring cyanovirin," Keeffe says. "It has higher potency and may be more protective."

The work was funded by the National Security Science and Engineering Faculty Fellowship program, the Defense Advanced Research Projects Agency Protein Design Processes program, and the Bill and Melinda Gates Foundation through the Grand Challenges in Global Health Initiative.


Zinc lozenges may shorten common cold duration, Finnish research suggests

ScienceDaily (July 26, 2011) — Depending on the total dosage of zinc and the composition of lozenges, zinc lozenges may shorten the duration of common cold episodes by up to 40%, according to a study published in the Open Respiratory Medicine Journal.

For treating the common cold, zinc lozenges are dissolved slowly in the mouth. Interest in zinc lozenges started in the early 1980s with the serendipitous observation that the cold of a young girl with leukemia rapidly disappeared when she dissolved a therapeutic zinc tablet in her mouth instead of swallowing it. Since then, over a dozen studies have been carried out to find out whether zinc lozenges are effective, but the results of those studies have diverged.

Dr. Harri Hemila of the University of Helsinki, Finland, carried out a meta-analysis of all the placebo-controlled trials that have examined the effect of zinc lozenges on natural common cold infections. Of the 13 trial comparisons identified, five used a total daily zinc dose of less than 75 mg and uniformly those five comparisons found no effect of zinc. Three trials used zinc acetate in daily doses of over 75 mg, with the average indicating a 42% reduction in the duration of colds. Five trials used zinc salts other than acetate in daily doses of over 75 mg, with the average indicating a 20% decrease in the duration of colds.
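
To show the kind of arithmetic behind a dose-stratified pooled estimate like the 42% figure, here is a small sketch that averages percent reductions across trials, weighted by trial size. The trial values in it are invented for illustration; they are not Dr. Hemila's data.

    # Illustrative pooling only (hypothetical trial values, not the actual
    # meta-analysis data): weight each trial's percent reduction in cold
    # duration by its number of participants.
    high_dose_acetate_trials = [
        # (percent reduction in cold duration, number of participants)
        (45.0, 50),
        (40.0, 65),
        (41.0, 48),
    ]

    total_n = sum(n for _, n in high_dose_acetate_trials)
    pooled = sum(r * n for r, n in high_dose_acetate_trials) / total_n
    print(f"pooled reduction: {pooled:.0f}% across {total_n} participants")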

In several studies, zinc lozenges caused adverse effects, such as a bad taste, but there is no evidence that zinc lozenges might cause long-term harm. Furthermore, in the most recent trial of zinc acetate lozenges, there were no significant differences between the zinc and placebo groups in the occurrence of adverse effects, although the daily dose of zinc was 92 mg. Dr. Hemila concluded that "since a large proportion of trial participants have remained without adverse effects, zinc lozenges might be useful for them as a treatment option for the common cold."


Modeling plant metabolism to optimize oil production

ScienceDaily (July 26, 2011) — Scientists at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory have developed a computational model for analyzing the metabolic processes in rapeseed plants -- particularly those related to the production of oils in their seeds. Their goal is to find ways to optimize the production of plant oils that have widespread potential as renewable resources for fuel and industrial chemicals.

The model, described in two featured articles in the August 1, 2011, issue of the Plant Journal, may help to identify ways to maximize the conversion of carbon to biomass to improve the production of plant-derived biofuels.

"To make efficient use of all that plants have to offer in terms of alternative energy, replacing petrochemicals in industrial processes, and even nutrition, it's essential that we understand their metabolic processes and the factors that influence their composition," said Brookhaven biologist Jorg Schwender, who led the development of the model with postdoctoral research associate Jordan Hay.

In the case of plant oils, the scientists' attention is focused on seeds, where oils are formed and accumulated during development. "This oil represents the most energy-dense form of biologically stored sunlight, and its production is controlled, in part, by the metabolic processes within developing seeds," Schwender said.

One way to study these metabolic pathways is to track the uptake and allotment of a form of carbon known as carbon-13 as it is incorporated into plant oil precursors and the oils themselves. But this method has limits in the analysis of large-scale metabolic networks such as those involved in apportioning nutrients under variable physiological conditions.

"It's like trying to assess traffic flow on roads in the United States by measuring traffic flow only on the major highways," Schwender said.

To address these more complex situations, the Brookhaven team constructed a computational model of a large-scale metabolic network of developing rapeseed (Brassica napus) embryos, based on information mined from biochemical literature, databases, and prior experimental results that set limits on certain variables. The model includes 572 biochemical reactions that play a role in the seed's central metabolism and/or seed oil production, and incorporates information on how those reactions are grouped together and interact.
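
Models of this kind are typically solved as constrained optimization problems: fluxes through the reactions are chosen to maximize some output (here, oil synthesis) while every internal metabolite is produced as fast as it is consumed. The sketch below shows that idea on a made-up three-reaction network using an off-the-shelf linear-programming solver; it is an illustration of the approach, not the Brookhaven team's 572-reaction model or software.

    # Minimal flux-balance-style sketch on a made-up network (not the
    # rapeseed model): maximize an "oil synthesis" flux v3 subject to
    # steady-state mass balance and a capped uptake flux v1.
    from scipy.optimize import linprog

    # Reactions: v1 (uptake -> A), v2 (A -> B), v3 (B -> oil).
    A_eq = [
        [1, -1,  0],   # metabolite A: produced by v1, consumed by v2
        [0,  1, -1],   # metabolite B: produced by v2, consumed by v3
    ]
    b_eq = [0, 0]                               # steady state: no net accumulation
    bounds = [(0, 10), (0, None), (0, None)]    # uptake capped at 10 flux units

    # linprog minimizes, so negate the objective to maximize the oil flux v3.
    res = linprog(c=[0, 0, -1], A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    print("optimal fluxes (v1, v2, v3):", res.x)

    # A simulated "knock-out" just forces a reaction's bounds to zero and
    # re-solves, e.g. setting bounds[1] = (0, 0) disables the A -> B step.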

The scientists first tested the validity of the model by comparing it to experimental results from carbon-tracing studies for a relatively simple reaction network -- the big-picture view of the metabolic pathways analogous to the traffic on U.S. highways. At that big-picture level, results from the two methods were largely consistent, providing validation for both the computer model and the experimental technique, while identifying a few exceptions that merit further exploration.

The scientists then used the model to simulate more complicated metabolic processes under varying conditions -- for example, changes in oil production or the formation of oil precursors in response to changes in available nutrients (such as different sources of carbon and nitrogen), light conditions, and other variables.

"This large-scale model is a much more realistic network, like a map that represents almost every street," Schwender said, "with computational simulations to predict what's going on." Continuing the traffic analogy, he said, "We can now try to simulate the effect of 'road blocks' or where to add new roads to most effectively eliminate traffic congestion."

The model also allows the researchers to assess the potential effects of genetic modifications (for example, inactivating particular genes that play a role in plant metabolism) in a simulated environment. These simulated "knock-out" experiments gave detailed insights into the potential function of alternative metabolic pathways -- for example, those leading to the formation of precursors to plant oils, and those related to how plants respond to different sources of nitrogen.

"The model has helped us construct a fairly comprehensive overview of the many possible alternative routes involved in oil formation in rapeseed, and categorize particular reactions and pathways according to the efficiency by which the organism converts sugars into oils. So at this stage, we can enumerate, better than before, which genes and reactions are necessary for oil formation, and which make oil production most effective," Schwender said.

The researchers emphasize that experimentation will still be essential to further elucidating the factors that can improve plant oil production. "Any kind of model is a largely simplified representation of processes that occur in a living plant," Schwender said. "But it provides a way to rapidly assess the relative importance of multiple variables and further refine experimental studies. In fact, we see our model and experimental methods such as carbon tracing as complementary ways to improve our understanding of plants' metabolic pathways."

The scientists are already incorporating information from this study that will further refine the model to increase its predictive power, as well as ways to extend and adapt it for use in studying other plant systems.

This work was supported by the DOE Office of Science.


Tuesday, August 23, 2011

Brain autopsies of four former football players reveal not all get chronic traumatic encephalopathy

ScienceDaily (July 26, 2011) — Preliminary results from the first four brains donated to the Canadian Sports Concussion Project at the Krembil Neuroscience Centre, Toronto Western Hospital, reveal that two of the four former Canadian Football League (CFL) players suffered from a brain disease known as Chronic Traumatic Encephalopathy (CTE), while two did not show signs of CTE.

Bobby Kuntz, a former Toronto Argonaut and Hamilton Tiger-Cat, and Jay Roberts, a former Ottawa Rough Rider, both had a history of repeated concussions during their careers and showed the characteristic signs of CTE: an abnormal build-up of a protein called Tau in the brain, along with other degenerative changes.

CTE can result in memory impairment, emotional instability, erratic behavior, depression, and problems with impulse control, and it may eventually progress to full-blown dementia. Dr. Hazrati is careful, however, to emphasize that the precise relationship between concussions and neurodegeneration remains to be demonstrated by future research.

Peter Ribbins, a former Winnipeg Blue Bomber, passed away in December 2010 at age 63 of Parkinson's disease. Autopsy results show he did not have signs of CTE. Tony Proudfoot, an all-star defensive back for the Montreal Alouettes, died at age 61 in 2011 of Lou Gehrig's disease (a neurodegenerative condition also known as ALS). Although a connection between ALS and repeated head trauma is being researched, Proudfoot did not have signs of CTE. Both of these players were in the league at a time when it was common to spear tackle with the crown of the head. According to the Canadian Football League Alumni Association (CFLAA), Proudfoot experienced repeated head trauma as a hard-hitting defensive back throughout his 12 seasons in the league.

Kuntz passed away in February 2011 at age 79 after a long battle with Parkinson's Disease and diffuse Lewy body disease, a condition that overlaps with Parkinson's and Alzheimer's. Roberts, 67, who died in October 2010, suffered from dementia and lung cancer. The autopsies were performed by Dr. Lili-Naz Hazrati, a neuropathologist in the Laboratory Medicine Program at the University Health Network.

"While both of these men appeared to have pathological signs of CTE, they also suffered from other serious neurological and vascular related diseases," said Dr. Hazrati. "Right now we have more questions than answers about the relationship between repeated concussions and late brain degeneration. For example, we are still trying to understand why these two players acquired CTE and the other two did not."

Mary Kuntz, wife of the late Bobby Kuntz, donated his brain to the Canadian Sports Concussion Project at the Krembil Neuroscience Centre and believes the more players who donate their brains, the better the chances of helping future athletes.

"We've always had questions about Bob's health, because there were so many conflicting medical opinions," said Mary Kuntz. "We knew there must have been some effect from all of the concussions over the years, and this was an affirmation that concussions did have a part in his health problems.

"Young players should know the risks of concussions. When you are young, you can't believe what can happen to you when you are older, but we have lived though it. What is good about this study is that there will be more evidence and information for players."

"We were very happy to be involved in this and it has brought us a sense of closure."

The Canadian Sports Concussion Project at the Krembil Neuroscience Centre is organized by a team of concussion experts including Dr. Charles Tator and Dr. Richard Wennberg and scientists from several other Canadian institutions. The focus of the project is to further our understanding of how concussions affect the brain.

"There are still so many unanswered questions surrounding concussion and the long-term consequences of repeated head injuries," said Dr. Tator. "We are trying to determine why some athletes in contact sports develop CTE and others don't, as well as how many concussions lead to the onset of this degenerative brain disease. Also, we need to develop tests to detect this condition at an early stage and to discover treatments."

According to Jed Roberts, son of Jay Roberts, he and his sisters began noticing early signs of their father's memory decline when he started repeating stories while insisting he had never told them. "My dad had numerous concussions, although they were undocumented, and I think he knew something was wrong, which is why he wanted to help find answers that would hopefully protect future football players," said Jed, a former CFL player with the Edmonton Eskimos. "I think it is really important that we create awareness around this issue, so that players can live healthy, productive lives beyond the game."

Leo Ezerins, former CFL player and current Executive Director of the CFLAA, is a member of the Project Team. It has been through the joint efforts of the CFLAA and the research team that these four donations were made possible.


Scientists developing new therapy for HER2-positive breast cancer

ScienceDaily (July 26, 2011) — Patients with HER2-positive breast cancer may have an alternative therapy when they develop resistance to trastuzumab, also known as Herceptin, according to a laboratory finding published in Clinical Cancer Research, a journal of the American Association for Cancer Research.

Jacek Capala, Ph.D., D.Sc., an investigator at the National Cancer Institute, and colleagues designed, produced and tested HER2-Affitoxin, a novel protein that combines HER2-specific affibody molecules and a modified bacterial toxin, PE38.

"Unlike the current HER2-targeted therapeutics, such as Herceptin, this protein does not interfere with the HER2 signaling pathway but, instead, uses HER2 as a target to deliver a modified form of bacterial toxin specifically to the HER2-positive cancer cells. When cells absorb the toxin, it interferes with protein production and, thereby, kills them," said Capala.

At least, that is what happened in Capala's laboratory. After Affitoxin was injected into tumor-bearing mice, even relatively large, aggressive tumors stopped growing and most of them disappeared. The effect was strong enough that Capala believes it warrants a clinical trial.

"Herceptin has revolutionized the treatment of patients with HER2-positive breast cancer, but a significant number of tumors acquire resistance to the drug," said Capala. "Affitoxin could offer another therapeutic option for those patients whose tumors no longer respond to Herceptin."


Medicare Part D associated with reduction in nondrug medical spending

ScienceDaily (July 26, 2011) — Among elderly Medicare beneficiaries with limited prior drug coverage, implementation of Medicare Part D was associated with significant reductions in nondrug medical spending, such as for inpatient and skilled nursing facility care, according to a study in the July 27 issue of JAMA.

"Implementation of the Medicare prescription drug benefit (Part D) in January 2006 was followed by increased medication use, reduced out-of-pocket costs, and improved adherence to essential medications for elderly persons. The effects of Part D on nondrug medical spending for Medicare beneficiaries have not been clearly defined," according to background information in the article.

J. Michael McWilliams, M.D., Ph.D., of Harvard Medical School and Brigham and Women's Hospital, Boston, and colleagues compared nondrug medical spending for traditional Medicare beneficiaries before and after the implementation of Part D. Nationally representative survey data and linked Medicare claims from 2004-2007 were used to compare nondrug medical spending before and after the implementation of Part D by self-reported generosity of prescription drug coverage (extent to which medications were mostly or completely covered) before 2006. Participants included 6,001 elderly Medicare beneficiaries from the Health and Retirement Study, including 2,538 with generous and 3,463 with limited drug coverage before 2006.
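
The comparison described above is, in effect, a difference-in-differences design: the change in nondrug spending among beneficiaries with limited prior drug coverage is measured against the change among those with generous prior coverage. The sketch below works through that arithmetic with invented spending figures; the numbers are illustrative and are not the study's estimates.

    # Difference-in-differences arithmetic with hypothetical spending means
    # (illustrative only, not the JAMA study's figures).
    spending = {
        # group: (mean nondrug spending before Part D, after Part D), in dollars
        "limited_coverage":  (14000, 13400),
        "generous_coverage": (13000, 13100),
    }

    change = {g: after - before for g, (before, after) in spending.items()}
    did = change["limited_coverage"] - change["generous_coverage"]
    print(f"change, limited coverage : {change['limited_coverage']:+d}")
    print(f"change, generous coverage: {change['generous_coverage']:+d}")
    print(f"difference-in-differences: {did:+d} (negative = relative reduction)")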

Adjusted total nondrug medical spending before implementation of Part D was consistently but not significantly higher for participants with limited drug coverage than for participants with generous drug coverage (7.6 percent relative difference). The researchers found that nondrug medical spending after Part D implementation was 3.9 percent lower for participants with limited prior drug coverage than for those with generous prior drug coverage, which was a significant differential