intellectual vanities… about close to everything

Archive for January 2008

Brain Circuits: Method Applied To Learning And Memory Pathway


Researchers at the Picower Institute for Learning and Memory at MIT report in the Jan. 24 online edition of Science that they have created a way to see, for the first time, the effect of blocking and unblocking a single neural circuit in a living animal.

The green-stained section of this mouse hippocampus represents where the new DICE-K technique blocked the neural-signal transmission in one of the hippocampal circuits of the brain. (Credit: Image / Toshi Nakashiba, MIT)
This revolutionary method allowed Susumu Tonegawa, Picower Professor of Biology and Neuroscience, and colleagues to see how bypassing a major memory-forming circuit in the brain affected learning and memory in mice.
“Our data strongly suggest that the hippocampal neural pathway called the tri-synaptic pathway, or TSP, plays a crucial role in quickly forming memories when encountering new events and episodes in day-to-day life,” Tonegawa said. “Our results indicate that the decline of these abilities, such as that which accompanies neurodegenerative diseases and normal aging in humans, is likely to be due, at least in part, to the malfunctioning of this circuit.”

Combining several cutting-edge genetic engineering techniques, Tonegawa’s laboratory invented a method called doxycycline-inhibited circuit exocytosis-knockdown, or DICE-K, an acronym that also reflects Tonegawa’s admiration of ace Boston Red Sox pitcher Daisuke Matsuzaka. DICE-K allows researchers for the first time to induce and reverse a blockade of synaptic transmission in specific neural circuits in the hippocampus.

“The brain is the most complex machine ever assembled on this planet,” Tonegawa said. “Our cognitive abilities and behaviors are based on tens of thousands of molecules that compose several billion neurons, as well as how those neurons are connected.

“One effective way to understand how this immensely complex cellular network works in a major form of cognition like memory is to intervene in the specific neural circuit suspected to be involved,” he said.

Computing memories

The hippocampus, a seahorse-shaped brain region, plays a part in memory and spatial navigation. In Alzheimer’s disease, the hippocampus is one of the first regions to suffer damage; memory problems and disorientation are among the disease’s first symptoms.

The hippocampus is made up of several regions (CA1, CA3 and the dentate gyrus) that are wired up with distinct pathways.

The MIT study sought to determine how the interactions between neural pathways and the hippocampal regions affect learning and memory tasks.

Imagine that the three hippocampal regions are computers, and neural pathways are the conduits through which the computers get data from all over the brain. The computers perform different tasks, so the types of data processing will depend on which conduits the data travels through.

The hippocampus has two major, parallel information-carrying routes: the tri-synaptic pathway (TSP) and the shorter monosynaptic pathway (MSP). The TSP includes data processing in all three hippocampal regions, whereas the MSP bypasses most of them.

Using DICE-K, the researchers were surprised to find that mice in which the major TSP pathway was shut down could still learn to navigate a maze. The shorter MSP pathway was sufficient for the job.

However, the maze is a task that is slowly learned over many repeated trials. When the mice were tested with a different task in a new environment that required rapid learning and memory formation, the researchers found that the mice with TSP shut down could not perform the task. Thus, the TSP pathway is required for animals to quickly acquire memories in a new environment. “This kind of learning results in the most sophisticated form of memory that makes animals more intelligent and is known to decline with age,” Tonegawa said.

Science. 2008 Jan 24 [Epub ahead of print]
Transgenic Inhibition of Synaptic Transmission Reveals Role of CA3 Output in Hippocampal Learning.

The Picower Institute for Learning and Memory, Howard Hughes Medical Institute, RIKEN-MIT Neuroscience Research Center, Department of Biology and Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139, USA.

The hippocampus is an area of the brain involved in learning and memory. It contains parallel excitatory pathways referred to as the trisynaptic pathway (which carries information from the entorhinal cortex -> dentate gyrus -> CA3 -> CA1 -> entorhinal cortex) and the monosynaptic pathway (which connects entorhinal cortex -> CA1 -> entorhinal cortex). We developed a generally applicable tetanus toxin-based method for transgenic mice that permits inducible and reversible inhibition of synaptic transmission and applied it to the trisynaptic pathway while preserving transmission in the monosynaptic pathway. We found that synaptic output from CA3 in the trisynaptic pathway is dispensable and the short monosynaptic pathway is sufficient for incremental spatial learning. In contrast, the full trisynaptic pathway containing CA3 is required for rapid, one-trial contextual learning, for pattern completion-based memory recall, and for spatial tuning of CA1 cells.

Written by huehueteotl

January 31, 2008 at 5:04 pm

Genes Predict Risk Of Getting Hooked On Cigarettes


Cigarette smoking is the largest preventable source of death and disability in the USA, contributing to ~ 400,000 deaths annually. Despite widespread knowledge of the health dangers, ~ 1 in 8 American adults is a habitual heavy smoker.
For several decades, scientists have known that the risk for habitual heavy smoking (smoking a pack each day) is largely genetic. This conclusion comes from studies of identical and fraternal twins from Scandinavia, North America, Australia and (more recently) China. It has been estimated that ~ 2/3 of the risk of becoming a heavy habitual smoker is genetic. This does not imply that the genetic risk is due to a single gene: many genes are involved, each contributing a small amount of risk.
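The twin-study logic behind that ~ 2/3 figure can be sketched with Falconer's classic method, which estimates heritability from the difference between identical (MZ) and fraternal (DZ) twin correlations. The correlations below are hypothetical numbers chosen for illustration, not values from the studies cited here:

```python
# Falconer's method: heritability ~= 2 * (r_MZ - r_DZ), because MZ twins
# share ~100% of their genes and DZ twins ~50%.
def falconer_heritability(r_mz: float, r_dz: float) -> float:
    """Estimate heritability from MZ and DZ twin correlations."""
    return 2.0 * (r_mz - r_dz)

# Hypothetical twin correlations, chosen so the estimate lands near the
# ~2/3 value reported for heavy smoking.
h2 = falconer_heritability(r_mz=0.78, r_dz=0.45)
print(round(h2, 2))  # 0.66
```

The point of the formula is that any excess similarity of identical over fraternal twins is attributed to their extra shared genes; it says nothing about which or how many genes are involved.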

Finding the individual genes is a considerable challenge, but worth the effort, because it is hoped that the genes conveying risk for heavy smoking could be used to develop new medicines to help people quit. The development of new medicines to help people quit is particularly important, because the existing medications, including nicotine replacement (‘the patch’ or gum), bupropion and varenicline are effective in the short-term (several months) for a minority of heavy smokers.

This paper describes the results of a genetic study of 14,000 people, from the USA and Europe, whose smoking histories were known. DNA samples from ~ 6000 people were analyzed at ~ 500,000 known variations in the human genome to determine whether any of these variations predicted cigarettes per day during each individual's period of heaviest smoking. The results implicated variations in two genes, both producing brain proteins to which nicotine binds in generating its addicting effects. These two proteins (and their genes) are termed the alpha 3 and alpha 5 nicotinic receptor subunits, so-called because they form (with other nicotinic receptor subunits) binding sites for nicotine on certain brain cells which are known to be activated during the process of addiction.

A second population of ~ 8000 people (whose smoking histories were known) was analyzed in a similar manner, the result again suggesting that variations in these two genes increased risk for heavy smoking. Taken together, these two studies provide convincing proof that variations in the alpha 3 and alpha 5 nicotinic receptor subunit genes play a significant role in risk for nicotine addiction. A previously published paper, using similar methods, also supports this conclusion.

These results suggest two important research activities. First, and foremost, the alpha 3 and alpha 5 nicotinic receptor subunits will be made targets for new smoking-cessation medication development programs by pharmaceutical companies. Second, the implicated DNA variants can be used to determine whether they predict the ability to quit using one of the currently available smoking-cessation medicines. This “personalized medicine” approach might allow for more efficient and productive use of those medicines, until improved ones can be created.

Mol Psychiatry. 2008 Jan 29 [Epub ahead of print]
alpha-5/alpha-3 nicotinic receptor subunit alleles increase risk for heavy smoking.
[1] 1Department of Psychiatry, University of Pennsylvania School of Medicine, Philadelphia, PA, USA [2] 2Clinical Pharmacology and Discovery Medicine, GlaxoSmithKline, Upper Merion, PA, USA [3] 3Clinical Pharmacology and Discovery Medicine, GlaxoSmithKline, Verona, Italy.

Twin studies indicate that additive genetic effects explain most of the variance in nicotine dependence (ND), a construct emphasizing habitual heavy smoking despite adverse consequences, tolerance and withdrawal. To detect ND alleles, we assessed cigarettes per day (CPD) regularly smoked, in two European populations via whole genome association techniques. In these approximately 7500 persons, a common haplotype in the CHRNA3-CHRNA5 nicotinic receptor subunit gene cluster was associated with CPD (nominal P=6.9 x 10(-5)). In a third set of European populations (n= approximately 7500) which had been genotyped for approximately 6000 SNPs in approximately 2000 genes, an allele in the same haplotype was associated with CPD (nominal P=2.6 x 10(-6)). These results (in three independent populations of European origin, totaling approximately 15 000 individuals) suggest that a common haplotype in the CHRNA5/CHRNA3 gene cluster on chromosome 15 contains alleles which predispose to ND. Molecular Psychiatry advance online publication, 29 January 2008; doi:10.1038/

see also:

smoking is a thing in the head – nicotine rush too

The Brain And The Nicotine


Written by huehueteotl

January 31, 2008 at 4:33 pm

Different Patterns Of Brain Activity In Creative And Noncreative Problem Solvers


Why do some people solve problems more creatively than others? Are people who think creatively somehow different from those who tend to think in a more methodical fashion?

These questions are part of a long-standing debate, with some researchers arguing that what we call “creative thought” and “noncreative thought” are not basically different. If this is the case, then people who are thought of as creative do not really think in a fundamentally different way from those who are thought of as noncreative. On the other side of this debate, some researchers have argued that creative thought is fundamentally different from other forms of thought. If this is true, then those who tend to think creatively really are somehow different.

A new study led by John Kounios, professor of psychology at Drexel University, and Mark Jung-Beeman of Northwestern University addresses these questions by comparing the brain activity of creative and noncreative problem solvers. The study, published in the journal Neuropsychologia, reveals a distinct pattern of brain activity, even at rest, in people who tend to solve problems with a sudden creative insight — an “Aha! Moment” — compared to people who tend to solve problems more methodically.

At the beginning of the study, participants relaxed quietly for seven minutes while their electroencephalograms (EEGs) were recorded to show their brain activity. The participants were not given any task to perform and were told they could think about whatever they wanted. Later, they were asked to solve a series of anagrams — scrambled letters that can be rearranged to form words [MPXAELE = EXAMPLE]. These can be solved by deliberately and methodically trying out different letter combinations, or they can be solved with a sudden insight or “Aha!” in which the solution pops into awareness. After each successful solution, participants indicated in which way the solution had come to them.

The participants were then divided into two groups — those who reported solving the problems mostly by sudden insight, and those who reported solving the problems more methodically — and resting-state brain activity for these groups was compared. As predicted, the two groups displayed strikingly different patterns of brain activity during the resting period at the beginning of the experiment — before they knew they would have to solve problems or even knew what the study was about.

One difference was that the creative solvers exhibited greater activity in several regions of the right hemisphere. Previous research has suggested that the right hemisphere of the brain plays a special role in solving problems with creative insight, likely due to right-hemisphere involvement in the processing of loose or “remote” associations between the elements of a problem, which is understood to be an important component of creative thought. The current study shows that greater right-hemisphere activity occurs even during a “resting” state in those with a tendency to solve problems by creative insight. This finding suggests that even the spontaneous thought of creative individuals, such as in their daydreams, contains more remote associations.

Second, creative and methodical solvers exhibited different activity in areas of the brain that process visual information. The pattern of “alpha” and “beta” brainwaves in creative solvers was consistent with diffuse rather than focused visual attention. This may allow creative individuals to broadly sample the environment for experiences that can trigger remote associations to produce an Aha! Moment. For example, a glimpse of an advertisement on a billboard or a word spoken in an overheard conversation could spark an association that leads to a solution. In contrast, the more focused attention of methodical solvers reduces their distractibility, allowing them to effectively solve problems for which the solution strategy is already known, as would be the case for balancing a checkbook or baking a cake using a known recipe.
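The alpha/beta comparison the study relies on comes from spectral analysis of the resting EEG. The toy sketch below shows the basic computation on a synthetic signal; the sampling rate, signal, and band edges are illustrative assumptions, not the study's actual recording parameters or pipeline:

```python
import numpy as np

def band_power(signal: np.ndarray, fs: float, lo: float, hi: float) -> float:
    """Sum FFT power between lo and hi Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs < hi)
    return float(psd[mask].sum())

fs = 250.0  # assumed sampling rate in Hz
t = np.arange(0, 4.0, 1.0 / fs)
# Synthetic "EEG": a dominant 10 Hz alpha rhythm plus a weaker 20 Hz beta rhythm.
eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)

alpha = band_power(eeg, fs, 8, 12)    # alpha band, ~8-12 Hz
beta = band_power(eeg, fs, 13, 30)    # beta band, ~13-30 Hz
print(f"alpha power {alpha:.1f}, beta power {beta:.1f}")
```

Comparing such band powers between groups (here, the 10 Hz component dominates by construction) is the kind of resting-state spectral contrast the study reports.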

Thus, the new study shows that basic differences in brain activity between creative and methodical problem solvers exist and are evident even when these individuals are not working on a problem. According to Kounios, “Problem solving, whether creative or methodical, doesn’t begin from scratch when a person starts to work on a problem. His or her pre-existing brain-state biases a person to use a creative or a methodical strategy.”

In addition to contributing to current knowledge about the neural basis of creativity, this study suggests the possible development of new brain imaging techniques for assessing potential for creative thought, and for assessing the effectiveness of methods for training individuals to think creatively.

Neuropsychologia. 2008;46(1):281-91. Epub 2007 Jul 27.
The origins of insight in resting-state brain activity.
Department of Psychology, Drexel University, Philadelphia, PA, USA.
People can solve problems in more than one way. Two general strategies involve (A) methodical, conscious, search of problem-state transformations, and (B) sudden insight, with abrupt emergence of the solution into consciousness. This study elucidated the influence of initial resting brain-state on subjects’ subsequent strategy choices. High-density electroencephalograms (EEGs) were recorded from subjects at rest who were subsequently directed to solve a series of anagrams. Subjects were divided into two groups based on the proportion of anagram solutions derived with self-reported insight versus search. Reaction time and accuracy results were consistent with different cognitive problem-solving strategies used for solving anagrams with versus without insight. Spectral analyses yielded group differences in resting-state EEG supporting hypotheses concerning insight-related attentional diffusion and right-lateralized hemispheric asymmetry. These results reveal a relationship between resting-state brain activity and problem-solving strategy, and, more generally, a dependence of event-related neural computations on the preceding resting state.

see also:

Cognitive Insight And Its Neural Mechanism


Written by huehueteotl

January 31, 2008 at 11:59 am

Deep Brain Stimulation In Hypothalamus Triggers Memories


Deep brain stimulation (DBS) surgery, which is used to treat Parkinson’s disease and other movement disorders, is now being studied for its potential to treat a variety of conditions. A new study found that hypothalamic DBS performed in the treatment of a patient with morbid obesity unexpectedly evoked a sense of déjà vu and detailed personal memories.

Model of neurons firing in the brain. Researchers may have accidentally hit on a trigger spot for déjà vu in the hypothalamus. (Credit: iStockphoto/Kiyoshi Takahase)

Led by Andres Lozano, Professor of Neurosurgery and Canada Research Chair in Neuroscience and his team at the Toronto Western Hospital in Toronto, Ontario, researchers conducted an experimental study to treat a 50-year-old man with a lifelong history of obesity in whom a variety of treatment approaches had failed. While they were identifying potential appetite suppressant sites in the hypothalamus by stimulating electrode contacts that had been implanted there, the patient suddenly experienced a feeling of “déjà vu.”

He reported the perception of being in a park with friends from when he was around 20 years old, and as the intensity of the stimulation was increased, the details became more vivid. These sensations were reproduced when the stimulation was performed in a double-blinded manner. The contacts that most readily induced the memories were located in the hypothalamus and estimated to be close to the fornix, an arched bundle of fibers that carries signals within the limbic system, which is involved in memory and emotions. Stimulation was shown to drive activity in the temporal lobe and the hippocampus, important components of the brain’s memory circuit.

At the first office visit two months after the patient was released from the hospital, the researchers were able to induce and videotape the memory effects seen in the operating room by turning on the electrical stimulation. They also tested the patient’s memory during and without stimulation and found that after three weeks of continuous hypothalamic stimulation he showed significant improvements in two learning tests. In addition, the patient was much more likely to remember unrelated paired objects when stimulation was on than when it was off. They conclude that “just as DBS can influence motor and limbic circuits, it may be possible to apply electrical stimulation to modulate memory function and, in so doing, gain a better understanding of the neural substrates of memory.”

DBS of the hypothalamus has also been used to treat cluster headaches and aggressiveness in humans, and stimulating this area influences feeding behavior in animals.

Annals of Neurology Volume 63, Issue 1, Date: January 2008, Pages: 119-123 DOI 10.1002/ana.21295

Memory enhancement induced by hypothalamic/fornix deep brain stimulation

Clement Hamani, Mary Pat McAndrews, Melanie Cohn, Michael Oh, Dominik Zumsteg, Colin M. Shapiro, Richard A. Wennberg, Andres M. Lozano

Bilateral hypothalamic deep brain stimulation was performed to treat a patient with morbid obesity. We observed, quite unexpectedly, that stimulation evoked detailed autobiographical memories. Associative memory tasks conducted in a double-blinded manner demonstrated that stimulation increased recollection but not familiarity-based recognition, indicating a functional engagement of the hippocampus. Electroencephalographic source localization showed that hypothalamic deep brain stimulation drove activity in mesial temporal lobe structures. This shows that hypothalamic stimulation in this patient modulates limbic activity and improves certain memory functions.

Written by huehueteotl

January 31, 2008 at 11:32 am

Checking One Voice In A Noisy Room? New Findings On Selectively Interpreting Sounds


Scientists at Cold Spring Harbor Laboratory (CSHL) have reported new findings about how the mammalian brain interprets and fashions representations of sound that may help explain how we are able to focus on one particular sound among many in noisy environments such as offices or cocktail parties.

Neurons in the brain’s auditory cortex interpret incoming sound signals and send them to the rest of the nervous system, in the brain and spinal cord. Using rats, the CSHL team discovered that a very small minority of available auditory neurons react strongly when exposed to any specific sound.

“This finding challenges the standard model of sound representations in the auditory cortex, which predicts that neural representations of stimuli often engage a large fraction of neurons,” said Anthony Zador, Ph.D., CSHL professor and corresponding author of a new research paper.*

The researchers used a new technique called “in vivo cell-attached patch clamp recording” which measures the reaction of individual neurons. This recording technique samples neurons in a fair and unbiased way, unlike traditional approaches, which favored the largest and most active neurons. Using this technique, the team found that only 5% of neurons in the auditory cortex had a “high firing rate” when receiving a range of sounds of varying length, frequency, and volume. The experiment included white noise and natural animal sounds.

The team’s objective was to quantify the relative contributions of different sub-populations of neurons in response to the range of sounds. Most of what is known about the auditory cortex of the mammalian brain comes from studies of the anesthetized cortex. The results of the experiments reported today are important partly because they measure the response of neurons in rats that were not anesthetized. In animals that are awake, it’s possible to measure the response over an interval of time to one sound among many that are co-occurring.

This is the approach the Zador lab has taken to explain “selective attention,” or what Dr. Zador calls “the cocktail party problem.” Half of the neurons measured in the reported experiments showed no reaction at all to incoming stimuli. The researchers hypothesize that each neuron in the auditory cortex may have an “optimal stimulus” to which it is particularly sensitized.

“Your entire sensory apparatus is there to make successful representations of the outside world,” said Dr. Zador, who is director of the CSHL Swartz Center for Computational Neuroscience. “Sparse representations may make sensory stimuli easier to recognize and remember.” Recognizing the brain’s ability to distinguish “optimal stimuli” could help scientists find ways to improve how sounds are learned. Prior research has already yielded similar results when measuring sight, movement, and smell. This is the first evidence of a correlation between sparse representations and hearing.
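The "sparse representation" claim is easy to illustrate numerically. The sketch below samples firing rates from a lognormal distribution, as the paper describes the population response, and checks what fraction exceeds the 20 spikes/second "high firing rate" cutoff. The distribution parameters are assumptions chosen for illustration, not values fitted to the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Lognormal firing rates: median ~1 spike/s with a heavy right tail,
# so most neurons are nearly silent while a few fire intensely.
rates = rng.lognormal(mean=0.0, sigma=1.8, size=100_000)

# Fraction of neurons above the 20 spikes/s "high firing rate" threshold.
sparse_fraction = float(np.mean(rates > 20.0))
print(f"{sparse_fraction:.3%} of simulated neurons exceed 20 spikes/s")
```

With these assumed parameters the fraction lands in the few-percent range, consistent with the paper's report that fewer than 5% of neurons respond strongly to any given stimulus.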

“The goal of sensory processing is to take a signal, like a sound or a vision, from your environment and use it to drive behavior,” said Dr. Zador. “The brain needs to recognize and learn about these inputs in order to survive.”

PLoS Biol 6(1): e16 doi:10.1371/journal.pbio.0060016

Sparse Representation of Sounds in the Unanesthetized Auditory Cortex

Tomáš Hromádka1, Michael R. DeWeese2, Anthony M. Zador3*

1 Cold Spring Harbor Laboratory, Watson School of Biological Sciences, Cold Spring Harbor, New York, United States of America, 2 Department of Physics and Helen Wills Neuroscience Institute, University of California, Berkeley, California, United States of America, 3 Cold Spring Harbor Laboratory, Cold Spring Harbor, New York, United States of America

How do neuronal populations in the auditory cortex represent acoustic stimuli? Although sound-evoked neural responses in the anesthetized auditory cortex are mainly transient, recent experiments in the unanesthetized preparation have emphasized subpopulations with other response properties. To quantify the relative contributions of these different subpopulations in the awake preparation, we have estimated the representation of sounds across the neuronal population using a representative ensemble of stimuli. We used cell-attached recording with a glass electrode, a method for which single-unit isolation does not depend on neuronal activity, to quantify the fraction of neurons engaged by acoustic stimuli (tones, frequency modulated sweeps, white-noise bursts, and natural stimuli) in the primary auditory cortex of awake head-fixed rats. We find that the population response is sparse, with stimuli typically eliciting high firing rates (>20 spikes/second) in less than 5% of neurons at any instant. Some neurons had very low spontaneous firing rates (<0.01 spikes/second). At the other extreme, some neurons had driven rates in excess of 50 spikes/second. Interestingly, the overall population response was well described by a lognormal distribution, rather than the exponential distribution that is often reported. Our results represent, to our knowledge, the first quantitative evidence for sparse representations of sounds in the unanesthetized auditory cortex. Our results are compatible with a model in which most neurons are silent much of the time, and in which representations are composed of small dynamic subsets of highly active neurons.

Written by huehueteotl

January 30, 2008 at 6:42 pm

Can People Be Too Happy?


Could the pursuit of happiness go too far? Most self-help books on the subject offer tips on how to maximize one’s bliss, but a new study suggests that moderate happiness may be preferable to full-fledged elation.

The researchers, from the University of Virginia, the University of Illinois and Michigan State University, looked at data from the World Values Survey, a large-scale analysis of economic, social, political and religious influences around the world. They also analyzed the behaviors and attitudes of 193 undergraduate students at Illinois.

Their findings challenge the common assumption that all measures of well-being go up as happiness increases. While many indicators of success and well-being do correspond to higher levels of happiness, the researchers report, those at the uppermost end of the happiness scale (people who report that they are 10s on a 10-point life satisfaction score) are in some measures worse off than their slightly less elated counterparts.

To put the findings in perspective, it is important to note that happiness generally correlates with all kinds of positive measures, said Illinois psychology professor Ed Diener, an author of the study. In general, the happier you are the more successful you are in terms of money, employment and relationships.

“Happy people are more likely (than unhappy people) to get married, are more likely to stay married, are more likely to think their marriage is good,” Diener said. “They’re more likely to volunteer. They’re more likely to be rated highly by their supervisor and they’re more likely to make more money.”

Happy people are also, on average, healthier than unhappy people and they live longer, Diener said. And, he said, some research indicates that happiness is a cause of these sources of good fortune, not just a result.

“But there is a caveat, and that is to say: Do you then have to be happier and happier? How happy is happy enough?”

The research team began with the prediction that mildly happy people (those who classify themselves as eights and nines on the 10-point life satisfaction scale) may be more successful in some realms than those who consider themselves 10s. This prediction was based on the idea that profoundly happy people may be less inclined to alter their behavior or adjust to external changes even when such flexibility offers an advantage.

Their analysis of World Values Survey data affirmed that prediction.

“The highest levels of income, education and political participation were reported not by the most satisfied individuals (10 on the 10-point scale),” the authors wrote, “but by moderately satisfied individuals (8 or 9 on the 10-point scale).”

The 10s earned significantly less money than the eights and nines. Their educational achievements and political engagement were also significantly lower than their moderately happy and happy-but-not-blissful counterparts.
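The analysis behind this inverted-U finding amounts to grouping respondents by their self-rated satisfaction and comparing outcome means across groups. The sketch below does exactly that on invented numbers (not the survey data), constructed so the peak falls at 8 and 9 rather than 10, as the study reports:

```python
from statistics import mean

# Hypothetical (satisfaction score, income) pairs for a made-up sample.
people = [
    (6, 38_000), (7, 41_000), (8, 55_000), (8, 58_000),
    (9, 57_000), (9, 60_000), (10, 46_000), (10, 44_000),
]

# Group incomes by self-rated life satisfaction.
by_score: dict[int, list[int]] = {}
for score, income in people:
    by_score.setdefault(score, []).append(income)

mean_income = {s: mean(v) for s, v in sorted(by_score.items())}
print(mean_income)
```

In this toy sample, mean income rises with satisfaction up to 8-9 and then dips for the 10s, mirroring the pattern described above.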

In the more social realms, however, the 10s were the most successful, engaging more often in volunteer activities and maintaining more stable relationships.

The student study revealed a similar pattern in measures of academic and social success. In this analysis, students were categorized as unhappy, slightly happy, moderately happy, happy or very happy. Success in the categories related to academic achievement (grade-point average, class attendance) and conscientiousness increased as happiness increased, but dropped a bit for the individuals classified as very happy. In other words, the happy group outperformed even the very happy in grade-point average, attendance and conscientiousness.

Those classified as very happy scored significantly higher on things like gregariousness, close friendships, self-confidence, energy and time spent dating.

The data indicate that happiness may need to be moderated for success in some areas of life, such as income, conscientiousness and career, Diener said.

“The people in our study who are the most successful in terms of things like income are mildly happy most of the time,” he said.

In an upcoming book on the science of well-being, Diener notes that being elated all the time is not always good for one’s success — or even for one’s health. Reviews of studies linking health and emotions show that for people who have been diagnosed with serious illnesses, being extremely happy doesn’t always improve survival rates, Diener said. This may be because the elated don’t worry enough about issues that can have profound implications for their ability to survive their illness, he said.

“Happy people tend to be optimistic and this might lead them to take their symptoms too lightly, seek treatment too slowly, or follow their physician’s orders in a half-hearted way,” he writes.

All in all, Diener said, the evidence indicates that happiness is a worthy goal for those who lack it, but the endless pursuit of even more happiness for the already happy may be counterproductive.

“If you’re worried about success in life, don’t be a 1, 2, 3 or 4 (on the 10-point scale),” Diener said. “If you are unhappy or only slightly happy, you may need to seek help or read those self-help books or do something to make yourself happier. But if you’re a 7 or 8, maybe you’re happy enough!”

Perspectives on Psychological Science December 2007 (Volume 2, Issue 4)

The Optimum Level of Well-Being: Can People Be Too Happy?

Shigehiro Oishi, Ed Diener, and Richard E. Lucas

Psychologists, self-help gurus, and parents all work to make their clients, friends, and children happier. Recent research indicates that happiness is functional and generally leads to success. However, most people are already above neutral in happiness, which raises the question of whether higher levels of happiness facilitate more effective functioning than do lower levels. Our analyses of large survey data and longitudinal data show that people who experience the highest levels of happiness are the most successful in terms of close relationships and volunteer work, but that those who experience slightly lower levels of happiness are the most successful in terms of income, education, and political participation. Once people are moderately happy, the most effective level of happiness appears to depend on the specific outcomes used to define success, as well as the resources that are available.

Written by huehueteotl

January 29, 2008 at 12:02 pm

‘Satiety Center’ Of The Mouse Brain

with one comment

By pitting two forces — hunger and circadian rhythms — against each other, researchers at Rockefeller University have identified the region of the mouse brain that first registers changes in food availability. The research, conducted in mice, suggests that shifting the timing of a meal increases mental alertness even during times when the animals are usually at rest. These findings may have implications for targeting health concerns such as obesity and diabetes, as well as for optimizing performance on tasks that require sustained vigilance in humans.

To pit the need for food against the need for sleep, scientists led by Donald Pfaff, head of the Laboratory of Neurobiology and Behavior, gradually shifted the mice’s mealtime during the night, when mice are most active, to a four-hour window during the day, when they are usually at rest. Three days after the mealtime shift, the mice began to show classic signs of anticipatory behavior: wheel-running an hour or two before the timed meal. Compared to control animals, the shifted mice ran three times the distance on the wheel — increased activity signaling a heightened sense of alertness. This behavior also suggests that the light-dark cycle no longer regulated the mice’s behavioral arousal; food did.

The researchers used immunocytochemistry to test where in the brain these two arousal pathways converge. Out of the 16 brain regions tested, only one had become activated: the ventromedial hypothalamus, a group of neurons known as the satiety center of the brain. Animals, including humans, tend to stop eating when this region is activated, and damage to this group of neurons leads to obesity. The activity of the paraventricular nucleus, a region that produces many hormones, was decreased.

“Since we examined the brain as close as possible to the development of this anticipatory behavior,” says postdoc Ana Ribeiro, “the neuronal changes we observed are the ones most likely causing the changes in behavioral arousal. These regions are thus the best targets for modulating arousal.”

As for implications for humans, first author Ribeiro argues that understanding the neural mechanisms and molecules that mediate arousal is important for optimizing performance on tasks requiring sustained vigilance, such as those performed by air-traffic controllers, physicians and military personnel. “This research,” she says, “gives us a big clue as to what these mechanisms may be.”

PNAS | December 11, 2007 | vol. 104 | no. 50 | 20078-20083
Two forces for arousal: Pitting hunger versus circadian influences and identifying neurons responsible for changes in behavioral arousal
Ana C. Ribeiro*†, Evelyn Sawa*, Isabelle Carren-LeSauter*, Joseph LeSauter‡, Rae Silver‡§, and Donald W. Pfaff*†

*Laboratory of Neurobiology and Behavior, The Rockefeller University, New York, NY 10021; ‡Department of Psychology, Barnard College, New York, NY 10027; §Department of Psychology, Columbia University, New York, NY 10027; and Department of Anatomy and Cell Biology, College of Physicians and Surgeons, Columbia University, New York, NY 10032

Contributed by Donald W. Pfaff, October 24, 2007 (received for review July 6, 2007)

The mechanisms underlying CNS arousal in response to homeostatic pressures are not known. In this study, we pitted two forces for CNS arousal against each other (circadian influences vs. restricted food availability) and measured the neuronal activation that occurs in a behaviorally defined group of animals that exhibited increased arousal in anticipation of feeding restricted to their normal sleeping time. The number of c-FOS+ neurons was significantly increased only in the ventromedial nucleus of the hypothalamus (VMH) in these mice, compared with control animals whose feeding was restricted to their normal active and feeding time (P < 0.01). Because the activation of VMH neurons coincides with the earliest signs of behavioral arousal preceding a change in meal time, we infer that VMH activation is involved in the increased arousal in anticipation of food.

Written by huehueteotl

January 29, 2008 at 9:54 am