Fit Mind


Many people hit the gym or pound the pavement to improve cardiovascular health, build muscle, and of course, get a rockin’ bod, but working out has above-the-neck benefits, too. For the past decade or so, scientists have pondered how exercising can boost brain function. Regardless of age or fitness level (yup, this includes everyone from mall-walkers to marathoners), studies show that making time for exercise provides some serious mental benefits. Get inspired to exercise by reading up on these unexpected ways that working out can benefit mental health and relationships, and lead to a healthier, happier life overall.


1. Reduce Stress 

Rough day at the office? Take a walk or head to the gym for a quick workout. One of the most common mental benefits of exercise is stress relief. Working up a sweat can help manage physical and mental stress. Exercise also increases concentrations of norepinephrine, a chemical that can moderate the brain’s response to stress. So go ahead and get sweaty — working out can reduce stress and boost the body’s ability to deal with existing mental tension. Win-win!


2. Boost Happy Chemicals
Slogging through a few miles on the ‘mill can be tough, but it’s worth the effort! Exercise releases endorphins, which create feelings of happiness and euphoria. Studies have shown that exercise can even alleviate symptoms among the clinically depressed. For this reason, docs recommend that people suffering from depression or anxiety (or those who are just feeling blue) pencil in plenty of gym time. In some cases, exercise can be just as effective as antidepressant pills in treating depression. Don’t worry if you’re not exactly the gym rat type — getting a happy buzz from working out for just 30 minutes a few times a week can instantly boost overall mood.


3. Improve Self-Confidence
Hop on the treadmill to look (and more importantly, feel) like a million bucks. On a very basic level, physical fitness can boost self-esteem and improve positive self-image. Regardless of weight, size, gender or age, exercise can quickly elevate a person’s perception of his or her own attractiveness and, with it, self-worth. How’s that for feeling the (self) love?

4. Enjoy The Great Outdoors
For an extra boost of self-love, take that workout outside. Exercising in the great outdoors can increase self-esteem even more. Find an outdoor workout that fits your style, whether it’s rock-climbing, hiking, renting a canoe or just taking a jog in the park. Plus, all that vitamin D acquired from soaking up the sun (while wearing sunscreen, of course!) can lessen the likelihood of experiencing depressive symptoms. Why book a spa day when a little fresh air and sunshine (and exercise) can work wonders for self-confidence and happiness?


5. Prevent Cognitive Decline

It’s unpleasant, but it’s true — as we get older, our brains get a little… hazy. As aging and degenerative diseases like Alzheimer’s kill off brain cells, the noggin actually shrinks, losing many important brain functions in the process. While exercise and a healthy diet can’t “cure” Alzheimer’s, they can help shore up the brain against the cognitive decline that begins after age 45. Working out, especially between ages 25 and 45, boosts the chemicals in the brain that support the hippocampus, an important part of the brain for memory and learning, and prevent its degeneration.


6. Alleviate Anxiety
Quick Q&A: Which is better at relieving anxiety — a warm bubble bath or a 20-minute jog? You might be surprised at the answer. The warm and fuzzy chemicals that are released during and after exercise can help people with anxiety disorders calm down. Hopping on the track or treadmill for some moderate-to-high intensity aerobic exercise (intervals, anyone?) can reduce anxiety sensitivity. And we thought intervals were just a good way to burn calories!


7. Boost Brainpower
Those buff lab rats might be smarter than we think. Various studies on mice and men have shown that cardiovascular exercise can create new brain cells (aka neurogenesis) and improve overall brain performance. Ready to apply for a Nobel Prize? Studies suggest that a tough workout increases levels of BDNF (brain-derived neurotrophic factor), a protein believed to help with decision making, higher thinking and learning. Smarty (spandex) pants, indeed.

8. Sharpen Memory
Get ready to win big at Go Fish. Regular physical activity boosts memory and the ability to learn new things. Getting sweaty increases production of cells in the hippocampus, the part of the brain responsible for memory and learning. For this reason, research has linked children’s brain development with level of physical fitness (take that, recess haters!). But exercise-based brainpower isn’t just for kids. Even if it’s not as fun as a game of Red Rover, working out can boost memory among grown-ups, too. A study showed that running sprints improved vocabulary retention among healthy adults.


9. Help Control Addiction
The brain releases dopamine, the “reward chemical,” in response to any form of pleasure, be that exercise, sex, drugs, alcohol or food. Unfortunately, some people become addicted to dopamine and dependent on the substances that produce it, like drugs or alcohol (and more rarely, food and sex). On the bright side, exercise can help in addiction recovery. Short exercise sessions can also effectively distract drug or alcohol addicts, making them de-prioritize cravings (at least in the short term). Working out when on the wagon has other benefits, too. Alcohol abuse disrupts many body processes, including circadian rhythms. As a result, alcoholics find they can’t fall asleep (or stay asleep) without drinking. Exercise can help reboot the body clock, helping people hit the hay at the right time.

10. Increase Relaxation
Ever hit the hay right after a long run or weight session at the gym? For some, a moderate workout can be the equivalent of a sleeping pill, even for people with insomnia. Moving around five to six hours before bedtime raises the body’s core temperature. When the body temp drops back to normal a few hours later, it signals the body that it’s time to sleep.


11. Get More Done
Feeling uninspired in the cubicle? The solution might be just a short walk or jog away. Research shows that workers who take time for exercise on a regular basis are more productive and have more energy than their more sedentary peers. While busy schedules can make it tough to squeeze in a gym session in the middle of the day, some experts believe that midday is the ideal time for a workout due to the body’s circadian rhythms.

12. Tap Into Creativity
Most people end a tough workout with a hot shower, but maybe we should be breaking out the colored pencils instead. A heart-pumping gym session can boost creativity for up to two hours afterwards. Supercharge post-workout inspiration by exercising outdoors and interacting with nature (see benefit #4). Next time you need a burst of creative thinking, hit the trails for a long walk or run to refresh the body and the brain at the same time.


13. Inspire Others
Whether it’s a pick-up game of soccer, a group class at the gym, or just a run with a friend, exercise rarely happens in a bubble. And that’s good news for all of us. Studies show that most people perform better on aerobic tests when paired up with a workout buddy. Chalk it up to inspiration or good old-fashioned competition: nobody wants to let the other person down. In fact, being part of a team is so powerful that it can actually raise athletes’ tolerance for pain. Even fitness beginners can inspire each other to push harder during a sweat session, so find a workout buddy and get moving!


Working out can have positive effects far beyond the gym (and beach season). Gaining self-confidence, getting out of a funk, and even thinking smarter are some of the motivations to take time for exercise on a regular basis.

 

* Text by Sophia Breene, Huff Post (3/27/2013)

Weighty choices can be shifted by surprising factors.


Imagine you’re standing on a footbridge over some trolley tracks. Below you, an out-of-control trolley is bearing down on five unaware individuals standing on the track. Standing next to you is a large man. You realize that the only way to prevent the five people from being killed by the trolley is to push the man off the bridge, into the path of the trolley. His body would stop the trolley, saving the lives of the five people further down the track.

What would you do? Would you push the man to save the others? Or would you stand by and watch five people die, knowing that you could have saved them? Regardless of which option you choose, you no doubt believe that it will reflect your deeply held personal convictions, not trifles such as your mood.

Well, think again. In a paper published in the March edition of the journal Cognition, a group of German researchers has shown that people’s moods can strongly influence how they respond to this hypothetical scenario. Though this general observation is well known in the literature on moral judgments and decision making, the current paper helps to resolve a question that has long lurked in the background: How does this happen? What is the mechanism through which moods influence our moral decisions?

Early research showed a difference between personal moral decisions, such as the footbridge problem above, and impersonal moral decisions, such as whether to keep money found in a lost wallet. Researchers found that areas of the brain usually characterized as responsible for processing emotional information seemed to be more strongly engaged when people made personal, as opposed to impersonal, moral decisions. These scientists concluded that emotions were playing a strong role in personal moral judgments while the more calculating, reasoning part of the mind was taking a siesta.

Unfortunately, given the various shortcomings of previous investigations on this particular topic, there are a variety of other explanations for the observation that emotions, or the more general emotional states known as moods, affect how people may respond to the footbridge scenario.

For example, moods could influence the thought process itself. This is the “moral thought” hypothesis: just as something like attention may change our thought process by biasing how we perceive two choices, mood could also bias our thought process, resulting in different patterns of moral thinking. This is different from the “moral emotion” hypothesis, which suggests that emotions directly change how we feel about the moral choice. That is, a good mood could make us feel better (or worse) about potentially pushing, and therefore more (or less) likely to do it. Resolving this ambiguity with neuroimaging studies such as those described above is difficult because of fMRI’s low temporal resolution – a brain scan is like a photograph taken with the exposure set to a couple of seconds. This makes it difficult to faithfully capture events that happen quickly, such as whether moods change the experience of the decision or directly influence the thought process.

To test these competing ideas, participants were first put into a specific mood by listening to music and writing down an autobiographical memory. Those in the positive mood condition listened to Mozart’s Eine kleine Nachtmusik and wrote down a positive memory, while those in the negative mood condition listened to Barber’s Adagio for Strings, Opus 11 and wrote down a negative memory. The participants in the neutral mood condition listened to Kraftwerk’s Pocket Calculator and wrote about a neutral memory.

After this mood induction procedure, participants were then presented with the trolley scenario. Some participants were asked: “Do you think it is appropriate to be active and push the man?” while others were asked “Do you think it is appropriate to be passive and not push the man?”.

Participants in a positive mood were more inclined to agree to the question, regardless of which way it was asked. If asked if it was okay to push, they were more likely to push. If asked if it was okay not to push, they were more likely to not push. The opposite pattern was found for those in a negative mood.

If mood directly changed our experience of potentially pushing — the moral emotion hypothesis — then putting people in a positive mood should have made them more likely to push, no matter how the question was asked. The ‘moral thought’ hypothesis, on the other hand, accounts for these results quite nicely. Specifically, it is known from previous research that positive moods validate accessible thoughts, and negative moods invalidate accessible thoughts. So, for example, if I ask you if it’s okay to push, you will begin to consider the act of pushing, making this thought accessible. If you’re in a positive mood, that mood acts on this thought process by making you more likely to feel as though this is an acceptable behavior – it validates the thought of pushing. On the other hand, if I were to ask if it is okay to not push, the positive mood should validate the thought of not pushing, leading you to feel like not pushing is an acceptable behavior. Negative mood, which invalidates accessible thought, has a parallel effect, but in the opposite direction. Thus, this idea fits well with the observed pattern of results in this experiment.
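
To make the competing predictions concrete, here is a minimal Python sketch (our illustration only; the labels and logic paraphrase the two hypotheses above and are not taken from the paper) that tabulates which response each account predicts for every combination of induced mood and question framing:

```python
# Minimal sketch (illustrative, not the paper's code or data): qualitative
# predictions of the two hypotheses for each mood x question-framing cell.

def predicted_response(hypothesis, mood, framing):
    """Predict whether a participant tends to agree with the question as asked."""
    if hypothesis == "moral emotion":
        # Mood acts directly on the feeling about pushing: a positive mood
        # endorses pushing no matter how the question is framed.
        endorses_pushing = (mood == "positive")
        agrees = endorses_pushing if framing == "push" else not endorses_pushing
    elif hypothesis == "moral thought":
        # Mood validates whatever thought the question makes accessible:
        # a positive mood yields agreement with either framing.
        agrees = (mood == "positive")
    else:
        raise ValueError(f"unknown hypothesis: {hypothesis}")
    return "agree" if agrees else "disagree"

for hypothesis in ("moral emotion", "moral thought"):
    print(hypothesis)
    for mood in ("positive", "negative"):
        for framing in ("push", "not push"):
            response = predicted_response(hypothesis, mood, framing)
            print(f"  {mood} mood, asked about '{framing}': {response}")
```

Only the “moral thought” pattern, agreement with either framing under a positive mood and disagreement under a negative one, matches the results reported above.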

These findings raise some further questions, some of which psychologists have been attempting to answer for a long time. Emotions and logical thought are frequently portrayed as competing processes, with emotions depicted as getting in the way of effective decision-making. The results here are another demonstration that instead of competing, our emotions and our cognitions interact and work closely to determine our behaviors. In fact, some researchers have recently begun to suggest that the division between these two is rather tough to make, and there may not actually be any meaningful difference between thought and emotion. After all, if moods and emotions play a fundamental role in information processing, what differentiates them on a functional level from other basic kinds of cognitive processes, such as attention or memory? This paper obviously doesn’t resolve this issue, but it is certainly another piece of the puzzle.

It would also be exciting, as the authors say, to see how more specific emotions might influence our moral decision-making. Anger and sadness are both negative emotions, but differ in important ways. Could these subtle differences also lead to differences in how we make moral judgments?

This paper demonstrates that our professed moral principles can be shifted by subtle differences in mood and how a question is posed. Though there are plenty of implications for our daily lives, one that arguably screams the loudest concerns the yawning gap between how humans actually think and behave, and how the legal system pretends they think and behave. The relative rigidity of western law stands in stark contrast to the plasticity of human thought and behavior. If a simple difference in mood changes how likely one person is to throw another over a footbridge, then does this imply that the law should account for a wider variety of situational factors than it does presently? Regardless of how you feel, it is clear that this paper, and behavioral science in general, should contribute to the decision. Having a legal system based on reality is far preferable to one based on fantasy.

 

By Travis Riddle (March 2013)

ABOUT THE AUTHOR(S)

Travis Riddle is a doctoral student in the psychology department at Columbia University. His work in the Sparrow Lab focuses on the sense of control people have over their thoughts and actions, and the perceptual and self-regulatory consequences of this sense of control.

Eco Truly Park, at kilometer 63 of the Panamericana Norte, by Pasamayo, is the largest farm in South America built by a community of Hare Krishnas. Its peculiar trulys, cone-shaped constructions made with mud and organic material, invite the visitor to feel like part of nature and to get in touch with the universe.

“The trulys are a natural way of living, because in the world there are no square things. The planet is round and spins in a circle around the sun, the seasons turn… the atoms turn, even our blood circulates through our veins to get to our heart,” says the Krishna monk. “They are constructions where the energy moves circularly and tries to separate us from this square thinking, as happens in normal homes, or notebooks or the TVs we are used to watching.”

The Hare Krishnas try to remove themselves from everything that damages nature, and they practice the philosophy of universal love and respect for everything that exists on the planet. They take a bit from all of the world’s religions and seek to enter into contact with their spirits, to serve and to worship the gods or creators of the universe, living a lifestyle that doesn’t harm anything or anyone on earth.

They are in a constant state of pilgrimage, trying to complete the work of building more communities like Eco Truly around the world and following the teachings of their spiritual masters. Everything on the farm is done organically, from eating and growing plants to doing chores and even going to the bathroom.

It’s a perfect location to meditate, practice yoga, eat healthy and find out more about this religion or way of life. The compound offers housing and vegetarian food. If you want a different kind of weekend and, why not, a chance to try a different way of living, visit this park on the Chacra y Mar beach in the district of Aucallama, north of the capital.


 

For more information, visit this link: 

 http://volunteeringecotrulypark.blogspot.com/

 

In the 1993 movie “Groundhog Day,” Bill Murray plays Phil Connors, a reporter who, confronted with living the same day over and over again, matures from an arrogant, self-serving professional climber to someone capable of loving and appreciating others and his world. Murray convincingly portrays the transformation from someone whose self-importance is difficult to abide into a person imbued with kindness.  It seems that the Nietzschean test of eternal return, insofar as it is played out in Punxsutawney, yields not an overman but a man of decency.

But there is another story line at work in the film, one we can see if we examine Murray’s character not in the early arrogant stage, nor in the post-epiphany stage, where the calendar is once again set in motion, but in the film’s middle, where he is knowingly stuck in the repetition of days. In this part of the narrative, Murray’s character has come to terms with his situation. He alone knows what is going to happen, over and over again.  He has no expectations for anything different.  In this period, his period of reconciliation, he becomes a model citizen of Punxsutawney. He radiates warmth and kindness, but also a certain distance.

The early and final moments of “Groundhog Day” offer something that is missing during this period of peace:  passion. Granted, Phil Connors’s early ambitious passion for advancement is a far less attractive thing than the later passion of his love for Rita (played by Andie MacDowell).  But there is passion in both cases.  It seems that the eternal return of the same may bring peace and reconciliation, but at least in this case not intensity.

And here is where a lesson about love may lie.  One would not want to deny that Connors comes to love Rita during the period of the eternal Groundhog Day.  But his love lacks the passion, the abandon, of the love he feels when he is released into a real future with her. There is something different in those final moments of the film.  A future has opened for their relationship, and with it new avenues for the intensity of his feelings for her. Without a future for growth and development, romantic love can extend only so far.  Its distinction from, say, a friendship with benefits begins to become effaced.

There is, of course, in all romantic love the initial infatuation, which rarely lasts.  But if the love is to remain romantic, that infatuation must evolve into a longer-term intensity, even if a quiet one, that nourishes and is nourished by the common engagements and projects undertaken over time.

This might be taken to mean that a limitless future would allow for even more intensity to love than a limited one.  Romantic love among immortals would open itself to an intensity that eludes our mortal race.  After all, immortality opens an infinite future.  And this would seem to be to the benefit of love’s passion.  I think, however, that matters are quite the opposite, and that “Groundhog Day” gives us the clue as to why this is.  What the film displays, if we follow this interpretive thread past the film’s plot, is not merely the necessity of time itself for love’s intensity but the necessity of a specific kind of time:  time for development.  The eternal return of “Groundhog Day” offered plenty of time.  It promised an eternity of it.  But it was the wrong kind of time.  There was no time to develop a coexistence.  There was instead just more of the same.

The intensity we associate with romantic love requires a future that can allow its elaboration.  That intensity is of the moment, to be sure, but is also bound to the unfolding of a trajectory that it sees as its fate.  If we were stuck in the same moment, the same day, day after day, the love might still remain, but its animating passion would begin to diminish.

This is why romantic love requires death.

If our time were endless, then sooner or later the future would resemble an endless Groundhog Day in Punxsutawney.  It is not simply the fact of a future that ensures the intensity of romantic love; it is the future of meaningful coexistence.  It is the future of common projects and the passion that unfolds within them.  One might indeed remain in love with another for all eternity.  But that love would not burn as brightly if the years were to stammer on without number.

Why not, one might ask?  The future is open.  Unlike the future in “Groundhog Day,” it is not already decided.  We do not have our next days framed for us by the day just passed.  We can make something different of our relationships.  There is always more to do and more to create of ourselves with the ones with whom we are in love.

This is not true, however, and romantic love itself shows us why.  Love is between two particular people in their particularity.  We cannot love just anyone, even others with much the same qualities.  If we did, then when we met someone like the beloved but who possessed a little more of a quality to which we were drawn, we would, in the phrase philosophers of love use, “trade up.”  But we don’t trade up, or at least most of us don’t.  This is because we love that particular person in his or her specificity.  And what we create together, our common projects and shared emotions, are grounded in those specificities.  Romantic love is not capable of everything. It is capable only of what the unfolding of a future between two specific people can meaningfully allow.

Sooner or later the paths that can be opened by the specificities of a relationship come to an end.  Not every couple can, with a sense of common meaningfulness, take up skiing or karaoke, political discussion or gardening.  Eventually we must tread the same roads again, wearing them with our days.  This need not kill love, although it might.  But it cannot, over the course of eternity, sustain the intensity that makes romantic love, well, romantic.

One might object here that the intensity of love is a filling of the present, not a projection into the future.  It is now, in a moment that needs no other moments, that I feel the vitality of romantic love.  Why could this not continue, moment after moment?

To this, I can answer only that the human experience does not point this way.  This is why so many sages have asked us to distance ourselves from the world in order to be able to cherish it properly.  Phil Connors, in his reconciled moments, is something like a Buddhist.  But he is not a romantic.

Many readers will probably already have recognized that this lesson about love concerns not only its relationship with death, but also its relationship with life.  It doesn’t take eternity for many of our romantic love’s embers to begin to dim.  We lose the freshness of our shared projects and our passions, and something of our relationships gets lost along with them.  We still love our partner, but we think more about the old days, when love was new and the horizons of the future beckoned us.  In those cases, we needn’t look for Groundhog Day, for it will already have found us.

And how do we live with this?  How do we assimilate the contingency of romance, the waning of the intensity of our loves?  We can reconcile ourselves to our loves as they are, or we can aim to sacrifice our placid comfort for an uncertain future, with or without the one we love.  Just as there is no guarantee that love’s intensity must continue, there is no guarantee that it must diminish.  An old teacher of mine once said that “one has to risk somewhat for his soul.” Perhaps this is true of romantic love as well. The gift of our deaths saves us from the ineluctability of the dimming of our love; perhaps the gift of our lives might, here or there, save us from the dimming itself.

 

* Text By TODD MAY, NYT, FEBRUARY 26, 2012

 Todd May is Class of 1941 Memorial Professor of the Humanities at Clemson University.  His forthcoming book, “Friendship in an Age of Economics,” is based on an earlier column for The Stone.

Actor Charlie Sheen, known for his heavy cocaine use, has been stating in interviews that he freed himself of his drug habit. How likely is that?


When asked recently on The Today Show how he cured himself of his addiction, Two and a Half Men sitcom star Charlie Sheen replied, “I closed my eyes and made it so with the power of my mind.”
Until last month, he was the highest-paid actor on TV, despite his well-known bad-boy lifestyle and persistent problems with alcohol and cocaine. After producers canceled the rest of his season’s shows, Sheen went on an interview tear, making many bizarre statements, including that he is on a “winning” streak. His claim of quitting a serious drug habit on his own, however, is perhaps one of his least eccentric statements.

A prevailing view of substance abuse, supported by both the National Institute on Drug Abuse and Alcoholics Anonymous, is the disease model of addiction. The model attributes addiction largely to changes in brain structure and function. Because these changes make it much harder for the addict to control substance use, health experts recommend professional treatment and complete abstinence.

But some in the field point out that many if not most addicts successfully recover without professional help. A survey by Gene Heyman, a research psychologist at McLean Hospital in Massachusetts, found that between 60 and 80 percent of people who were addicted in their teens and 20s were substance-free by their 30s, and they avoided addiction in subsequent decades. Other studies on Vietnam War veterans suggest that the majority of soldiers who became addicted to narcotics overseas later stopped using them without therapy.

Scientific American spoke with Sally Satel, a resident scholar at the American Enterprise Institute for Public Policy Research and lecturer in psychiatry at the Yale University School of Medicine, about quitting drugs without professional treatment. Satel was formerly a staff psychiatrist at the Oasis Clinic in Washington, D.C., where she worked with substance abuse patients.

[An edited transcript of the interview follows.]

Is it possible to cure yourself of addiction without professional help? How often does that happen?

Of course it’s possible. Most people recover, and most do it on their own. That’s in no way saying that everyone should be expected to quit on their own, and it in no way denies that quitting is a hard thing to do. This is just an empirical fact. It is even possible that those who quit on their own could have quit earlier if they had sought professional help. None of this means that treatment isn’t important for many—in fact, it should probably be made more accessible—but it is simply a fact that most people cure themselves.

How do addicts stop on their own?

They have to be motivated. It takes the realization that their family, their future, their employment—all these—are becoming severely compromised. The subtext isn’t that they just “walk away” from the addiction. But I’ve had a number of patients in the clinic whose six-year-old says, “Why don’t you ever come to my ball games?” This can prompt a crisis of identity causing the addict to ask himself, “Is this the type of father I want to be?”

If not, there are lots of recovery strategies that users figure out themselves. For example, they change whom they associate with. They can make it harder to access drugs, perhaps by never carrying cash with them. People will put obstacles in front of themselves. True, some people decide they can’t do it on their own and decide to go into treatment—that’s taking matters into one’s own hands, too.


What do professional drug addiction programs offer that is difficult to replicate on one’s own?


If you’re already in treatment, you’ve made a big step. Even for court-ordered treatment, people often internalize the decision as their own. You get a lot of support. You get instruction in formal relapse prevention therapy. You might get methadone for withdrawal and medications for an underlying psychiatric problem.

Most experts regard drug addiction as a brain disease. Do you agree?
I’m critical of the standard view promoted by the National Institute on Drug Abuse that addiction is a brain disease. Naturally, every behavior is mediated by the brain, but the language “brain disease” carries the connotation that the afflicted person is helpless before his own brain chemistry. That is too fatalistic.

It also overlooks the enormously important truth that addicts use drugs to help them cope in some manner, and that, as destructive as they are, drugs also serve a purpose. This recognition is very important for designing personalized therapies.


Don’t most studies show that addicts do better with professional help?


People who come to treatment tend to have concurrent psychiatric illness, and they also tend to be less responsive to treatment. Most research is done on people in a treatment program, so by definition you’ve already got a skewed population. This is called the “clinical illusion,” and it applies to all medical conditions. It refers to a tendency to think that the patients you see in a clinical setting fully represent all people with that condition. It’s not true. You’re not seeing the full universe of people.


Based on his public interviews, does it seem likely that Charlie Sheen cured himself?


I doubt it. Of course, I haven’t examined him, but based on what one sees, one would be concerned about ongoing drug use and underlying mental illness.


Is there brain damage from drug use? Is it possible to recover from such damage?


The only drugs that are neurotoxic are alcohol, methamphetamine, probably MDMA [ecstasy], and some inhalants.* Cocaine can lead to micro strokes. That’s brain damage. Yes, addiction changes the brain but this does not doom people to use drugs forever. The most permanent change is memories. Some people have stronger memories and they are more cue-reactive [more reactive to stimulus that triggers the reward pathway]. Nonaddicts won’t show that level of cue-reactivity.

For some people the addiction and withdrawal will be more intense through genetically mediated problems. Those people have a harder time stopping.


What else might account for Charlie Sheen’s strange behavior in those interviews?


One would want to explore the possibility of underlying psychiatric problems. The grandiosity, the loose associations, the jumbled flow suggest a thought disorder. Heavy, heavy drug use could cause that. Stimulant use can cause temporary thought disorder or intensify an underlying thought disorder or hypomanic state. To try to make a good diagnosis, whatever ongoing drug use there is would have to stop. After the withdrawal phase is resolved clinicians would then need to see if an underlying thought or mood disorder persisted. That would aid in parsing how much of a confusing clinical picture is due to drug use and how much is due to a primary mental disorder.


By Nina Bai, March 4, 2011

It’s hardly a secret that taking cocaine can change the way you feel and the way you behave. Now, a study published in the Jan. 8 issue of Science shows how it also alters the way the genes in your brain operate. Understanding this process could eventually lead to new treatments for the 1.4 million Americans with cocaine problems, and millions more around the world.

The study, which was conducted on mice, is part of a hot new area of research called epigenetics, which explores how experiences and environmental exposures affect genes. “This is a major step in understanding the development of cocaine addiction and a first step toward generating ideas for how we might use epigenetic regulation to modulate the development of addiction,” says Peter Kalivas, professor of neuroscience at the Medical University of South Carolina, who was not associated with the study. 

Though we think about our genes mostly in terms of the traits we pass on to our children, they are actually very active in our lives every day, regulating how various cells in our bodies behave. In the brain this can be especially powerful. Any significant experience triggers changes in brain genes that produce proteins — those necessary to help memories form, for example. But, says the study’s lead author, Ian Maze, a doctoral student at Mount Sinai School of Medicine, “when you give an animal a single dose of cocaine, you start to have genes aberrantly turn on and off in a strange pattern that we are still trying to figure out.”
Maze’s research focused on a particular protein called G9a that is associated with cocaine-related changes in the nucleus accumbens, a brain region essential for the experience of desire, pleasure and drive. The role of the protein appears to be to shut down genes that shouldn’t be on. One-time use of cocaine increases levels of G9a. But repeated use works the other way, suppressing the protein and reducing its overall control of gene activation. Without enough G9a, those overactive genes cause brain cells to generate more dendritic spines, which are the parts of cells that make connections to other cells.
Increases in the number of these spines can reflect learning. But in the case of addiction, that may involve learning to connect a place or a person with the desire for more drugs. Maze showed that even after a week of abstinence, mice given a new dose of cocaine still had elevated levels of gene activation in the nucleus accumbens, meaning G9a levels were still low. It is not known how long these changes can last. Maze also showed that when he intervened and raised G9a levels, the mice were less attracted to cocaine.
It’s a big leap from a mouse study to a human study, of course — and an even bigger leap to consider developing a G9a-based treatment for addiction. The protein regulates so many genes that such a drug would almost certainly have unwanted and potentially deadly side effects. But a better understanding of the G9a pathways could lead to the development of safer, more specific drugs. And studying the genes that control G9a itself could also help screen people at risk for cocaine addiction: those with naturally lower levels of the protein would be the ones to watch. Still, there’s a lot to be learned even from further mouse studies — particularly if the work involves younger mice, unlike the adults used in Maze’s research.


“We know that the greatest vulnerability [to addiction] occurs when adolescents are exposed,” says Dr. Nora Volkow, director of the National Institute on Drug Abuse, which funded the study. “Would you see the same results in adolescent [mice]? And what happens during fetal exposure?”
New treatments are definitely needed for cocaine addiction: there are helpful medications for addiction to heroin and similar drugs, but so far, none are particularly useful against stimulants like cocaine and methamphetamine. And with federal reports now showing that more than two-thirds of all cocaine in the country is cut with a veterinary deworming drug called levamisole, which can cause potentially fatal immune-system problems, the risks from cocaine are greater — and the search for new answers more urgent than ever.

 

* By Maia Szalavitz, Friday, Jan. 08, 2010


Our political system sometimes produces such skewed results that it’s difficult not to blame bloviating politicians. But maybe the deeper problem lies in our brains.


Evidence is accumulating that the human brain systematically misjudges certain kinds of risks. In effect, evolution has programmed us to be alert for snakes and enemies with clubs, but we aren’t well prepared to respond to dangers that require forethought.

If you come across a garter snake, nearly all of your brain will light up with activity as you process the “threat.” Yet if somebody tells you that carbon emissions will eventually destroy Earth as we know it, only the small part of the brain that focuses on the future — a portion of the prefrontal cortex — will glimmer.

“We humans do strange things, perhaps because vestiges of our ancient brain still guide us in the modern world,” notes Paul Slovic, a psychology professor at the University of Oregon and author of a book on how our minds assess risks.
Consider America’s political response to these two recent challenges:

1. President Obama proposes moving some inmates from Guantánamo Bay, Cuba, to supermax prisons from which no one has ever escaped. This is the “enemy with club” threat that we have evolved to be alert to, so Democrats and Republicans alike erupt in outrage and kill the plan.

2. The climate warms, ice sheets melt and seas rise. The House scrounges a narrow majority to pass a feeble cap-and-trade system, but Senate passage is uncertain. The issue is complex, full of trade-offs and more cerebral than visceral — and so it doesn’t activate our warning systems.

“What’s important is the threats that were dominant in our evolutionary history,” notes Daniel Gilbert, a professor of psychology at Harvard University. In contrast, he says, the kinds of dangers that are most serious today — such as climate change — sneak in under the brain’s radar.

Professor Gilbert argues that the threats that get our attention tend to have four features. First, they are personalized and intentional. The human brain is highly evolved for social behavior (“that’s why we see faces in clouds, not clouds in faces,” says Mr. Gilbert), and, like gazelles, we are instinctively and obsessively on the lookout for predators and enemies.
Second, we respond to threats that we deem disgusting or immoral — characteristics more associated with sex, betrayal or spoiled food than with atmospheric chemistry.

“That’s why people are incensed about flag burning, or about what kind of sex people have in private, even though that doesn’t really affect the rest of us,” Professor Gilbert said. “Yet where we have a real threat to our well-being, like global warming, it doesn’t ring alarm bells.”

Third, threats get our attention when they are imminent, while our brain circuitry is often cavalier about the future. That’s why we are so bad at saving for retirement. Economists tear their hair out at a puzzlingly irrational behavior called hyperbolic discounting: people’s preference for money now rather than much larger payments later.

For example, in studies, most Americans prefer $50 now to $100 in six months, even though that represents a 100 percent return.
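
As a rough worked example (our own sketch, not from the column), the common one-parameter hyperbolic model values a reward of amount A delayed by D years at V = A / (1 + kD), where k measures impatience. A hypothetical chooser with k greater than 2 prefers $50 now to $100 in six months:

```python
# Illustrative sketch of hyperbolic discounting (hypothetical parameter k,
# not data from the column): V = A / (1 + k * D), with delay D in years.

def hyperbolic_value(amount, delay_years, k):
    """Subjective present value of `amount` received after `delay_years`."""
    return amount / (1 + k * delay_years)

k = 3.0  # hypothetical impatience parameter; higher k = steeper discounting
now = hyperbolic_value(50, 0.0, k)     # $50 today keeps its face value: 50.00
later = hyperbolic_value(100, 0.5, k)  # $100 in six months shrinks to 40.00
print(f"$50 now feels like ${now:.2f}; $100 in six months feels like ${later:.2f}")
# Any k > 2 makes the immediate $50 win here, even though waiting six months
# would double the payoff (the 100 percent return mentioned above).
```
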
Fourth, we’re far more sensitive to changes that are instantaneous than those that are gradual. We yawn at a slow melting of the glaciers, while if they shrank overnight we might take to the streets.


In short, we’re brilliantly programmed to act on the risks that confronted us in the Pleistocene Age. We’re less adept with 21st-century challenges.
At the University of Virginia, Professor Jonathan Haidt shows his Psychology 101 students how evolution has prepared us to fear some things: He asks how many students would be afraid to stand within 10 feet of a friend carrying a pet boa constrictor. Many hands go up, although almost none of the students have been bitten by a snake.
“The objects of our phobias, and the things that are actually dangerous to us, are almost unrelated in the modern world, but they were related in our ancient environment,” Mr. Haidt said. “We have no ‘preparedness’ to fear a gradual rise in the Earth’s temperature.”
This short-circuitry in our brains explains many of our policy priorities. We Americans spend nearly $700 billion a year on the military and less than $3 billion on the F.D.A., even though food poisoning kills more Americans than foreign armies and terrorists. We’re just lucky we don’t have a cabinet-level Department of Snake Extermination.
Still, all is not lost, particularly if we understand and acknowledge our neurological shortcomings — and try to compensate with rational analysis. When we work at it, we are indeed capable of foresight: If we can floss today to prevent tooth decay in later years, then perhaps we can also drive less to save the planet.

* Text By NICHOLAS D. KRISTOF , NYT, July 2, 2009

How can you stay sharp into old age? It is not just a matter of winning the genetic lottery. What you do can make a difference.
As everybody knows, if you do not work out, your muscles get flaccid. What most people don’t realize, however, is that your brain also stays in better shape when you exercise.

And exercise doesn’t just mean challenging your noggin by, for example, learning a new language, doing difficult crosswords or taking on other intellectually stimulating tasks. As researchers are finding, physical exercise is critical to vigorous mental health, too.

Surprised? Although the idea of exercising cognitive machinery by performing mentally demanding activities—popularly termed the “use it or lose it” hypothesis—is better known, a review of dozens of studies shows that maintaining a mental edge requires more than that.

Other things you do—including participating in activities that make you think, getting regular exercise, staying socially engaged and even having a positive attitude—have a meaningful influence on how effective your cognitive functioning will be in old age.

Further, the older brain is more plastic than is commonly known. At one time, the accepted stereotype was that “old dogs can’t learn new tricks.” Science has proved that this dictum must be discarded.

Although older adults generally learn new pursuits more slowly than younger people do and cannot reach the peaks of expertise in a given field that they might have achieved if they had started in their youth, they nonetheless can improve their cognitive performance through effort—forestalling some of the declines in cognition that come with advancing age.

As John Adams, one of the founding fathers and the second U.S. president, put it: “Old minds are like old horses; you must exercise them if you wish to keep them in working order.”
The news comes at a propitious time.

The proportion of older adults in the U.S. and in other industrial nations continues to grow: in 1900, 4.1 percent of U.S. citizens were older than 65, but by 2000 that amount had jumped to 12.6 percent; by 2030, 20 percent of us will be in that category. From a societal point of view, prolonging independent functioning is both a desirable goal in itself and a way of deferring costs of long-term care.

For individuals, maintaining optimal cognitive functioning is worthwhile simply because it promises to enhance quality of life through the years.

Mental Training
How to keep minds keen over an entire life span is a question philosophers have mulled since the earliest writings on record. As Roman orator Cicero put it: “It is exercise alone that supports the spirits, and keeps the mind in vigor.”

Modern research in this field began in the 1970s and 1980s, with studies that demonstrated that healthy older adults can improve performance to a greater extent than had been previously assumed. The earlier research did not fully address certain questions, such as how long adults could retain the new skills they had acquired through training, whether those specifically developed skills would also positively influence other areas of cognition needed in everyday life, and whether the studies done with small numbers of subjects would be broadly applicable to most members of society.

The latest experiments confirm that cognitive training does show substantial benefits for older adults and that these effects can be relatively long-lasting. Around the turn of this past century the federal government’s National Institute on Aging funded a consortium of researchers to conduct a large-scale training study in a sample of older Americans.

In 2002 psychologist Karlene Ball of the University of Alabama at Birmingham and her colleagues published initial results on more than 2,500 individuals older than 65 who had received about 10 sessions of cognitive training. Participants were randomly assigned either to a cognitive-process training group to learn how to excel in one of three areas—memory, reasoning or visual search—or to a control group of subjects who did not receive training. At a follow-up two years later, the team randomly selected a set of the initial participants to get booster training prior to evaluation.

The results showed strong training-effect sizes in each group as compared with controls, along with a pattern of specificity in performance improvements. For example, individuals trained in visual search evinced strong gains in visual search performance but little improvement, relative to controls, on the memory and reasoning tests, a typical finding in training research. Data from retests five years later on the sample found that measurable training benefits were still present after the longer interval.

More impressive, however, are recent training studies that focus on what psychologists call executive function—how a person plans a strategic approach to a task, controls what is attended to, and how he or she manages the mind in the process. Unlike training that focuses on very specific skills, such as memorization strategies, training that aims to help people to control how they think appears to work on broader skills that are helpful in many situations that require thinking.

For instance, psychologist Chandramallika Basak and her colleagues at the University of Illinois recently showed that training in a real-time strategy video game that demands planning and executive control not only improved game performance but enhanced performance on other tasks measuring aspects of executive control.

Other results suggest that psychologists are learning how to train higher-level skills that may have a broader effect on cognitive function.

You don’t have to have specialized training, however, to achieve cognitive gains or ward off cognitive decline. Everyday activities such as reading can help. We reviewed evidence on activity-related cognitive enrichment in more than a dozen studies.

In 2003 neuropsychologist Robert S. Wilson and his colleagues at Rush University Medical Center in Chicago recruited more than 4,000 elderly people from a geographically defined community and rated their frequency of participation in seven cognitive activities (for instance, reading magazines). At three-year intervals for a mean of nearly six years, participants completed an in-home interview that included brief tests of cognitive function. More frequent cognitive activity at the outset was associated with reduced rate of cognitive decline over time.


Getting Physical
Over the past decade several studies have underscored the link between physical activity and cognition. For instance, in a study published in 2001 neuropsychiatrist Kristine Yaffe of the University of California, San Francisco, and her colleagues recruited 5,925 women older than 65 at four different medical centers across the U.S.

The participants were all free of any physical disability that would limit their ability to walk or pursue other physical activities. The volunteers were also screened to ensure that they did not have a cognitive impairment. The researchers then assessed their physical activity by asking the women how many city blocks they walked and how many flights of stairs they climbed daily and gave them a questionnaire to fill out about their levels of participation in 33 different physical activities. After six to eight years, the researchers assessed the women’s level of cognitive function.

The most active women had a 30 percent lower risk of cognitive decline. Interestingly, walking distance was related to cognition, but walking speed was not. It seems that even moderate levels of physical activity can serve to limit declines in cognition in older adults.


Moderate movement is good, but toning your circulatory system with aerobic exercise may be the real key to brain fitness. In a 1995 study of 1,192 healthy 70- to 79-year-olds, cognitive neuroscientist Marilyn Albert of Johns Hopkins University and her colleagues measured cognition with a battery of tasks that took approximately 30 minutes to complete and included tests of language, verbal memory, nonverbal memory, conceptualization and visuospatial ability. They found that the best predictors of cognitive change over a two-year period included strenuous activity and peak pulmonary expiratory flow rate. In an investigation published in 2004 epidemiologist Jennifer Weuve of Harvard University and her colleagues also examined the relation between physical activity and cognitive change over a two-year period in 16,466 nurses older than 70.

Participants logged how much time they spent per week in a variety of physical activities (running, jogging, walking, hiking, racket sports, swimming, bicycling, aerobic dance) over the past year and provided self-reports of walking pace in minutes per mile. Weuve’s group observed a significant relation between energy expended in physical activities and cognition, across a large set of cognitive measures.

The research that we have described thus far has examined mental performance over relatively short periods—just several years. A few studies have begun to look at what happens over longer timescales. In 2003 psychiatrist Marcus Richards of University College London and his colleagues examined in a cohort of 1,919 men and women the influence of self-reported physical exercise and leisure-time activities at age 36 on memory at age 43 and on memory change from ages 43 to 53.

Analyses indicated that engagement in physical exercise and other leisure-time activities at 36 was associated with higher memory scores at 43. Physical activity at 36 was also associated with a slower rate of memory decline from 43 to 53 years of age after adjusting for spare-time activity and other variables. The data also suggested little memory protection for those who stopped exercising after 36 but protection for those individuals who began to exercise after this time.


In 2005 then graduate student Suvi Rovio of the Karolinska Institute in Sweden and her colleagues examined the relation between physical activity at middle age and risk of dementia an average of 21 years later, when the cohort was between 65 and 79 years of age. Subjects indicated how often they participated in leisure-time physical activities that lasted at least 20 to 30 minutes and caused breathlessness and sweating. Conducting such activity at midlife at least twice a week was associated with a reduced risk of dementia in later life. Indeed, participants in the more active group had 52 percent lower odds of having dementia than the more sedentary group did.

Mind-Body Connection
It makes sense that training or participation in mentally stimulating activities would help cognition, but it is perhaps less immediately obvious why physical activity would have such an effect. Consider the increasingly well-documented link between physical activity and disease.

A plethora of studies have examined the health benefits of exercise and a nonsedentary lifestyle for prevention of disease. For example, we now know that physical activity reduces the risk of cardiovascular-related death, type 2 diabetes, colon and breast cancer, and osteoporosis. On the other hand, cardiovascular disease, diabetes and cancer have been associated with compromised cognition. Therefore, you might expect that increased physical activity and exercise would maintain cognition by reducing risk of diseases associated with cognitive decline.

In a study published in 2006 psychologist Stanley J. Colcombe of the University of Illinois and his colleagues examined the influence of fitness training on potential changes in brain structure. The six-month trial included 59 healthy but sedentary community-dwelling volunteers, age 60 to 79.

Brain scans after fitness training showed that even relatively short exercise interventions can begin to restore some of the losses in brain volume associated with normal aging.

Supporting these findings, a large body of nonhuman animal research has demonstrated a number of changes in brain structure and function after animals are exposed to enriched, or complex, environments. Enriched environments usually include running wheels, a multitude of toys and objects to climb that are changed frequently, and animal companions.

Exposure to such environments yields several physiological benefits. First, it increases the formation of new dendrite branches and synapses—the areas of neural cells that receive and send communication signals. It also increases the number of glial cells, which support the health of neurons, and expands the brain’s oxygen-supplying capillary network.

Enriched environments foster the development of new neurons and create a cascade of molecular and neurochemical changes, such as an increase in neurotrophins—molecules that protect and grow the brain.

Puzzles and push-ups are helpful for some—but other factors also boost mental fitness. For one, getting involved in social groups both improves cognition in general and seems to help thwart the arrival of dementia. The traditional focus of this research has been on relatively objective measures of social isolation versus connectedness, including the extent to which a person participates in activities that prominently involve social interaction (such as doing volunteer work), the number of friends and relatives an individual contacts regularly (in other words, the size of his or her social network), and marital status.

Findings about the positive aspects of attitudes and beliefs on adult cognition are spottier. In large part, positive beliefs and attitudes may have important indirect effects on cognitive enrichment because of their influence on the kinds of behaviors (for instance, exercise and mentally stimulating activities) that are known to be associated with cognitive enrichment.

More generally, individuals who are optimistic, agreeable, open to new experiences, conscientious, positively motivated and goal-directed are more likely to undergo successful aging, to take advantage of opportunities, to cope more effectively with life circumstances, to effectively regulate emotional reactions to events, and to maintain a sense of well-being and life satisfaction in the face of challenge.


And just as maintaining some activity patterns in old age may reduce risk of cognitive decline, the persistence of other patterns of behavior may actually increase the risk. Chronic psychological distress—resulting from depression, anxiety and negative emotions such as anger and shame—is associated with a variety of negative outcomes in adulthood, including cognitive decline. The tendency to experience psychological distress is often called neuroticism. Studies have consistently found a higher level of neuroticism to be linked to an increased incidence of Alzheimer’s disease and mild cognitive impairment in old age.

Enriching Cognition

Clearly, there is no magic pill or one-shot vaccine that inoculates the individual against cognitive decline in old age. Thus, public policy regarding cognitive enrichment should follow a health prevention model. Policy leaders might promote intellectual activities that are inherently meaningful for older adults, perhaps as embedded in larger social contexts (for example, the Elderhostel movement or adult continuing education). A critical issue for future research will be to understand how an engaged way of life can be promoted and implemented in midlife, during the working years. Given inevitable conflicts between work demands and time available for other roles (parenting, for one) and activities, it would be useful to know whether work-related activity programs (such as availability and use of physical exercise facilities at or near the workplace) could help foster an enriching lifestyle.
At the same time, the public must be aware that there is still much that is not known about cognitive fitness in old age, as well as some controversy about the magnitude and durability of mental exercise outcomes. People are beginning to market computer games and other means of exercising the mind, often making strong claims about the effectiveness of expensive products that have not been backed by actual scientific studies. Consumers should look for evidence demonstrating the benefits of any such products, which may not necessarily incorporate all the features needed to enhance mental fitness in old age.
The next decades offer much promise for expanding our knowledge about aging and cognition. We may soon discover whether the limits on successful cognitive functioning in old age that were once seen as insurmountable can ultimately be viewed as pessimistic assumptions that focused on observable age-related decline rather than the potential for maximizing human performance through cognitive enrichment. Just as advances in medical science may lead to increased longevity through vehicles such as effective treatments for dementia-causing illnesses, advances in psychological science can make important contributions to improving the quality of life of long-living older adults, in part by empirically demonstrating that attitudes and behaviors can promote cognitive functioning in old age and, more generally, by showing how behavioral interventions can help us all age successfully.

* By Christopher Hertzog, Arthur F. Kramer, Robert S. Wilson and Ulman Lindenberger