July 2009


I WAS driving home listening to the radio on Thursday when I heard that Michael Jackson was dead. I turned it off and thought about my brother. It’s not that I don’t think about him — I suppose I could say that though he’s only sometimes on my mind, he’s always in it. He moves, or I move him, from an impressionistic figure in a background to a man in sharp focus.


He is in trouble and, now that I think about it, has been for the better part of a quarter-century. Over that time we’ve become strangers. We haven’t had a substantive conversation in years — not about my marriage, my children, our father’s death, or why we’ve grown apart. It has, of course, something to do with logistics, geography, the complications of growing up and old, but it’s mostly because we don’t trust each other, and perhaps never have. I know he’s a liar, and he knows I don’t care about him, or what he thinks of me, what he thinks of anything.

I can’t remember in my adult life when I’ve thought of him fairly: He’s in jail now, far away from home, and it’s difficult to imagine that, now that he’s 46, there’s much more in his future than more of the same. And so I have, unfairly — though for me, practically — decided what remains for him so that I may rationalize, sanctimoniously, how I’ll continue to reject him.

I’ve allowed his troubles, though, to affect how I think about our pasts. Because of who I am now, my troubles as a boy, adolescent, young adult and man are things I’ve overcome: that makes me good, strong. Looking back through the lens of my current success ennobles my past — how I failed, or was failed — to the same degree my brother’s failure derides his.

And as liberal or dynamic-thinking as I’m supposed to be, my decision to reject him is informed by the inhumane notion of tough love: he had his chances, more than I, and look what he’s done with them.

What does this have to do with Michael Jackson? Grimly, I’ve been waiting for — through the hair relaxer, the surgeries, the blanching, the eccentricities and the fall into madness — that news flash. Just as I’ve been waiting for a phone call telling me my brother is dead, or at least, far beyond any possible care — our gap ever widening, irreconcilable.

The Jackson 5 — not the Jacksons, or any other iteration — were like family. My brother, sister, cousins and I grew up with them. When we were young and together, they were all we’d listen to, and really, in pop-culture terms, all we really knew about. They were a conduit — unlike the dead, white authors we were told to read and emulate, and the dead, black martyrs we were told to mimic and revere — to a larger world. His death, as I’m sure is the case for many others, makes me scan time backward and forward in order to understand how he, we, came to be here now. My backward trek — unlike my children’s relationship to their time capsule discovery of “Thriller” and “Beat It,” unlike my white high school friends and their adoration of “Billie Jean” — has little connection to those periods, that person.

My Michael — not M. J., or Jacko — has an Afro, a broad nose and deep brown skin. His voice, rather than clipped and formulaic, is clear, ringing and bright. His vocalizations aren’t clicks and chirps, but screeches and moans.

I never believed in anything he sang as an adult. He never seemed to understand, or be convinced himself. But now, listening to him sing as a child, even though I know he was too young to have a clue as to what he was singing about, I feel, just as I felt when I was a boy, that he knew and he felt exactly what he was singing.

Do you remember those Polaroid Instamatics? The ones with the accordion muzzle? With the hot popping one-time flash, the chemical smell of that toxic magic jelly? You had to pull the print from the side of the camera by the tab of the cellophane it was wrapped in, and wait 60 seconds before peeling it open like a Kraft single. I remember counting down, or up, the time, and then adding extra seconds, and still more by shaking the package, or wiping my fingers — “Come on! Come on!” — before looking. There’d been, even in my young life, too many blotchy prints, ghost images, lost moments.

I have a framed photograph in my bedroom. It’s of my brother, David, my sister, Tracey, my two first cousins — Lisa and Russell — and me. It’s at my aunt and uncle’s house. We’re all very young. We’re all dressed in fantastic early-’70s gear. (I’m wearing a bright red long-sleeve jersey with a blue collar, bellbottoms with wide white and blue vertical stripes and moccasins.) We are dancing. I’m doing — although some years before it was named — what can only be called the rock.

Tracey has her arms stretched high and spread wide. Russell, I think, would’ve twirled had he not looked up and seen the camera. Most people who see this photo have the same reaction — a melancholic smile — and then they guess, not the group, because everyone knows, but the song.

I tell them that we were the Cousin 5. Then they state: “You’re Michael.” I tell them no. They scan the 3×5, confused. I point to David, the oldest. He looks, from how the picture was taken, to be on the periphery, but he’s really up front, while we other four are gathered in support. “Who are you, then?”
“I’m Jermaine.”

They snicker. My brother, to most of my friends, has always seemed ridiculous. To me, David’s a combination of all the things I despise in people. I’ve always thought him to be narcissistic, self-serving and grandiose. He took the qualities that made our father charismatic, and made them petty. And although even as a boy I sensed my father was both delusional and a liar, his performances — where we’d go, what we’d do, how we’d make it — were inspirational. My brother sounded like a petty dictator, or a bad con.

After our father left, David — four years older than I — tried to fill that void. He wasn’t good at it. The space was too big and most of what he did for or to me was wrong. I knew I was smarter than him, I knew I’d be bigger than him, and I didn’t understand why I didn’t outrank him. It seemed absurd that he had authority over me in anything.

But I never, fairly, tried to understand — who he was, what he was asked to do — by our parents, by himself. And he was just a boy, how would he even know the question to ask? The accident of his birth threw him into his role — front man — and he assumed it, whether he had the talent or stomach to fit. So I can look at the Polaroid, and snort or rage. I can construct a narrative from that moment onward that conveniently depicts his demise. But I’ve never thought about, for long, why he’s dancing alone.

The photo must have been taken in Waltham, Mass., because there’s a house directly outside the window, and my aunt and uncle’s next place had more space between it and the neighbors. But we’re in that blurred time — ’73 to ’75 — before they left the duplex rental across the street from the railroad tracks, and before my father left home for good.

He never came with us. Sometimes he’d drop us off, or pick us up. When the car worked my mother drove, but usually my uncle made the round trip. Just as I never understood, because my uncle was always very kind to us, that those Saturday trips in and out of Boston — three exits east and west on the turnpike — probably were a pain for him, they were escapes for my mother. Perhaps she spent her time there confiding in her sister, perhaps it was simply that she didn’t want to cook. Whatever the case, we all knew — intellectually or viscerally — that my family was ending. There was no place else for my mother to go.

AND so while my father was off chasing who knows what, or who, wherever, we were there, at my aunt and uncle’s, or on our way there. And winding along the Charles River and approaching the left turn onto their street I’d begin to feel the awful convergence of all the reasons we were rolling up onto the crabgrass and gravel of the curb-less sidewalk in front of their house, the walk along the driveway to the side door, and then my aunt holding the screen door, or the storm door open at the top of the four-step porch. My mother’s tired, familiar greeting. My aunt’s mid-sized smile. She was one of the few people who was truly happy to share, unconditionally, what she had.

Tracey and my cousin Lisa were best friends, 19 days apart in age. Russell, 18 months younger than I, and with our affections, secrets and rivalries, was much more like a brother to me than David. Even with this, there was always an awkward moment at the threshold that could stretch through dinner and into the evening. It could’ve been as simple as that they were engaged in something before we came, and most young children are terrible with transition. It could’ve been that we outnumbered them. I see it now when a large family is at our door. My children take pause and wonder how the people on the stoop will fit.

But I also know that we must have seemed like wild children. At home our mother was outnumbered. Our neighborhood was full of kids, and we’d roam the streets, parks and yards, and she couldn’t keep up. At home I could sleep in gym shorts, or long underwear. I could sneak into the bathroom, or hide under the covers with a flashlight and read as late as I wanted. In the morning I could shuffle around barefoot. Sleeping over at my aunt and uncle’s I had to wear pajamas. Lights-out wasn’t debatable. And in the morning my uncle would make me wear a robe and slippers around the house.

And we, as families, were going in different directions. There’d always been a rift between the adults — real-world practicality versus altruistic erudition. My father read books, argued Socratically, haunted jazz bars, and never saved a cent. My uncle worked hard, and kept whatever fanciful notions he might have had to himself, and had reasonable expectations of the future. They were soon to move from a blue-collar town to a wealthy Boston suburb. We would soon lose our house and our private school scholarships, and have to use their address to attend the good public schools. They had stuff, were getting more, but something we thought we had, every day, disappeared.

Finally, though, there was music. I don’t fully remember, but I think our cousins had a portable stereo. Lisa would get it out and the party would start. The Jackson 5 was the only music we’d play. My family had 45’s, but my cousins had albums. I’d stare at them, read the tiny print, feel the wax paper sleeve to see if it was real. All of the album covers, like the days, have combined into one emblematic memory. And I’m not one to dissect the converged body of the past into pre-gestalt moments.

I don’t know if it was the “Greatest Hits,” but one cover was the color of midnight, and one was like a Creamsicle, but they all had bright photographs of the Jackson 5: Afros, bellbottoms, suede boots, tasseled nubuck vests, stripes, stars and peace signs. Then the crackle of the grooves, then my aunt from the next room: “Not too loud.” The music: piano glissando, guitar, bass — bum bum, bum bum, bum bah bump. Michael: “Uh-huh … let me tell you now…” and we were up — snapping, clapping and rocking. Trying to do twirls and splits barefoot on the thick-pile wall-to-wall. No one would feel those burns until bedtime.

“I Want You Back.” I’d lose track of where I was, whom I was with. Heat, sound and movement, until the song faded. But even then I’d be part lost. I’d construct the pop of tires on gravel, the heavy door slam, footsteps up the porch, the first door whine, the second door jingle, and see my father enter, call for my mother and sing: “I want you back! I want you back!” But my father’s voice, I knew, was coarse-grit baritone, better suited for brooding and regret, rather than epiphany and hope.

“Michael,” David would snap. I’d come back from wherever I was to find everyone staring at me. He’d shrug his shoulders, raise his eyebrows and tap on his temple with his index finger. “We need to do that again,” he’d say at the player, holding the arm, looking at us four, now, flatly. I’d hate him for doing this, for wanting to control those moments, make them his, take them from me.

“Ready?” And we’d pick up our imaginary instruments, take our places and wait. He’d drop the needle and rush to the front.
We’d rehearse until we were soaked. We always did the same routines to the same songs, and David would ignore our ideas about new choreography. Start with “ABC,” slow it down with “I’ll Be There,” blow it out with “Going Back to Indiana.”

It never failed: we’d be set to call the adults in to watch our performance, just as they were getting up to go. We’d have to plead with them to let us do it. We worked so hard: just once, maybe twice, O.K., three times. I’ve conveniently forgotten, but in those moments, and the ones to follow, I felt protected — a part. I forgot about my resentment of my brother. Rather than being alone in my individual reveries, I was with my family.

The role of front man — because of who David was, who we were, and what we, at that time, faced — was his. He was a child and I was a younger child, and in some ways he stopped being one so I wouldn’t have to. I’ve never thanked David for that. I know that the time I’ve wasted ridiculing and shunning him, watching him fall, waiting for that call rather than helping to keep it from coming, can’t be regained, or undone in the years to come by re-rationalizing, glorifying or erasing his actions, but it’s a place to start.

People will point to Michael Jackson’s discography, his surgeries, madness, all that he’s been given, earned and squandered. Many will testify about what he’s given, but I, in this moment, can only think of what he’s, we’ve, lost. The price he’s paid for where he’s been is his life. And I think about David in much the same way — the wake behind, and his narrowing future — where he is now, and how it cost him me.

* Text By MICHAEL THOMAS; NYT, June 28, 2009
Michael Thomas is the author of the novel “Man Gone Down.”

Our political system sometimes produces such skewed results that it’s difficult not to blame bloviating politicians. But maybe the deeper problem lies in our brains.


Evidence is accumulating that the human brain systematically misjudges certain kinds of risks. In effect, evolution has programmed us to be alert for snakes and enemies with clubs, but we aren’t well prepared to respond to dangers that require forethought.

If you come across a garter snake, nearly all of your brain will light up with activity as you process the “threat.” Yet if somebody tells you that carbon emissions will eventually destroy Earth as we know it, only the small part of the brain that focuses on the future — a portion of the prefrontal cortex — will glimmer.

“We humans do strange things, perhaps because vestiges of our ancient brain still guide us in the modern world,” notes Paul Slovic, a psychology professor at the University of Oregon and author of a book on how our minds assess risks.

Consider America’s political response to these two recent challenges:

1. President Obama proposes moving some inmates from Guantánamo Bay, Cuba, to supermax prisons from which no one has ever escaped. This is the “enemy with club” threat that we have evolved to be alert to, so Democrats and Republicans alike erupt in outrage and kill the plan.

2. The climate warms, ice sheets melt and seas rise. The House scrounges a narrow majority to pass a feeble cap-and-trade system, but Senate passage is uncertain. The issue is complex, full of trade-offs and more cerebral than visceral — and so it doesn’t activate our warning systems.

“What’s important is the threats that were dominant in our evolutionary history,” notes Daniel Gilbert, a professor of psychology at Harvard University. In contrast, he says, the kinds of dangers that are most serious today — such as climate change — sneak in under the brain’s radar.

Professor Gilbert argues that the threats that get our attention tend to have four features. First, they are personalized and intentional. The human brain is highly evolved for social behavior (“that’s why we see faces in clouds, not clouds in faces,” says Mr. Gilbert), and, like gazelles, we are instinctively and obsessively on the lookout for predators and enemies.
Second, we respond to threats that we deem disgusting or immoral — characteristics more associated with sex, betrayal or spoiled food than with atmospheric chemistry.

“That’s why people are incensed about flag burning, or about what kind of sex people have in private, even though that doesn’t really affect the rest of us,” Professor Gilbert said. “Yet where we have a real threat to our well-being, like global warming, it doesn’t ring alarm bells.”

Third, threats get our attention when they are imminent, while our brain circuitry is often cavalier about the future. That’s why we are so bad at saving for retirement. Economists tear their hair out at a puzzlingly irrational behavior called hyperbolic discounting: people’s preference for money now rather than much larger payments later.

For example, in studies, most Americans prefer $50 now to $100 in six months, even though that represents a 100 percent return.

Fourth, we’re far more sensitive to changes that are instantaneous than to those that are gradual. We yawn at a slow melting of the glaciers, while if they shrank overnight we might take to the streets.

In short, we’re brilliantly programmed to act on the risks that confronted us in the Pleistocene Age. We’re less adept with 21st-century challenges.

At the University of Virginia, Professor Jonathan Haidt shows his Psychology 101 students how evolution has prepared us to fear some things: He asks how many students would be afraid to stand within 10 feet of a friend carrying a pet boa constrictor. Many hands go up, although almost none of the students have been bitten by a snake.

“The objects of our phobias, and the things that are actually dangerous to us, are almost unrelated in the modern world, but they were related in our ancient environment,” Mr. Haidt said. “We have no ‘preparedness’ to fear a gradual rise in the Earth’s temperature.”

This short-circuitry in our brains explains many of our policy priorities. We Americans spend nearly $700 billion a year on the military and less than $3 billion on the F.D.A., even though food poisoning kills more Americans than foreign armies and terrorists. We’re just lucky we don’t have a cabinet-level Department of Snake Extermination.

Still, all is not lost, particularly if we understand and acknowledge our neurological shortcomings — and try to compensate with rational analysis. When we work at it, we are indeed capable of foresight: If we can floss today to prevent tooth decay in later years, then perhaps we can also drive less to save the planet.

* Text By NICHOLAS D. KRISTOF; NYT, July 2, 2009

My mother had always feared domestic animals, but now as a plump neighborhood cat ran up our driveway, she gazed at the feline, and revealed that 70 years ago she had had a pet cat. Her 87-year-old eyes teared up. Her cat was white, she said, and so thin you could see its ribs.


Still, she loved to cuddle it. It wasn’t a house cat – it couldn’t have been, because she was imprisoned at the time, in a forced-labor camp the Nazis set up in Poland, the country where my mother was born and raised. Back then she was as emaciated as the cat, but still she shared her food with it. It gave her comfort, she said, and it was a way of fighting back, to help this animal that, like her, the Germans planned to let die.

The psychologist Bruno Bettelheim concluded that survival in Nazi concentration camps depended on “one’s ability to arrange to preserve some areas of independent action, to keep control of some important aspects of one’s life despite an environment that seemed overwhelming.”

Studies suggest that, even in normal conditions, to be happy, humans must feel in control. We are currently confronting economic hardship that, though a far cry from the horrors of World War II, has eroded the feeling of self-determination for many of us.

Eliminate control, and people experience depression, stress and the onset of disease. In a study of elderly nursing home patients [1], one group was told they could decide how their room would be arranged, and could choose a plant to care for.

Another group had their rooms set up for them and a plant chosen and tended to for them. Eighteen months later 15 percent of the patients in the group given control had died, compared with 30 percent in the passive group.

The need for control can inspire great achievements, such as dams that prevent flooding, medicines to ease our lives, and perfectly confected chocolate soufflés. But it can also lead to sub-optimal behavior.

Though people generally view “control freaks” in a negative light, that need makes us all vulnerable to making bad decisions – especially when it comes to money. Studies show that people feel more confident they’ll win at dice if they toss the dice themselves than if others toss them [2], and that they are likely to bet more money if they make their wager before the dice are tossed than afterward, when the outcome has been concealed [3].

They’ll value a lottery ticket more if they can choose it than if it is given to them at random[4].

And in a well-known 1975 study in which Yale University students were asked to predict the results of coin tosses, a significant number of presumably intelligent Yalies believed their performance could improve through practice, and would have been hampered if they’d been distracted.[5]

In each of these situations, the subjects knew that the enterprises in which they were engaged were unpredictable and beyond their control. When questioned, for example, none of the lottery players said they believed that being allowed to choose their card influenced their probability of winning. Yet on a deep, subconscious level they must have felt it did, because they behaved as if it did.

That people are prone to feeling in control even when they are not probably endowed our species with an advantage at some point in our evolution. Even today, a false sense of control can be beneficial, promoting a sense of well-being or allowing us to maintain hope that a bad situation can improve.

My mother’s illusion came to an end when, one day, her labor camp cat stopped coming. She never learned exactly what happened to it. Unfortunately, that became a template for nameless outcomes by which her sister, her father, and most of her friends disappeared.

Of her many illusions of youth that the Nazis snuffed out, the feeling that she could control her destiny was one of the most difficult to accept. But for my mother, and for all those who lived through similar experiences, surviving meant possessing a special toughness not only of body but also of mind. She found a way to face the world without the illusion of control, dealing with life as it comes, day to day, without expectation.

On a far different scale, we face losses today. To economists our plight is a “severe downturn,” but to me it feels like a roller coaster ride in which I discover, first, that I have no seat belt, and then, that the concession operator is Norman Bates. Given my jitters, it is a comfort to know that my mother survived a far worse experience and yet maintained the capacity to be happy when, for instance, her grandchildren hug her, or she discovers a tasty new sugar-free dessert. But more important is what I’ve learned from the fact that the current events don’t seem to bother her.

It’s not that my mother hasn’t lost money, or that she doesn’t need it. She isn’t bothered because her early experiences of utter powerlessness taught her to give herself up to what she calls fate. Understanding my own need for control – and exactly why I cannot have it – I now take comfort in letting go of the illusion, and accepting that despite all my efforts and planning some aspects of my future are beyond my sphere of influence. That realization has given me permission not to kick myself for the losses I have incurred. That can be a liberating thought in trying times like these, or any times at all.

* Text By LEONARD MLODINOW, June 15, 2009

For the curious reader, here are the studies referred to above:
[1] Langer, E. J., & Rodin, J. (1977). Long-term effects of a control-relevant intervention with the institutionalized aged. Journal of Personality and Social Psychology, 35(12), 897-902.
[2] Dunn, D. S., & Wilson, T. D. (1990). When the stakes are high: A limit to the illusion of control effect. Social Cognition, 8, 305-323.
[3] Strickland, L. H., Lewicki, R. J., & Katz, A. M. (1966). Temporal orientation and perceived control as determinants of risk-taking. Journal of Experimental Social Psychology, 2, 143-151.
[4] Langer, E. J. (1975). The illusion of control. Journal of Personality and Social Psychology, 32(2), 311-328.
[5] Langer, E. J., & Roth, J. (1975). Heads I win, tails it’s chance: The illusion of control as a function of the sequence of outcomes in a purely chance task. Journal of Personality and Social Psychology, 34, 191-198.

Leonard Mlodinow teaches randomness to future experimenters at Caltech. His books include “The Drunkard’s Walk: How Randomness Rules Our Lives” and “Euclid’s Window: The Story of Geometry from Parallel Lines to Hyperspace.”


For all the talk of his sustained adolescence, no performer made a more compelling entrance into manhood than Michael Jackson did with the release of his 1979 album, “Off the Wall,” just a couple of weeks before his 21st birthday.


It was all the more stunning because we had watched his childhood and adolescence. He wasn’t an apparition rising out of obscurity, like Elvis Presley. To become who he was in “Off the Wall,” he had to annul — if not destroy — the performer he had been in the Jackson 5.

And yet the change was organic as well as deliberate. Michael Jackson grew into his body, and out of that new body emerged a wholly new idea of what pop music, and the movement it generates, might be. It can be hard to remember now, 30 years later, just how ubiquitous the hits from that album, especially “Rock With You,” really were.

In soul, in rock ’n’ roll, and in pop, there is a long tradition of men singing in high voices, the height of the voice suggesting the pitch of the singer’s fervor. Michael Jackson made the sweetness of that high voice guttural and demanding. He showed that it was rooted in his feet and hips and hands. He re-sexualized it in a way that you could never really mistake — then — as androgynous.

Very few artists — certainly very few child stars — have ever redefined themselves as thoroughly or as successfully as Michael Jackson did. His second act was better than any number of first acts put together. The uncanny thing wasn’t just his physical transformation, his hypnotic new ability to move. It was the certainty of “Off the Wall” and its sequel “Thriller” that this was the music we wanted to hear. He knew, too, that this was a music we wanted to visualize, to see formalized and set loose in dance. In a sense, he was loosing his transformation upon the rest of us, expecting us to be caught up in the excitement the music caused in him. And we were.
Michael Jackson came to be synonymous with transformation — ultimately, with an eerie stasis that comes from seeking transformation all the time. The alchemy of change worked longer and better for him — through the ’80s and into the early ’90s — than it has for almost any other artist. And yet somehow all the changes always take us back to the album in which Michael Jackson grew up.

By VERLYN KLINKENBORG; NYT, Editorial, June 27, 2009

GROWING up in Texas, I knew a lot of girls like Farrah Fawcett, and I hated them. They had everything I didn’t: blond hair, blue eyes, the power, seemingly, to get anything and everything they wanted in my small public high school — boys, head cheerleader, the ability to decide, in a twinkling, who was cool and who wasn’t.


My mother told me not to worry — my time would come, she said — but what did she know? I was a dark, brooding teenager, and everywhere I turned there was a poster of a beaming woman with wild blond hair, her smile as wide as the Texas sky, in a low-cut scarlet bathing suit that, every man now over the age of 40 can tell you, revealed what was in 1976 the scandalous hint of a nipple.

Texas has produced a lot of beauties, but Farrah Fawcett of Corpus Christi was the one who dominated my late adolescence. Sometime after my mother tried — O.K., at my behest — to give me a Farrah feather-cut that wilted immediately in the summer heat, she came to stand for everything I wanted to escape in my home state.

The oppressiveness, the conformity, the vanity, the insincerity required of Texas women — smiling when you didn’t mean it, looking happy to see someone you really weren’t happy to see, never appearing in public without your face on (hers was a brilliantly contrived natural look) — acting, in general, as if you were always giving the best party in the world. It always seemed to me to be too much work.

Of course, I had it wrong about Farrah Fawcett. For a while she did get everything she must have wanted, or that people thought she should have wanted. She was the iconic beauty of her time and her various acting comebacks — instead of a go-for-it crime fighter she became a battered woman, a Nazi hunter and Robert Duvall’s unforgiving wife in “The Apostle” — proved that she was more than that.

But by then, no one cared. I wrote a magazine profile of her in 2000, and she spent a lot of time avoiding my softball questions by locking herself for extended periods in various bathrooms: at the Beverly Hills Hotel, at her home high atop Beverly Hills, in a movie theater restroom in Century City, where we’d gone for some premiere.

She was a tiny thing, fragile as a sparrow, disoriented. I rode with her to the doctor for some kind of much-needed injection, and took no pleasure in the trip.

Maybe, as some have suggested, this was all an act — being flighty Farrah Fawcett was her best role — but even if that were true, her choice of that act was instructive. It made you want to take care of her, to be careful in your approach, not to push or probe too much, because she might break.

Over time, I’ve made peace with the blond beauties of my childhood, and see that they have some essential qualities she lacked: an instinct for self-preservation, an ability to laugh through the worst of it, toughness and self-respect — these might have helped Farrah Fawcett get by after her beauty faded and the crowd moved on. But by then she was a creature of Hollywood, not Texas, and, unlike me, had left home for good.

By MIMI SWARTZ; NYT, June 28, 2009
Mimi Swartz is an executive editor at Texas Monthly magazine.


My friends Farhad and Mahnaz are the quintessential Iranian couple. They are both engineers with a shared passion for hiking and movies and have been smitten with each other for six years — but Farhad and Mahnaz can’t afford to get married because even a one-bedroom apartment is beyond their reach, despite their both having decent middle-class jobs.

This reality has preyed on their relationship, compelling them to consider leaving Iran. And they blame the government for their situation.


“We aren’t lazy, and we aren’t aiming for anything so high,” says Mahnaz.

These days, the phrase “marriage crisis” pops up in election debates, newspapers and blogs and is considered by government officials and ordinary Iranians alike to be one of the nation’s most serious problems. It refers to the rising number of young people of marrying age who cannot afford to marry or are choosing not to tie the knot.

By official estimates, there are currently 13 million to 15 million Iranians of marrying age; to keep that figure steady, Iran should be registering about 1.65 million marriages each year. The real figure is closer to half that.

Why does this matter? Because Iran’s government cannot afford to further alienate the young people who make up more than 35% of its population. The young are already seething over their government’s radical stance in the world and its trashing of the economy, and their anger easily expresses itself politically.

As they decide how to vote in Friday’s presidential election, young people like Farhad and Mahnaz are likely to base their decision in part on who they think will address the problem closest to their heart.

Iran used to be a society in which people married young. In a Muslim culture that viewed premarital sex and dating as taboo, this was pretty much a social imperative. My mother married at 28, and in the 1970s that meant she had brushed up against spinsterhood. But today, Iranian women are attending university in unprecedented numbers — they account for over 60% of students on Iranian campuses — and typically enter the workforce after graduating. This has turned their focus away from the home sphere, made marriage a less urgent priority and changed women’s expectations of both marriage and prospective husbands.

With young people pursuing more liberal lifestyles and shunning the traditional mores of their parents’ generation, the marrying age is steadily climbing. This terrifies Iran’s religious government, which still peddles the virtue of chastity and views young people’s shifting attitudes toward sexuality as a direct threat to the Islamic Revolution’s core values. “The sexual bomb we face is more dangerous than the bombs and missiles of the enemy,” said Mohammad Javad Hajj Ali Akbari, head of Iran’s National Youth Organization, late last year.


Unfortunately for the government, the mismanagement of Iran’s economy — with its high inflation, unemployment rates and soaring real estate prices — has deepened the marriage crisis, and with it the resentment among young Iranians.

Amir Hekmati is a determined 31-year-old civil servant from Tehran’s Narmak neighborhood. He earns the equivalent of $500 a month and has saved assiduously. He’s also managed to secure a loan from the ministry where he works and a small sum from his parents, but even with that he can’t muster enough to buy a studio apartment in an outlying district of the city. Two women he admired turned down his marriage proposals on the grounds that he did not already have his own place. “If women would just agree to be girlfriends and date, we wouldn’t be forced to pursue marriage in the first place,” he complained.

Hekmati’s experience is typical of young Iranians, who are finding themselves increasingly priced out of the marriage market. During the tenure of President Mahmoud Ahmadinejad, real estate prices have soared across the country, but especially in Tehran, where they have risen as much as 150%. Economists have blamed the spike on Ahmadinejad’s disastrous economic policies.

The President flooded the economy with capital through a loan scheme, cut interest rates 2% and embarked on huge state construction projects that drove up the price of building materials. Those changes prompted many investors to move out of the stock market and the banking system and into real estate, which was considered a safer bet. Apartment prices in the capital more than doubled between 2006 and 2008.

The real estate boom was a disaster for middle-income Iranians, particularly young men seeking marriage partners. And many of those who have married and moved in with in-laws are finding that inflation is eating away at their savings, meaning it will take years, rather than months, to get their own place. The resulting strains are breaking up existing marriages — this past winter, local media reported that a leading cause of Iran’s high divorce rate is the husband’s inability to establish an independent household. Many others are concluding that marriage is best avoided altogether.

The Ahmadinejad government’s response to the crisis included a plan, unveiled in November 2008 by the National Youth Organization, called “semi-independent marriage.” It proposed that young people who cannot afford to marry and move into their own place legally marry but continue living apart in their parents’ homes. The announcement prompted swift outrage. Online news sites ran stories in which women angrily denounced the scheme, arguing that it afforded men a legal and pious route to easy sex while offering women nothing by way of security or social respect. The government hastily dropped the plan.

As Iranians head to the polls on Friday, Ahmadinejad faces the prospect that the very same broad discontent with the economy that propelled him to victory in 2005 could now help unseat him. Samira, a 27-year-old who works in advertising, recently became engaged and is among the millions of young Iranians who are eyeing the candidates through the lens of their own marital concerns. “Ahmadinejad promised he would bring housing prices down, but that didn’t happen at all,” she says.

If left to their own salaries, she explains, she and her fiancé will never be able to afford their own place. That’s a key reason they’re voting for Mir-Hossein Mousavi, the leading reformist candidate, who has made the economy the center of his platform. Like many young Iranians, they hope a new President will make marriage a possibility once more.

* Text by Azadeh Moaveni; NYT, Jun. 09, 2009


So I’ve received a fair bit of correspondence denouncing me for saying that we have to do something about climate change. Among the various insults is the claim that I’m just another Malthus — which is interesting.

Here’s a chart from Brad DeLong, showing population versus real wages in Britain. It was only in the late 17th century that Britain began to diverge from a simple population-wages curve; other parts of the world stayed Malthusian much longer.


Leave aside the climate science issues. What very few people realize is that Malthus was right about most of human history — indeed, he was right about roughly 58 out of 60 centuries of civilization: living standards basically did not improve from the era of the first Pharaohs to the age of Louis XIV, because any technological gains were swallowed up by population pressure. We only think Malthus got it wrong because the two centuries he was wrong about were the two centuries that followed the publication of his work.
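The Malthusian logic described above can be sketched in a few lines of code. This is a minimal illustration with made-up parameters (not calibrated to Britain or to DeLong's chart): wages fall as population rises on fixed land, population grows whenever wages exceed subsistence, and so a one-time productivity gain is eventually absorbed entirely by population growth, leaving wages back at subsistence.

```python
# Toy Malthusian model: illustrative parameters only.
W_SUB = 1.0    # subsistence wage (population stable at this wage)
BETA = 0.5     # how sharply wages fall as population rises
SPEED = 0.2    # how fast population responds to wages above subsistence

def wage(productivity, population):
    # Diminishing returns to labor on fixed land: w = A / N**beta
    return productivity / population ** BETA

def simulate(productivity, population, steps):
    # Population grows when wages exceed subsistence, shrinks when below.
    for _ in range(steps):
        w = wage(productivity, population)
        population *= 1 + SPEED * (w - W_SUB)
    return wage(productivity, population)

# Start at subsistence (A=1, N=1), then double productivity:
# after enough generations the wage is back at subsistence.
print(round(simulate(productivity=2.0, population=1.0, steps=500), 3))  # → 1.0
```

The escape from this trap, as the chart shows, is what made the two centuries after Malthus wrote so exceptional: productivity began growing faster than population could absorb it.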

UNLUCKY fishermen are all alike: We don’t know how to see. My friend Jud has outfished me in all but one or two of the hundred times we’ve gone to the ocean and bay beaches and kettle ponds on Cape Cod. By both study and exercise, he knows the culture of striped bass better than I know my own nose. But to call him “lucky” would begrudge him a talent that I have never seen in anyone else and that lives underneath skill or knowledge.

One July night, on a falling tide that sifted through the granite jetty in the west end of Provincetown, we fished the same 10-foot sluice, with the same tackle and the same flies (he ties them for me), and I watched in outrage as he caught 20 stripers to my two.

Another night, on Long Point, the finger of sand that curls into Provincetown harbor at the far end of Cape Cod, the stripers were chasing alewife, peanut bunker and other baitfish through the current that rips the point on a rising tide. I caught the first fish of the night, a 32-inch bass, enormous for me and for the lightweight rods we were using.

It took 20 minutes to land. Jud yelped in amusement and then caught eight more just like it, while I stood cursing and changing flies by the light of the town, two miles across the dark harbor. What he can do and I can’t is face a piece of water and so absorb himself in the place that he seems to share the consciousness of the fish in it.

If you have seen a school of 10,000 sand eels swerving as one animal under a wharf, you have seen that individuals can integrate their senses into a collective mind. Without the benefit of language, they share all the most important news: where to find food, light, threat, rocks. Human beings usually experience this common mind only under the stress of love or panic.

My friend pulls his hat brim down to deflect the sun, as everybody does, and makes the double-haul cast — a move in which the non-dominant hand jerks down and up on the line, both on the forward and back casts. Think of a man doing the polka with his arms. It isn’t as hard as it sounds; it just helps him reach the fish, not find them.

For all I know, he may, more often than not, see only a confluence of light and current, and point his desire at that spot, so that he believes he sees the fish before his eyes detect the animal itself. But I can’t deny that wherever he puts the lure, the fish find it.

We’ve evolved a neocortex that presents us with an awareness of past and future at the cost of forgetting where we are right now. Jud seems to switch that faculty off in favor of an older, lower brain. Like a sand eel in the school, he sees with 10,000 pairs of eyes. Many times when he was catching fish and I wasn’t, I’ve asked, “How do you know where the fish are?” And he’s said, “I see them.”
I may have glimpsed for myself what he sees, but only once. On an early summer afternoon we were fishing for brook and rainbow trout in the mid-Cape, at Cliff Pond. In reality, except after heavy rains, it’s two ponds split by a narrow sand bar.
More than 300 of these kettle ponds perforate the Cape.

They formed around 10,000 years ago. As the Laurentide ice sheet retreated into Canada, it left behind chunks of ice as thick as 60 feet that the force of the glacier had plowed into the earth. The sediment outflow from the melting Laurentide sheet covered the blocks of ice, so they lay hidden and insulated for 1,000 years or more beneath the soil.

As the climate warmed further, the blocks melted, the sediment crusts collapsed, and the deep holes that the blocks had formed began to fill with ground water and rain. In general, streams neither feed nor drain the ponds, and in the absence of wind they lie as still as mirrors.

Oak and pine trees ring Cliff Pond so tightly that if a wading fisherman tries to cast much farther than 10 feet, he snags his fly in the heavy brush during the back cast.
I was having a miserable afternoon, yanking one errant fly after another from the pine boughs. Jud came around the corner, having caught half a dozen brook trout and let them go. He saw my irritation and suggested another spot.

We climbed around an oak grove and onto the sand bar that divides the water. Not much high vegetation grows on the bar, so if you face east you can back cast as far as you like without snagging a tree, and fish the smaller pond with ease. The sun was going down in the drizzle. A screeching racket erupted, from the nearby marsh it seemed, but also from everywhere at once.
“What are those?” I asked.

He said, “peepers,” a frog smaller than your thumbnail that can scream as loud as an air raid siren. They lived all over the marsh, he said; but wherever I looked, I couldn’t find them.
We knew the fish were roaming the inlet we faced; he’d seen them there, but he left me alone and fished from the other side of the marsh.

I cast long and short, played the surface with a caddis fly, switched to a nymph to fish the bottom, strategized to no end, but nothing doing. The sun behind me threw my long shadow on the water and shot through a billion droplets hovering over the pond. I kept on wading deeper, thinking harder, catching nothing.

Anyone who fishes is an animist, and anyone who is frustrated while fishing becomes an egoist. So when a rainbow appeared over the far woods, I believed the cornball god of the place was having a laugh at my expense. But who can look away from a rainbow?

I stopped awhile and took it in, backing out of the weeds into shallower water, shaking my sore arm. The bright arc rose from one flank of the distant forest and fell into another. Above the uppermost red band, a secondary arc emerged — thicker, the colors reversed, with red on the underside, purple on top — and disappeared. The low clouds rumbled.

And all at once, with no invitation, the place penetrated me. My mind coextended with the woods and the pond. All my senses sent their data not to the front office of the brain for analysis and criticism, but to a room far below, to the body’s mind. The squishy silt beneath my feet smelled of leaf rot, the wind of ozone. The hidden throng of peepers rang from all quarters. The cold sun struck me in the back of the neck.
My fly line lay coiled in the black water. I threw it behind me, threw it forward, letting a few yards out, then cast backward again.
I had no awareness of future or past. I had forgotten everything I knew. My pores were soaked with the place.
The fly shot out, settled on the pond, and sank beneath the stippled surface. Nothing emanated from me but one thing, a passion that rose from the bottom of my lungs and out my throat into the whistling air: it was the bottomless desire, in the bottomless present, to catch a fish. I stripped the line once between the fingers of my right hand.
The line jerked and went taut. And I yanked up on the rod. And the line dived. I stripped again and drew up the rod. The pond cracked.
And a trout pitched itself out of the water and screwed through the yellow air.

By SALVATORE SCIBONA; Provincetown, Mass., June 28, 2009
Salvatore Scibona is the author of “The End.”


How can you stay sharp into old age? It is not just a matter of winning the genetic lottery. What you do can make a difference.
As everybody knows, if you do not work out, your muscles get flaccid. What most people don’t realize, however, is that your brain also stays in better shape when you exercise.

And not just by mental challenges such as learning a new language, doing difficult crosswords or taking on other intellectually stimulating tasks. As researchers are finding, physical exercise is critical to vigorous mental health, too.

Surprised? Although the idea of exercising cognitive machinery by performing mentally demanding activities—popularly termed the “use it or lose it” hypothesis—is better known, a review of dozens of studies shows that maintaining a mental edge requires more than that.

Other things you do—including participating in activities that make you think, getting regular exercise, staying socially engaged and even having a positive attitude—have a meaningful influence on how effective your cognitive functioning will be in old age.

Further, the older brain is more plastic than is commonly known. At one time, the accepted stereotype was that “old dogs can’t learn new tricks.” Science has proved that this dictum must be discarded.

Although older adults generally learn new pursuits more slowly than younger people do and cannot reach the peaks of expertise in a given field that they might have achieved if they had started in their youth, they nonetheless can improve their cognitive performance through effort—forestalling some of the declines in cognition that come with advancing age.

As John Adams, one of the founding fathers and the second U.S. president, put it: “Old minds are like old horses; you must exercise them if you wish to keep them in working order.”
The news comes at a propitious time.

The proportion of older adults in the U.S. and in other industrial nations continues to grow: in 1900, 4.1 percent of U.S. citizens were older than 65, but by 2000 that amount had jumped to 12.6 percent; by 2030, 20 percent of us will be in that category. From a societal point of view, prolonging independent functioning is both a desirable goal in itself and a way of deferring costs of long-term care.

For individuals, maintaining optimal cognitive functioning is worthwhile simply because it promises to enhance quality of life through the years.

Mental Training
How to keep minds keen over an entire life span is a question philosophers have mulled since the earliest writings on record. As Roman orator Cicero put it: “It is exercise alone that supports the spirits, and keeps the mind in vigor.”

Modern research in this field began in the 1970s and 1980s, with studies that demonstrated that healthy older adults can improve performance to a greater extent than had been previously assumed. The earlier research did not fully address certain questions, such as how long adults could retain the new skills they had acquired through training, whether those specifically developed skills would also positively influence other areas of cognition needed in everyday life, and whether the studies done with small numbers of subjects would be broadly applicable to most members of society.

The latest experiments confirm that cognitive training does show substantial benefits for older adults and that these effects can be relatively long-lasting. Around the turn of this past century the federal government’s National Institute on Aging funded a consortium of researchers to conduct a large-scale training study in a sample of older Americans.

In 2002 psychologist Karlene Ball of the University of Alabama at Birmingham and her colleagues published initial results on more than 2,500 individuals older than 65 who had received about 10 sessions of cognitive training. Participants were randomly assigned either to a cognitive-process training group to learn how to excel in one of three areas—memory, reasoning or visual search—or to a control group of subjects who did not receive training. At a follow-up two years later, the team randomly selected a set of the initial participants to get booster training prior to evaluation.

The results showed strong training-effect sizes in each group as compared with controls, along with a pattern of specificity in performance improvements. For example, individuals trained in visual search evinced strong gains in visual search performance but little improvement, relative to controls, on the memory and reasoning tests, a typical finding in training research. Data from retests five years later on the sample found that measurable training benefits were still present after the longer interval.

More impressive, however, are recent training studies that focus on what psychologists call executive function—how a person plans a strategic approach to a task, controls what is attended to, and manages the mind in the process. Unlike training that focuses on very specific skills, such as memorization strategies, training that aims to help people control how they think appears to work on broader skills that are helpful in many situations that require thinking.

For instance, psychologist Chandramallika Basak and her colleagues at the University of Illinois recently showed that training in a real-time strategy video game that demands planning and executive control not only improved game performance but enhanced performance on other tasks measuring aspects of executive control.

Other results suggest that psychologists are learning how to train higher-level skills that may have a broader effect on cognitive function.

You don’t have to have specialized training, however, to achieve cognitive gains or ward off cognitive decline. Everyday activities such as reading can help. We reviewed evidence on activity-related cognitive enrichment in more than a dozen studies.

In 2003 neuropsychologist Robert S. Wilson and his colleagues at Rush University Medical Center in Chicago recruited more than 4,000 elderly people from a geographically defined community and rated their frequency of participation in seven cognitive activities (for instance, reading magazines). At three-year intervals for a mean of nearly six years, participants completed an in-home interview that included brief tests of cognitive function. More frequent cognitive activity at the outset was associated with reduced rate of cognitive decline over time.

Getting Physical
Over the past decade several studies have underscored the link between physical activity and cognition. For instance, in a study published in 2001 neuropsychiatrist Kristine Yaffe of the University of California, San Francisco, and her colleagues recruited 5,925 women older than 65 at four different medical centers across the U.S.

The participants were all free of any physical disability that would limit their ability to walk or pursue other physical activities. The volunteers were also screened to ensure that they did not have a cognitive impairment. The researchers then assessed their physical activity by asking the women how many city blocks they walked and how many flights of stairs they climbed daily and gave them a questionnaire to fill out about their levels of participation in 33 different physical activities. After six to eight years, the researchers assessed the women’s level of cognitive function.

The most active women had a 30 percent lower risk of cognitive decline. Interestingly, walking distance was related to cognition, but walking speed was not. It seems that even moderate levels of physical activity can serve to limit declines in cognition in older adults.

Moderate movement is good, but toning your circulatory system with aerobic exercise may be the real key to brain fitness. In a 1995 study of 1,192 healthy 70- to 79-year-olds, cognitive neuroscientist Marilyn Albert of Johns Hopkins University and her colleagues measured cognition with a battery of tasks that took approximately 30 minutes to complete and included tests of language, verbal memory, nonverbal memory, conceptualization and visuospatial ability. They found that the best predictors of cognitive change over a two-year period included strenuous activity and peak pulmonary expiratory flow rate. In an investigation published in 2004 epidemiologist Jennifer Weuve of Harvard University and her colleagues also examined the relation between physical activity and cognitive change over a two-year period in 16,466 nurses older than 70.

Participants logged how much time they spent per week in a variety of physical activities (running, jogging, walking, hiking, racket sports, swimming, bicycling, aerobic dance) over the past year and provided self-reports of walking pace in minutes per mile. Weuve’s group observed a significant relation between energy expended in physical activities and cognition, across a large set of cognitive measures.

The research that we have described thus far has examined mental performance over relatively short periods—just several years. A few studies have begun to look at what happens over longer timescales. In 2003 psychiatrist Marcus Richards of University College London and his colleagues examined in a cohort of 1,919 men and women the influence of self-reported physical exercise and leisure-time activities at age 36 on memory at age 43 and on memory change from ages 43 to 53.

Analyses indicated that engagement in physical exercise and other leisure-time activities at 36 was associated with higher memory scores at 43. Physical activity at 36 was also associated with a slower rate of memory decline from 43 to 53 years of age after adjusting for spare-time activity and other variables. The data also suggested little memory protection for those who stopped exercising after 36 but protection for those individuals who began to exercise after this time.

In 2005 then graduate student Suvi Rovio of the Karolinska Institute in Sweden and her colleagues examined the relation between physical activity at middle age and risk of dementia an average of 21 years later, when the cohort was between 65 and 79 years of age. Subjects indicated how often they participated in leisure-time physical activities that lasted at least 20 to 30 minutes and caused breathlessness and sweating. Conducting such activity at midlife at least twice a week was associated with a reduced risk of dementia in later life. Indeed, participants in the more active group had 52 percent lower odds of having dementia than the more sedentary group did.

Mind-Body Connection
It makes sense that training or participation in mentally stimulating activities would help cognition, but it is perhaps less immediately obvious why physical activity would have such an effect. Consider the increasingly well-documented link between physical activity and disease.

A plethora of studies have examined the health benefits of exercise and a nonsedentary lifestyle for prevention of disease. For example, we now know that physical activity reduces the risk of cardiovascular-related death, type 2 diabetes, colon and breast cancer, and osteoporosis. On the other hand, cardiovascular disease, diabetes and cancer have been associated with compromised cognition. Therefore, you might expect that increased physical activity and exercise would maintain cognition by reducing risk of diseases associated with cognitive decline.

In a study published in 2006 psychologist Stanley J. Colcombe of the University of Illinois and his colleagues examined the influence of fitness training on potential changes in brain structure. The six-month trial included 59 healthy but sedentary community-dwelling volunteers, age 60 to 79.

Brain scans after fitness training showed that even relatively short exercise interventions can begin to restore some of the losses in brain volume associated with normal aging.

Supporting these findings, a large body of nonhuman animal research has demonstrated a number of changes in brain structure and function after animals are exposed to enriched, or complex, environments. Enriched environments usually include running wheels, a multitude of toys and objects to climb that are changed frequently, and animal companions.

Exposure to such environments yields several physiological benefits. First, it increases the formation of new dendrite branches and synapses—the areas of neural cells that receive and send communication signals. It also increases the number of glial cells, which support the health of neurons, and expands the brain’s oxygen-supplying capillary network.

Enriched environments foster the development of new neurons and create a cascade of molecular and neurochemical changes, such as an increase in neurotrophins—molecules that protect and grow the brain.

Doing puzzles and push-ups is helpful for some—but other factors also boost mental fitness. For one, getting involved in social groups both improves cognition in general and seems to help thwart the arrival of dementia. The traditional focus of this research has been on relatively objective measures of social isolation versus connectedness, including the extent to which a person participates in activities that prominently involve social interaction (such as doing volunteer work), the number of friends and relatives an individual contacts regularly (in other words, the size of his or her social network), and marital status.

Findings about the positive aspects of attitudes and beliefs on adult cognition are spottier. In large part, positive beliefs and attitudes may have important indirect effects on cognitive enrichment because of their influence on the kinds of behaviors (for instance, exercise and mentally stimulating activities) that are known to be associated with cognitive enrichment.

More generally, individuals who are optimistic, agreeable, open to new experiences, conscientious, positively motivated and goal-directed are more likely to undergo successful aging, to take advantage of opportunities, to cope more effectively with life circumstances, to effectively regulate emotional reactions to events, and to maintain a sense of well-being and life satisfaction in the face of challenge.

And just as maintaining some activity patterns in old age may reduce risk of cognitive decline, the persistence of other patterns of behavior may actually increase the risk. Chronic psychological distress—resulting from depression, anxiety and negative emotions such as anger and shame—is associated with a variety of negative outcomes in adulthood, including cognitive decline. The tendency to experience psychological distress is often called neuroticism. Studies have consistently found a higher level of neuroticism to be linked to an increased incidence of Alzheimer’s disease and mild cognitive impairment in old age.
Enriching Cognition
Clearly, there is no magic pill or one-shot vaccine that inoculates the individual against cognitive decline in old age. Thus, public policy regarding cognitive enrichment should follow a health prevention model. Policy leaders might promote intellectual activities that are inherently meaningful for older adults, perhaps as embedded in larger social contexts (for example, the Elderhostel movement or adult continuing education). A critical issue for future research will be to understand how an engaged way of life can be promoted and implemented in midlife, during the working years. Given inevitable conflicts between work demands and time available for other roles (parenting, for one) and activities, it would be useful to know whether work-related activity programs (such as availability and use of physical exercise facilities at or near the workplace) could help foster an enriching lifestyle.
At the same time, the public must be aware that there is still much that is not known about cognitive fitness in old age, as well as some controversy about the magnitude and durability of mental exercise outcomes. People are beginning to market computer games and other means of exercising the mind, often making strong claims about the effectiveness of expensive products that have not been backed by actual scientific studies. Consumers should look for evidence demonstrating the benefits of any such products, which may not necessarily incorporate all the features needed to enhance mental fitness in old age.
The next decades offer much promise for expanding our knowledge about aging and cognition. We may soon discover whether the limits on successful cognitive functioning in old age that were once seen as insurmountable can ultimately be viewed as pessimistic assumptions that focused on observable age-related decline rather than the potential for maximizing human performance through cognitive enrichment. Just as advances in medical science may lead to increased longevity through vehicles such as effective treatments for dementia-causing illnesses, advances in psychological science can make important contributions to improving the quality of life of long-living older adults, in part by empirically demonstrating that attitudes and behaviors can promote cognitive functioning in old age and, more generally, by showing how behavioral interventions can help us all age successfully.

* By Christopher Hertzog, Arthur F. Kramer, Robert S. Wilson and Ulman Lindenberger