Friday, March 22nd, 2013


When Mort Zuckerman, the New York City real-estate and media mogul, lavished $200 million on Columbia University in December to endow the Mortimer B. Zuckerman Mind Brain Behavior Institute, he did so with fanfare suitable to the occasion: the press conference was attended by two Nobel laureates, the president of the university, the mayor, and journalists from some of New York’s major media outlets.


Many of the 12 other individual charitable gifts that topped $100 million in the U.S. last year were showered with similar attention: $150 million from Carl Icahn to the Mount Sinai School of Medicine, $125 million from Phil Knight to the Oregon Health & Science University, and $300 million from Paul Allen to the Allen Institute for Brain Science in Seattle, among them. If you scanned the press releases, or drove past the many university buildings, symphony halls, institutes, and stadiums named for their benefactors, or for that matter read the histories of grand giving by the Rockefellers, Carnegies, Stanfords, and Dukes, you would be forgiven for thinking that the story of charity in this country is a story of epic generosity on the part of the American rich.


It is not. One of the most surprising, and perhaps confounding, facts of charity in America is that the people who can least afford to give are the ones who donate the greatest percentage of their income. In 2011, the wealthiest Americans—those with earnings in the top 20 percent—contributed on average 1.3 percent of their income to charity. By comparison, Americans at the base of the income pyramid—those in the bottom 20 percent—donated 3.2 percent of their income. The relative generosity of lower-income Americans is accentuated by the fact that, unlike middle-class and wealthy donors, most of them cannot take advantage of the charitable tax deduction, because they do not itemize deductions on their income-tax returns.


But why? Lower-income Americans are presumably no more intrinsically generous (or “prosocial,” as the sociologists say) than anyone else. However, some experts have speculated that the wealthy may be less generous—that the personal drive to accumulate wealth may be inconsistent with the idea of communal support. Last year, Paul Piff, a psychologist at UC Berkeley, published research that correlated wealth with an increase in unethical behavior: “While having money doesn’t necessarily make anybody anything,” Piff later told New York magazine, “the rich are way more likely to prioritize their own self-interests above the interests of other people.” They are, he continued, “more likely to exhibit characteristics that we would stereotypically associate with, say, assholes.” Colorful statements aside, Piff’s research on the giving habits of different social classes—while not directly refuting the asshole theory—suggests that other, more complex factors are at work. In a series of controlled experiments, lower-income people and people who identified themselves as being on a relatively low social rung were consistently more generous with limited goods than upper-class participants were. Notably, though, when both groups were exposed to a sympathy-eliciting video on child poverty, the compassion of the wealthier group began to rise, and the groups’ willingness to help others became almost identical.



If Piff’s research suggests that exposure to need drives generous behavior, could it be that the isolation of wealthy Americans from those in need is a cause of their relative stinginess? Patrick Rooney, the associate dean at the Indiana University School of Philanthropy, told me that greater exposure to and identification with the challenges of meeting basic needs may create “higher empathy” among lower-income donors. His view is supported by a recent study by The Chronicle of Philanthropy, in which researchers analyzed giving habits across all American ZIP codes. Consistent with previous studies, they found that less affluent ZIP codes gave relatively more. Around Washington, D.C., for instance, middle- and lower-income neighborhoods, such as Suitland and Capitol Heights in Prince George’s County, Maryland, gave proportionally more than the tony neighborhoods of Bethesda, Maryland, and McLean, Virginia. But the researchers also found something else: differences in behavior among wealthy households, depending on the type of neighborhood they lived in. Wealthy people who lived in homogeneously affluent areas—areas where more than 40 percent of households earned at least $200,000 a year—were less generous than comparably wealthy people who lived in more socioeconomically diverse surroundings. It seems that insulation from people in need may dampen the charitable impulse.


Wealth affects not only how much money is given but to whom it is given. The poor tend to give to religious organizations and social-service charities, while the wealthy prefer to support colleges and universities, arts organizations, and museums. Of the 50 largest individual gifts to public charities in 2012, 34 went to educational institutions, the vast majority of them colleges and universities, like Harvard, Columbia, and Berkeley, that cater to the nation’s and the world’s elite. Museums and arts organizations such as the Metropolitan Museum of Art received nine of these major gifts, with the remaining donations spread among medical facilities and fashionable charities like the Central Park Conservancy. Not a single one of them went to a social-service organization or to a charity that principally serves the poor and the dispossessed. More gifts in this group went to elite prep schools (one, to the Hackley School in Tarrytown, New York) than to any of our nation’s largest social-service organizations, including United Way, the Salvation Army, and Feeding America (which got, among them, zero).


Underlying our charity system—and our tax code—is the premise that individuals will make better decisions regarding social investments than will our representative government. Other developed countries have a very different arrangement, with significantly higher individual tax rates and stronger social safety nets, and significantly lower charitable-contribution rates. We have always made a virtue of individual philanthropy, and Americans tend to see our large, independent charitable sector as crucial to our country’s public spirit. There is much to admire in our approach to charity, such as the social capital that is built by individual participation and volunteerism. But our charity system is also fundamentally regressive, and works in favor of the institutions of the elite. The pity is, most people still likely believe that, as Michael Bloomberg once said, “there’s a connection between being generous and being successful.” There is a connection, but probably not the one we have supposed.

 

By Ken Stern. Stern’s book, With Charity for All: Why Charities Are Failing and a Better Way to Give, was published in February 2013.

Elizabeth Warren said that a much higher baseline would be appropriate if wages were tied to productivity gains.

Image: Office worker (Digital Vision/Getty Images)

What if U.S. workers were paid more as the nation’s productivity increased?
If we had adopted that policy decades ago, the minimum wage would now be about $22 an hour, said Sen. Elizabeth Warren (D-Mass.) last week. Warren was speaking at a hearing held by the Senate’s Committee on Health, Education, Labor and Pensions.

Warren was talking to Arindrajit Dube, a University of Massachusetts Amherst professor who has studied the issue of minimum wage. “With a minimum wage of $7.25 an hour, what happened to the other $14.75?” she asked Dube. “It sure didn’t go to the worker.”
The $22 minimum wage Warren referred to came from a 2012 study by the Center for Economic and Policy Research, which found that the minimum wage would have reached $21.72 an hour last year if it had been tied to the gains in worker productivity since 1968. Even if the minimum wage had captured only one-fourth of those productivity gains, it would now be $12.25 an hour instead of $7.25.
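To see how the indexing arithmetic works, the short Python sketch below scales a base-year real minimum wage by cumulative productivity growth, either fully or at one-fourth of the rate, which is the comparison behind the two figures above. The base wage and growth rate used here are illustrative placeholders chosen to land near the cited numbers, not the CEPR study’s actual data series.

```python
def indexed_minimum_wage(base_real_wage, productivity_growth, years, share=1.0):
    """Minimum wage after `years` of capturing `share` of annual productivity growth.

    base_real_wage:      inflation-adjusted minimum wage in the base year (e.g. 1968)
    productivity_growth: assumed average annual productivity growth rate (e.g. 0.0175)
    years:               number of years of indexing
    share:               fraction of productivity growth passed through to the wage
    """
    return base_real_wage * (1 + share * productivity_growth) ** years

# Illustrative inputs only -- not the CEPR study's actual productivity series.
print(round(indexed_minimum_wage(10.10, 0.0175, 44), 2))              # full indexing, ~21.67
print(round(indexed_minimum_wage(10.10, 0.0175, 44, share=0.25), 2))  # one-fourth, ~12.24
```

With inputs in that ballpark, full indexing lands near the roughly $22 figure Warren cited and one-fourth indexing near $12.25; the CEPR analysis itself presumably works from measured productivity series rather than a single smoothed growth rate.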
Some of the news media took this to mean that Warren is calling for a minimum-wage increase to $22 an hour. That doesn’t appear to be the case. She seems to be merely pointing out that the minimum wage has grown more slowly than other facets of the economy.
Warren is taking some hits on Twitter for her comments. One user describes her as “clueless and out of touch” while another calls her “delusional.” But other users are praising her arguments as “compelling,” saying she is “asking the right questions regarding minimum wage.”
By Kim Peterson

Folks ages 29 to 37 have watched their net worth plummet over the past few decades, and there are several reasons.

Even after a damaging economic crisis and recession, some American households are recovering nicely. If you’re 47 years old or older, you’ve actually done pretty well in the past few decades.
But younger generations are just getting poorer, according to a new study from the Urban Institute. People younger than 47 just haven’t been able to accumulate much money or build up their net worth through homebuying or other investments.
The authors looked at how Americans’ average net worth has changed from 1983 through 2010 and found a dramatic difference between older and younger generations.
Those 56 to 64 and those older than 74 more than doubled their net worth, with gains of 120% and 149%, respectively. The picture was still rosy for those 47 to 55 and 65 to 73, with a rise in net worth of 76% and 79%.
But all that progress comes to a halt with younger generations. Those 38 to 46 saw their net worth rise by just 26%, and those 29 to 37 saw their net worth drop by 21%.
Why are young people getting left behind? One of the study’s authors, Gene Steuerle, says there are several factors:
The housing bubble. Younger homeowners were more likely to have the steepest mortgage balances and the least home equity built up. Consider a home that fell in value by 20%, Steuerle writes: a younger owner with only 20% equity would see a 100% drop in housing net worth, but an older owner with the mortgage paid off would see only a 20% drop (see the sketch after this list).
The stock market. Older investors were more likely to invest in bonds and other assets that have recovered or gone up in value since the Great Recession.
Lower employment. Younger Americans are seeing higher unemployment rates. They’re also seeing lower relative minimum wages.
Less savings. Younger people are seeing a bigger cut of their pay taken out to pay for Social Security and health care.
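The leverage arithmetic in Steuerle’s housing example is easy to check. The minimal Python sketch below works through the hypothetical case from the list above, a 20% price decline hitting an owner with 20% equity versus one who owns the home outright; the dollar value of the home is an arbitrary illustrative assumption.

```python
def equity_change_after_price_drop(home_value, equity_share, price_drop):
    """Percentage change in home equity after a decline in the home's market price.

    home_value:   initial market value of the home
    equity_share: fraction of that value held as equity (the rest is mortgage debt)
    price_drop:   fractional decline in the home's market price
    """
    equity_before = home_value * equity_share
    mortgage = home_value - equity_before
    equity_after = home_value * (1 - price_drop) - mortgage
    return (equity_after - equity_before) / equity_before * 100

# Hypothetical $250,000 home and a 20% price decline (the dollar figure is illustrative).
print(round(equity_change_after_price_drop(250_000, 0.20, 0.20), 1))  # -100.0: younger owner, 20% equity
print(round(equity_change_after_price_drop(250_000, 1.00, 0.20), 1))  #  -20.0: older owner, home paid off
```

The point is leverage: the thinner the equity cushion, the more a given price decline is magnified in the owner’s net worth.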
“Maybe, more than just maybe, it’s time to think about investing in the young,” Steuerle writes.

By Kim Peterson

The Pentagon halted the use of mortar shells pending an investigation after a shell exploded during training in Nevada, killing eight Marines and injuring seven.


HAWTHORNE, Nevada — A mortar shell explosion killed eight U.S. Marines and injured seven more during mountain warfare training in the Nevada desert, prompting the Defense Department to halt the use of the weapons worldwide until an investigation can determine their safety, officials said Tuesday.

The explosion occurred Monday night at the Hawthorne Army Depot, a facility used by troops heading overseas. The rescue of the wounded Marines was complicated by the remoteness of the site, which is favored because the harsh geography simulates conditions in Afghanistan.

The mortar round exploded in its firing tube during the exercise, said Brigadier General Jim Lukeman at a news conference in North Carolina, where the Marines are based. He said investigators are trying to determine the cause of the malfunction.

The Pentagon expanded a temporary ban to prohibit the military from firing any 60mm mortar rounds until the results of the investigation are known. The Pentagon earlier had suspended the use of all high-explosive and illumination mortar rounds from the same manufacturing lots as the ones fired in Nevada.


It was not immediately clear whether more than a single round exploded, a Marine Corps official said, speaking on condition of anonymity because the official wasn’t authorized to speak about an ongoing investigation.

The Marine Corps said early Tuesday that seven Marines were killed. Eight men under the age of 30 were taken to Renown Regional Medical Center in Reno. One of them died, four were in serious condition, two were in fair condition and another was discharged, said spokesman Mark Earnest.

John Stroud, national junior vice commander in chief for the Veterans of Foreign Wars, began a memorial event in Hawthorne on Tuesday night by saying “one of the critical has passed,” bringing the death toll to eight. Mourners then laid eight floral arrangements at a park where a flag flew at half-staff within sight of the Hawthorne depot’s boundary.

Stroud said he spoke with Marine officers from Camp Lejeune who gave him the news before the ceremony. Messages left for a spokesman for the 2nd Marine Expeditionary Force were not immediately returned.

The force did issue a statement Tuesday evening saying an additional Marine has been reported as injured.

The identities of those killed won’t be released until 24 hours after their families are notified.

“We send our prayers and condolences to the families of Marines involved in this tragic incident,” said the 2nd Marine Expeditionary Force commander, Maj. Gen. Raymond C. Fox. “We mourn their loss, and it is with heavy hearts we remember their courage and sacrifice.”

The 60mm mortar traditionally requires three to four Marines to operate, but it’s common during training for others to observe nearby. The firing tube fires a shell some 14 inches (355 millimeters) in length.

The mortar has changed little since World War II and remains one of the simplest weapons to operate, which is why it is found at the lowest level of infantry units, said Joseph Trevithick, a mortar expert with GlobalSecurity.org.

Still, a number of things could go wrong, including a fuse malfunctioning, a problem with the barrel’s assembly or a round prematurely detonating inside the tube, Trevithick said.

The Marine Corps official said an explosion at the point of firing in a training exercise could kill or maim anyone inside or nearby the protective mortar pit and could concussively detonate any mortars stored nearby in a phenomenon known as “sympathetic detonation.”

The official said a worldwide moratorium after such an accident is not unusual and would persist until the investigation determines that the weapon did not malfunction in ways that would hurt other Marines or that mortars manufactured at the same time as the one involved in the accident were safe.

The moratorium could last for weeks or months.

The Hawthorne Army Depot stores and disposes of ammunition. It has held an important place in American military history since WWII, when it became the staging area for ammunition, bombs and rockets for the war.

Retired Nevada state archivist Guy Rocha said he was unaware of any other catastrophic event at the depot over the years it served as a munitions repository.

Associated Press writers Pauline Jelinek, Michelle Rindels and Ken Ritter contributed to this report.

Image: Abraham Lincoln with his bodyguard Allan Pinkerton and Maj. Gen. John A. McClernand in Antietam, Md., shortly after the battle there. (Time Life Pictures)

Post-service compensation costs for U.S. veterans have totaled more than $50 billion since 2003, a new study by The Associated Press shows.

If history is any judge, the U.S. government will be paying for the Iraq and Afghanistan wars for the next century as service members and their families grapple with the sacrifices of combat.

An Associated Press analysis of federal payment records found that the government is still making monthly payments to relatives of Civil War veterans — 148 years after the conflict ended.

At the 10-year anniversary of the start of the Iraq War, more than $40 billion a year is going to compensate veterans and survivors from the Spanish-American War from 1898, World War I and II, the Korean War, the Vietnam War, the two Iraq campaigns and the Afghanistan conflict. And those costs are rising rapidly.

U.S. Sen. Patty Murray said such expenses should remind the nation about war’s long-lasting financial toll.

“When we decide to go to war, we have to consciously be also thinking about the cost,” said Murray, D-Wash., adding that her WWII veteran father’s disability benefits helped feed their family.

Alan Simpson, a former Republican senator and veteran who co-chaired President Barack Obama’s deficit committee in 2010, said government leaders working to limit the national debt should make sure that survivors of veterans need the money they are receiving.

“Without question, I would affluence-test all of those people,” Simpson said.

Chart: U.S. government veteran expenditures. (AP; source: U.S. Department of Veterans Affairs)

With greater numbers of troops surviving combat injuries because of improvements in battlefield medicine and technology, the costs of disability payments are set to rise much higher.

The AP identified the disability and survivor benefits during an analysis of millions of federal payment records obtained under the Freedom of Information Act.

To gauge the postwar costs of each conflict, the AP looked at four compensation programs that identify recipients by war: disabled veterans; survivors of those who died on active duty or from a service-related disability; low-income wartime vets over age 65 or disabled; and low-income survivors of wartime veterans or their disabled children.

THE IRAQ WARS AND AFGHANISTAN

So far, the wars in Iraq, Afghanistan and the first Persian Gulf conflict in the early 1990s are costing about $12 billion a year to compensate those who have left military service or family members of those who have died.

Those post-service compensation costs have totaled more than $50 billion since 2003, not including expenses of medical care and other benefits provided to veterans, and are poised to grow for many years to come.

The new veterans are filing for disabilities at historic rates, with about 45 percent of those from Iraq and Afghanistan seeking compensation for injuries. Many are seeking compensation for a variety of ailments at once.

Experts see a variety of factors driving that surge, including a bad economy that’s led more jobless veterans to seek the financial benefits they’ve earned, troops who survive wounds of war, and more awareness about head trauma and mental health.

VIETNAM WAR

It’s been 40 years since the U.S. ended its involvement in the Vietnam War, and yet payments for the conflict are still rising.

Now above $22 billion annually, Vietnam compensation costs are roughly twice the size of the FBI’s annual budget. And while many disabled Vietnam vets have been compensated for post-traumatic stress disorder, hearing loss or general wounds, other ailments are positioning the war to have large costs even after veterans die.

Based on an uncertain link to the defoliant Agent Orange that was used in Vietnam, federal officials approved diabetes a decade ago as an ailment that qualifies for cash compensation — and it is now the most compensated ailment for Vietnam vets.

The VA also recently included heart disease among the Vietnam medical problems that qualify, and the agency is seeing thousands of new claims for that condition. Simpson said he has a lot of concerns about the government agreeing to automatically compensate for those diseases.

 

“That has been terribly abused,” Simpson said.

Since heart disease is common among older Americans and is the nation’s leading cause of death, the future deaths of thousands of Vietnam veterans could be linked to their service and their benefits passed along to survivors.

A congressional analysis estimated the cost of fighting the war was $738 billion in 2011 dollars, and the postwar benefits for veterans and families have separately cost some $270 billion since 1970, according to AP calculations.

WORLD WAR I, WORLD WAR II AND THE KOREAN WAR

World War I, which ended 94 years ago, continues to cost taxpayers about $20 million every year. World War II? $5 billion.

Compensation for WWII veterans and families didn’t peak until 1991 — 46 years after the war ended — and annual costs since then have declined by only about 25 percent. Korean War costs appear to be leveling off at about $2.8 billion per year.

Of the 2,289 survivors drawing cash linked to WWI, about one-third are spouses, and dozens of them are more than 100 years old.

Some of the other recipients are curious: Forty-seven of the spouses are under the age of 80, meaning they weren’t born until years after the war ended. Many of those women were in their 20s and 30s when their aging spouses died in the 1960s and 1970s, and they’ve been drawing the monthly payments since.

CIVIL WAR AND SPANISH-AMERICAN WAR

There are 10 living recipients of benefits tied to the 1898 Spanish-American War at a total cost of about $50,000 per year. The Civil War payments are going to two children of veterans — one in North Carolina and one in Tennessee— each for $876 per year.

Surviving spouses can qualify for lifetime benefits when troops from current wars have a service-linked death. Children under the age of 18 can also qualify, and those benefits are extended for a lifetime if the person is permanently incapable of self-support due to a disability before the age of 18.

Citing privacy, officials did not disclose the names of the two children getting the Civil War benefits.

Their ages suggest the one in Tennessee was born around 1920 and the North Carolina survivor was born around 1930. A veteran who was young during the Civil War would likely have been roughly 70 or 80 years old when the two people were born.

That’s not unheard of. At age 86, Juanita Tudor Lowrey is the daughter of a Civil War veteran. Her father, Hugh Tudor, fought in the Union army. After his first wife died, Tudor, then 73, married Lowrey’s 33-year-old mother in 1920. Lowrey was born in 1926.

Lowrey, who lives in Kearney, Mo., suspects the marriage might have been one of convenience, with her father looking for a housekeeper and her mother looking for some security. He died a couple years after she was born, and Lowrey received pension benefits until she was 18.

Now, Lowrey said, she usually encounters skepticism from people after she tells them she’s a daughter of a Civil War veteran.

“We’re few and far between,” Lowrey said.

Weighty choices can be shifted by surprising factors.


Imagine you’re standing on a footbridge over some trolley tracks. Below you, an out-of-control trolley is bearing down on five unaware individuals standing on the track. Standing next to you is a large man. You realize that the only way to prevent the five people from being killed by the trolley is to push the man off the bridge, into the path of the trolley. His body would stop the trolley, saving the lives of the five people further down the track.

What would you do? Would you push the man to save the others? Or would you stand by and watch five people die, knowing that you could have saved them? Regardless of which option you choose, you no doubt believe that it will reflect your deeply held personal convictions, not trifles such as your mood.

Well, think again. In a paper published in the March edition of the journal Cognition, a group of German researchers have shown that people’s mood can strongly influence how they respond to this hypothetical scenario. Though this general observation is well-known in the literature on moral judgments and decision making, the current paper helps to resolve a question which has long lurked in the background. That is, how does this happen? What is the mechanism through which moods influence our moral decisions?

Early research showed a difference between personal moral decisions, such as the footbridge problem above, and impersonal moral decisions, such as whether to keep money found in a lost wallet. Areas of the brain usually characterized as responsible for processing emotional information seemed to be more strongly engaged when making these personal as opposed to impersonal moral decisions, they found. These scientists concluded that emotions were playing a strong role in these personal moral judgments while the more calculating, reasoning part of our mind was taking a siesta.

Unfortunately, given the various shortcomings of previous investigations on this particular topic, there are a variety of other explanations for the observation that emotions, or the more general emotional states known as moods, affect how people may respond to the footbridge scenario.

For example, moods could influence the thought process itself. This is the “moral thought” hypothesis: just as something like attention may change our thought process by biasing how we perceive two choices, mood could also bias our thought process, resulting in different patterns of moral thinking. This is different from the “moral emotion” hypothesis, which suggests that emotions directly change how we feel about the moral choice. That is, our good mood could make us feel better (or worse) about potentially pushing, and therefore more (or less) likely to do it. Resolving this ambiguity with neuroimaging studies such as the one described above is difficult because of fMRI’s low temporal resolution – a brain scan is like a photograph taken with the exposure set to a couple of seconds. This makes it difficult to faithfully capture events that happen quickly, such as whether moods change the experience of the decision or directly influence the thought process.

To test these competing ideas, participants were first put into a specific mood by listening to music and writing down an autobiographical memory. Those in the positive mood condition listened to Mozart’s Eine kleine Nachtmusik and wrote down a positive memory, while those in the negative mood condition listened to Barber’s Adagio for Strings, Opus 11 and wrote down a negative memory. The participants in the neutral mood condition listened to Kraftwerk’s Pocket Calculator and wrote about a neutral memory.

After this mood induction procedure, participants were then presented with the trolley scenario. Some participants were asked: “Do you think it is appropriate to be active and push the man?” while others were asked “Do you think it is appropriate to be passive and not push the man?”.

Participants in a positive mood were more inclined to agree to the question, regardless of which way it was asked. If asked if it was okay to push, they were more likely to push. If asked if it was okay not to push, they were more likely to not push. The opposite pattern was found for those in a negative mood.

If mood directly changed our experience of potentially pushing — the moral emotion hypothesis — then putting people in a positive mood should have made them more likely to push, no matter how the question was asked. The ‘moral thought’ hypothesis, on the other hand, accounts for these results quite nicely. Specifically, it is known from previous research that positive moods validate accessible thoughts, and negative moods invalidate accessible thoughts. So, for example, if I ask you if it’s okay to push, you will begin to consider the act of pushing, making this thought accessible. If you’re in a positive mood, that mood acts on this thought process by making you more likely to feel as though this is an acceptable behavior – it validates the thought of pushing. On the other hand, if I were to ask if it is okay to not push, the positive mood should validate the thought of not pushing, leading you to feel like not pushing is an acceptable behavior. Negative mood, which invalidates accessible thought, has a parallel effect, but in the opposite direction. Thus, this idea fits well with the observed pattern of results in this experiment.

These findings raise some further questions, some of which psychologists have been attempting to answer for a long time. Emotions and logical thought are frequently portrayed as competing processes, with emotions depicted as getting in the way of effective decision-making. The results here are another demonstration that instead of competing, our emotions and our cognitions interact and work closely to determine our behaviors. In fact, some researchers have recently begun to suggest that the division between these two is rather tough to make, and there may not actually be any meaningful difference between thought and emotion. After all, if moods and emotions play a fundamental role in information processing, what differentiates them on a functional level from other basic kinds of cognitive processes, such as attention or memory? This paper obviously doesn’t resolve this issue, but it is certainly another piece of the puzzle.

It would also be exciting, as the authors say, to see how more specific emotions might influence our moral decision-making. Anger and sadness are both negative emotions, but differ in important ways. Could these subtle differences also lead to differences in how we make moral judgments?

This paper demonstrates that our professed moral principles can be shifted by subtle differences in mood and how a question is posed. Though there are plenty of implications for our daily lives, one that arguably screams the loudest concerns the yawning gap between how humans actually think and behave, and how the legal system pretends they think and behave. The relative rigidity of western law stands in stark contrast to the plasticity of human thought and behavior. If a simple difference in mood changes how likely one person is to throw another over a footbridge, then does this imply that the law should account for a wider variety of situational factors than it does presently? Regardless of how you feel, it is clear that this paper, and behavioral science in general, should contribute to the decision. Having a legal system based on reality is far preferable to one based on fantasy.

 

By Travis Riddle (March 2013)

ABOUT THE AUTHOR(S)

Travis Riddle is a doctoral student in the psychology department at Columbia University. His work in the Sparrow Lab focuses on the sense of control people have over their thoughts and actions, and the perceptual and self-regulatory consequences of this sense of control.