The US National Security Agency (NSA) has upset a great many people this year. Since June, newspapers have been using documents leaked by former intelligence worker Edward Snowden to show how the secretive but powerful agency has spied on the communications of US citizens and foreign governments. Last month, the media reported that the NSA, which is based in Fort Meade, Maryland, had undermined Internet security standards. The revelations have sparked international outrage at the highest levels — even the president of Brazil cancelled a visit to the United States because of the spying.



Yet amid the uproar, NSA-supported mathematicians and computer scientists have remained mostly quiet, to the growing frustration of others in similar fields. “Most have never met a funding source they do not like,” says Phillip Rogaway, a computer scientist at the University of California, Davis, who has sworn not to accept NSA funding and is critical of other researchers’ silence. “And most of us have little sense of social responsibility.”

Mathematicians and the NSA are certainly interdependent. The agency declares that it is the United States’ largest maths employer, and Samuel Rankin, director of the Washington DC office of the American Mathematical Society, estimates that the agency hires 30–40 mathematicians every year. The NSA routinely holds job fairs on university campuses, and academic researchers can work at the agency on sabbaticals. In 2013, the agency’s mathematical sciences programme offered more than US$3.3 million in research grants.

Furthermore, the NSA has designated more than 150 colleges and universities as centres of excellence, which qualifies students and faculty members for extra support. It can also fund research indirectly through other agencies, and so the total amount of support may be much higher. A leaked budget document says that the NSA spends more than $400 million a year on research and technology — although only a fraction of this money might go to research outside the agency itself.


Many US researchers, especially those towards the basic-research end of the spectrum, are comfortable with the NSA’s need for their expertise. Christopher Monroe, a physicist at the University of Maryland in College Park, is among them. He previously had an NSA grant for basic research on controlling cold atoms, which can form the basis of the qubits of information in quantum computers. He notes that he is free to publish in the open literature, and he has no problems with the NSA research facilities in physical sciences, telecommunications and languages that sit on his campus. Monroe is sympathetic to the NSA’s need to track the development of quantum computers that could one day be used to crack codes beyond the ability of conventional machines. “I understand what’s in the newspapers,” he says, “but the NSA is funding serious long-term fundamental research and I’m happy they’re doing it.”

Dena Tsamitis, director of education, outreach and training at Carnegie Mellon University’s cybersecurity research centre in Pittsburgh, Pennsylvania, also wants to maintain the relationship. She oversees visitors and recruiters from the NSA but her centre gets no direct funding. She says that her graduate students understand the NSA’s public surveillance to be “a policy decision, not a technology decision. Our students are most interested in the technology.” And the NSA, she says — echoing many other researchers — “has very interesting technology problems”.


The academics who are professionally uneasy with the NSA tend to lie on the applied end of the spectrum: they work on computer security and cryptography rather than pure mathematics and basic physics. Matthew Green, a cryptographer at Johns Hopkins University in Baltimore, Maryland, says that these researchers are unsettled in part because they are dependent on protocols developed by the US National Institute of Standards and Technology (NIST) to govern most encrypted web traffic. When it was revealed that the NSA had inserted a ‘back door’ into the NIST standards to allow snooping, some of them felt betrayed. “We certainly had no idea that they were tampering with products or standards,” says Green. He is one of 47 technologists who on 4 October sent a letter to the director of a group created last month by US President Barack Obama to review NSA practices, protesting that the group does not include any independent technologists.

Edward Felten, who studies computer security at Princeton University in New Jersey, says that the NSA’s breach of security standards means that cryptographers will need to change what they call their threat model — the set of assumptions about possible attacks to guard against. Now the attacks might come from the home team. “There was a sense of certain lines that NSA wouldn’t cross,” says Felten, “and now we’re not so sure about that.”


Ann Finkbeiner, Nature,  October 8, 2013

The first permanent British settlers in North America turned to cannibalism to survive harsh conditions, finds an analysis of human remains with sharp cuts and chopping blows.


Excavated last year from a dump at James Fort in Jamestown, Va., the fragmented remains belonged to a 14-year-old girl and date back to the “starving time” winter of 1609-1610, when three-quarters of the colonists died.


Found with several butchered horse and dog bones, the skeletal remains — a tibia (shin bone) and a skull — featured a series of marks that provide grisly evidence of the dead girl becoming food for the starving colonists.


The researchers were first struck by four shallow chops to the forehead, which indicate a hesitant, failed attempt to open the skull.

“The bone fragments have unusually patterned cuts and chops that reflect tentativeness, trial and complete lack of experience in butchering animal remains,” Doug Owsley, a forensic anthropologist at the Smithsonian National Museum of Natural History in Washington, D.C., said in a statement.


“Nevertheless, the clear intent was to dismember the body, removing the brain and flesh from the face for consumption,” he added.


At last the attempt succeeded. A series of deep, forceful chops from a small hatchet or cleaver to the back of the head split the skull open. Flesh was removed from the face and throat using a knife, as revealed by sharp cuts and punctures marking the sides and bottom of the mandible.


The highly fragmented skeleton did not allow the researchers to establish the girl’s cause of death, although a combination of digital and medical technologies made it possible to reconstruct her likeness.

The research team has named her “Jane.”


Based on the anthropological evidence of her diet and the archaeological layer where her remains were found, Owsley and colleagues believe “Jane” arrived in Jamestown in August 1609, just months before the deadly “starving time” began.

According to Jim Horn, Colonial Williamsburg’s vice president of research and historical interpretation and an expert on Jamestown history, the “starving time” was brought about by a series of disasters that struck the community two years after it was established in 1607. These included disease, a serious shortage of provisions, and a siege by the native Powhatan tribes.

“Survival cannibalism was a last resort; a desperate means of prolonging life at a time when the settlement teetered on the brink of extinction,” Horn said.


Of about 300 English settlers living at James Fort in the winter of 1609, only about 60 survived to the spring.

The researchers believe it’s likely that Jane wasn’t a lone case and several other dead bodies were cannibalized.

Indeed, numerous accounts describing cannibalism surfaced among the survivors soon after Lord De La Warr saved Jamestown by sailing into the settlement with food and new colonists.


The facial reconstruction of Jane will go on display on May 3 in the exhibition “Written in Bone: Forensic Files of the 17th Century Chesapeake” at the National Museum of Natural History.



Food for hungry mouths, feed for animals headed to the slaughterhouse, fiber for clothing and even, in some cases, fuel for vehicles—all derive from global agriculture. As a result, in the world’s temperate climes human agriculture has supplanted 70 percent of grasslands, 50 percent of savannas and 45 percent of temperate forests. Farming is also the leading cause of deforestation in the tropics and one of the largest sources of greenhouse gas emissions, a major contributor to the ongoing loss of species known as the “sixth extinction,” and a perennial source of nonrenewable groundwater mining and water pollution.

To restrain the environmental impact of agriculture as well as produce more wholesome foods, some farmers have turned to so-called organic techniques. This type of farming is meant to minimize environmental and human health impacts by avoiding the use of synthetic fertilizers, chemical pesticides and hormones or antibiotic treatments for livestock, among other tactics. But the use of industrial technologies, particularly synthetic nitrogen fertilizer, has fed the swelling human population during the last century. Can organic agriculture feed a world of nine billion people?

In a bid to bring clarity to what has too often been an emotional debate, environmental scientists at McGill University in Montreal and the University of Minnesota performed an analysis of 66 studies comparing conventional and organic methods across 34 different crop species. “We found that, overall, organic yields are considerably lower than conventional yields,” explains McGill’s Verena Seufert, lead author of the study to be published in Nature on April 26. (Scientific American is part of Nature Publishing Group.) “But, this yield difference varies across different conditions. When farmers apply best management practices, organic systems, for example, perform relatively better.”

In particular, organic agriculture delivers just 5 percent less yield in rain-watered legume crops, such as alfalfa or beans, and in perennial crops, such as fruit trees. But when it comes to major cereal crops, such as corn or wheat, and vegetables, such as broccoli, conventional methods deliver more than 25 percent more yield.

The key limit to further yield increases via organic methods appears to be nitrogen—large doses of synthetic fertilizer can keep up with high demand from crops during the growing season better than the slow release from compost, manure or nitrogen-fixing cover crops. Of course, the cost of using 171 million metric tons of synthetic nitrogen fertilizer is paid in dead zones at the mouths of many of the world’s rivers. These anoxic zones result from nitrogen-rich runoff promoting algal blooms that then die and, in decomposing, suck all the oxygen out of surrounding waters. “To address the problem of [nitrogen] limitation and to produce high yields, organic farmers should use best management practices, supply more organic fertilizers or grow legumes or perennial crops,” Seufert says.

In fact, more knowledge would be key to any effort to boost organic farming or its yields. Conventional farming requires knowledge of how to manage what farmers know as inputs—synthetic fertilizer, chemical pesticides and the like—as well as fields laid out precisely via global-positioning systems. Organic farmers, on the other hand, must learn to manage an entire ecosystem geared to producing food—controlling pests through biological means, using the waste from animals to fertilize fields and even growing one crop amidst another. “Organic farming is a very knowledge-intensive farming system,” Seufert notes. An organic farmer “needs to create a fertile soil that provides sufficient nutrients at the right time when the crops need them. The same is true for pest management.”

But the end result is a healthier soil, which may prove vital in efforts to make it more resilient in the face of climate change as well as conserve it. Organic soils, for example, retain water better than soils on farms that employ conventional methods. “You use a lot more water [in irrigation] because the soil doesn’t have the capacity to retain the water you use,” noted farmer Fred Kirschenmann, president of Stone Barns Center for Food and Agriculture, at the “Feeding the World While the Earth Cooks” event at the New America Foundation in Washington, D.C., on April 12.

At the same time, a still-growing human population requires more food, which has led some to propose further intensifying conventional methods of applying fertilizer and pesticides to specially bred crops, enabling either a second Green Revolution or improved yields from farmlands currently under cultivation. Crops genetically modified to endure drought may also play a role as well as efforts to develop perennial versions of annual staple crops, such as wheat, which could help reduce environmental impacts and improve soil. “Increasing salt, drought or heat tolerance of our existing crops can move them a little but not a lot,” said biologist Nina Fedoroff of Pennsylvania State University at the New America event. “That won’t be enough.”

And breeding new perennial versions of staple crops would require compressing millennia of crop improvements that resulted in the high-yielding wheat varieties of today, such as the dwarf wheat created by breeder Norman Borlaug and his colleagues in the 1950s, into a span of years while changing the fundamental character of wheat from an annual crop to a perennial one. Then there is the profit motive. “The private sector is not likely to embrace an idea like perennial crop seeds, which do not require the continued purchase of seeds and thus do not provide a very good source of profit,” Seufert notes.

Regardless, the world already produces 22 trillion calories annually via agriculture, enough to provide more than 3,000 calories to every person on the planet. The food problem is one of distribution and waste—whether the latter is food spoilage during harvest, in storage or even after purchase. According to the Grocery Manufacturers Association, in the U.S. alone, 215 meals per person go to waste annually.
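The per-person figure above can be checked with simple arithmetic. A hedged sketch follows: it assumes the 22-trillion-calorie figure describes daily output, since roughly 7 billion people times 3,000 calories each comes to about 21 trillion calories per day; the population value is an approximation for 2012.

```python
# Sanity check on the calorie figures quoted above.
# Assumptions (not stated in the article): the 22-trillion-calorie
# figure is daily output, and world population is ~7 billion (2012).
total_calories = 22e12          # kcal produced (assumed per day)
population = 7e9                # approximate world population

per_person = total_calories / population
print(round(per_person))        # prints 3143, i.e. over 3,000 kcal per person
```

Under those assumptions the figure works out to a little over 3,000 calories per person, consistent with the claim in the text.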

“Since the world already produces more than enough food to feed everyone well, there are other important considerations” besides yield, argues ecologist Catherine Badgley of the University of Michigan, who also compared yields from organic and conventional methods in a 2006 study that found similar results. Those range from environmental impacts of various practices to the number of people employed in farming. As it stands, conventional agriculture relies on cheap energy, cheap labor and other unsustainable practices. “Anyone who thinks we will be using Roundup [a herbicide] in eight [thousand] to 10,000 years is foolish,” argued organic evangelist Jeff Moyer, farm director at the Rodale Institute, at the New America Foundation event.

But there is unlikely to be a simple solution. Instead the best farming practices will vary from crop to crop and place to place. Building healthier soils, however, will be key everywhere. “Current conventional agriculture is one of the major threats to the environment and degrades the very natural resources it depends on. We thus need to change the way we produce our food,” Seufert argues. “Given the current precarious situation of agriculture, we should assess many alternative management systems, including conventional, organic, other agro-ecological and possibly hybrid systems to identify the best options to improve the way we produce our food.”


By David Biello  | S.A. / April 25, 2012