Neuroscience and National Security: The Complex Relationship between Science and the Military

The military commonly enlists science in its efforts. But when the science concerns the human mind itself, the relationship gets a little stickier.

Neuroscience and national security go together somewhat uneasily. Stick the two in a single sentence, and University of Pennsylvania historian Jonathan Moreno starts getting e-mails from all kinds of people who are sure they’ve been brainwashed by the CIA. (It might not help his inbox that he wrote a book called Mind Wars: Brain Research and National Defense.)

“It’s hard to talk about these issues in part because we have kind of a paranoid popular-culture background,” Moreno said. Maybe you’ve seen The Manchurian Candidate, or, more recently, The Men Who Stare at Goats.

Neuroscience and national security, though, sit at the forefront of the complex relationship between science and the military, bedfellows that have produced not just compelling fiction, but also real dilemmas for the researchers who bridge them.

The American Association for the Advancement of Science hosted a conference today of its year-old Science and Human Rights Coalition, a group whose joint concerns are embodied most starkly in the application of science to war.

“The human rights frame is almost completely missing from this discussion,” said Len Rubenstein, the former executive director of Physicians for Human Rights and now a visiting scholar at Johns Hopkins. He spoke at the conference’s opening session. “The question of research for military purposes and scientific activity for military purposes is usually viewed either through the frame of professional ethics or scientific integrity. If human rights is introduced at all, it comes through the question of human subjects research with the Nuremberg Code.”

Scientists ought to consider, he argues, the broader question of human rights in work that ranges from weapons development to anthropology. As the science and its potential military applications have grown more sophisticated, the ethical questions have grown more complex, too.

Researchers, for instance, are already mulling whether beta-blockers could be used to reduce feelings of guilt in soldiers who do the unpleasant work of interrogation. Conversely, scientists wonder if oxytocin could induce trust in the interrogated. And what if neuro-imaging could help indicate what combatants are thinking? Or if brain monitoring could track how soldiers handle stress in training?

“We’re moving clearly more and more in the direction of being able to manage neural activity, manage behavior, attitudes and perception at a distance,” Moreno said.

Rubenstein, in response, pointed to the little-recognized Article 22 of the Universal Declaration of Human Rights. It entitles everyone to the realization of the rights indispensable for dignity and “the free development of his personality.”

“When we have weapons that are deliberately designed to change people’s personalities, to manipulate people’s personalities, we have a problem,” Rubenstein said. “Not only an ethical problem, not only a national security problem, we have a human rights problem.”

It’s not that human rights are opposed to national security, Rubenstein argues; this is why the Geneva Conventions attempt to regulate conduct in war, not oppose war altogether. From there, the distinctions are important. Weapons incapable of discriminating between combatants and civilians — like land mines or cluster bombs — violate human rights, he said, and scientists who contribute to developing them must bear this in mind.

The most public example of murky scientific involvement in warfare has come from the Pentagon’s Human Terrain System, a controversial program to embed anthropologists with soldiers in Afghanistan and Iraq. The Department of Defense billed the program, which was unveiled in 2007, as a path toward greater cultural understanding and, ultimately, less violence.

But the American Anthropological Association roundly denounced the program. The participating anthropologists typically wear military uniforms and sometimes carry firearms. The military has insisted the program isn’t designed to gather intelligence for combat, but the AAA questioned how the one can ever be separated from the other in the context of war.

The Human Terrain System, the AAA concluded, violates many of the association’s main ethical tenets, including the obligation to do no harm and to obtain “informed consent” from subjects, consent that may be impossible to give freely when facing a scientist in uniform.

In the new worlds of asymmetrical warfare, counterterrorism and neuroscience, however, all of the ethical guidelines may not yet be written.


By Emily Badger
