Killer Robots have Eyes & Brains, But No Heart!
Tuesday, November 20, 2012 Full Show
Killer Robots: HRW and Nobel Laureate Jody Williams Urge Ban on Modern Warfare’s Next Frontier
Nobel Peace laureate Jody Williams is joining with Human Rights Watch to oppose the creation of killer robots — fully autonomous weapons that could select and engage targets without human intervention. In a new report, HRW warns such weapons would undermine the safety of civilians in armed conflict; violate international humanitarian law; and blur the lines of accountability for war crimes. Fully autonomous weapons do not exist yet, but high-tech militaries are moving in that direction with the United States taking the lead. Williams, winner of the 1997 Nobel Peace Prize for her work with the International Campaign to Ban Landmines, joins us along with Steve Goose, director of Human Rights Watch’s Arms Division. [Includes rush transcript]
Jody Williams, winner of the 1997 Nobel Peace Prize for her work with the International Campaign to Ban Landmines. She is the chair of the Nobel Women’s Initiative.
Steve Goose, director of Human Rights Watch’s Arms Division.
- As Obama Expands Drone War, Activists & Victims’ Advocates Join D.C. Summit on Growing Civilian Toll Apr 27, 2012 | Story
- Training Terrorists in Nevada: Seymour Hersh on U.S. Aid to Iranian Group Tied to Scientist Killings Apr 10, 2012 | Story
- Jeremy Scahill: Why is President Obama Keeping Yemeni Journalist Abdulelah Haider Shaye in Prison? Mar 15, 2012 | Story
- U.N. Special Rapporteur Calls for Global Protection of Gaza Civilians from U.S.-Backed Israeli Assault Nov 19, 2012 | Story
- Palestinian Civilians Bear the Brunt of Unrelenting Bombings in U.S.-Backed Attack on Gaza Nov 19, 2012 | Story
- “My Name is Jody Williams: A Vermont Girl’s Winding Path to the Nobel Peace Prize.” By Jody Williams. (University of California Press)
- Jeremy Scahill and Dennis Kucinich: In Obama’s 2nd Term, Will Dems Challenge U.S. Drones, Killings? Nov 07, 2012 | Story
- Study Finds U.S. Drone Strikes in Pakistan Miss Militant Targets and “Terrorize” Civilians Sep 26, 2012 | Story
- Kathy Kelly on Afghan Humanitarian Crisis, Civilian Casualties and Drone Warfare Mar 12, 2012 | Story
AMY GOODMAN: Killer robots? This sounds like science fiction, but a new report says fully autonomous weapons are possibly the next frontier of modern warfare. The report released Monday by Human Rights Watch and Harvard Law School is called “Losing Humanity: The Case Against Killer Robots.” According to the report, these weapons would undermine the safety of civilians in armed conflict, violate international humanitarian law, and blur the lines of accountability for war crimes. Scholars such as Noel Sharkey, professor of artificial intelligence and robotics at the University of Sheffield in England, also note that robots cannot distinguish civilians from combatants.
AMY GOODMAN: Fully autonomous weapons don’t exist yet, but high-tech militaries are moving in that direction, with the United States spearheading these efforts. Countries such as China, Germany, Israel, South Korea, Russia and Britain are also experimenting with the technology. Human Rights Watch and Harvard Law School’s International Human Rights Clinic are calling for an international treaty preemptively banning fully autonomous weapons. They’re also calling for individual nations to pass laws to prevent the development, production and use of such weapons at the domestic level. For more, we’re joined by two guests in Washington: Steve Goose, director of Human Rights Watch’s Arms Division, which released the new report on killer robots, and Jody Williams, who won the Nobel Peace Prize in 1997 for her work with the International Campaign to Ban Landmines. She’s also chair of the Nobel Women’s Initiative. Her forthcoming book is called “My Name is Jody Williams: A Vermont Girl’s Winding Path to the Nobel Peace Prize.” Steve Goose and Jody Williams, we welcome you to Democracy Now! Jody, just describe what these weapons are. Killer robots, what do you mean?
JODY WILLIAMS: Killer robots, when I first say that to people, they automatically think drones. Killer robots are quite different in that there is no man in the loop. As we know with drones, a human being has to fire on the target. A drone is actually a precursor to killer robots, which will be weapons systems that can kill on their own with no human in the loop. It’s really terrifying to contemplate.
AMY GOODMAN: Steve Goose, lay them out for us, and who is developing them.
STEVE GOOSE: It is for the future. Most roboticists think it will take at least ten, maybe twenty or thirty years before these things might come online, although others think cruder versions could be available in just a few years.
AMY GOODMAN: I want to go to a clip from a video created by the Samsung Techwin Company, which talks about weapons of the future.
AMY GOODMAN: Steve Goose, explain.
STEVE GOOSE: Well, the system they’re talking about is not yet fully autonomous, although it could become a fully autonomous system. For that particular one, you still have the potential for a human to override the decision of the robotic system. Even there, there could be problems, because a human is unlikely to actually override a machine’s decision. This is the kind of system that we’re looking at that could become fully autonomous, where you take the human completely out of the picture. They’re programmed in advance but can’t react to the variables that require human judgment.
AMY GOODMAN: Who is driving this? Who benefits from killer robots, as you call them?
STEVE GOOSE: Well, no doubt there is a lot of money to be made in these developments. Research labs for the militaries, as well as universities and private companies, are all engaged so far. We know billions are going into the development of autonomous weapons systems, some fully autonomous and some semi-autonomous; even drones fall under that category. Certainly, there are going to be people in the military who see this as a great advantage, because you’re taking the human off the battlefield, so you are reducing military casualties. The problem there is that you’re putting civilians at greater risk by shifting the burden of war from the military, which is trained to do this, to civilians.
AMY GOODMAN: I want to turn to Tom Standage, the digital editor at The Economist. He points out there might be a moral argument for robot soldiers, as you’ve pointed out, Steve.
TOM STANDAGE: If you could build robot soldiers, you’d be able to program them not to commit war crimes, not to commit rape, not to burn down villages. They wouldn’t be susceptible to human emotions like anger or excitement in the heat of combat. So, you might actually want to entrust those decisions to machines rather than to humans.
AMY GOODMAN: Let’s put that to Jody Williams.
JODY WILLIAMS: You just had Noel Sharkey; you quoted him, and Noel, who is a roboticist himself, talks about the ridiculousness of considering that robotic weapons can make those kinds of decisions. He contends that it is a simple yes-no by a machine. I think another part of the notion that people don’t think about when they want to promote developing robots is that robots cannot feel compassion or empathy. Oftentimes in war, a soldier will factor in other circumstances when making a decision whether or not to kill. A robot – it’s very unlikely it would be able to do that. Another point, though, for me is, if we proceed to the point of having fully autonomous killer robots that can decide who, what, where, when and how to kill, technology will no longer be serving humans. Humans will be serving technology. When American soldiers will not have to face death on the battlefield, how much easier will it be for this country to go to war, when we already go to war so easily? It really frightens me.
AMY GOODMAN: Jody, you won the 1997 Nobel Peace Prize for your work around landmines. You headed up the International Campaign to Ban Landmines. Do you see a trajectory here in this, what, fifteen years?
JODY WILLIAMS: In terms of getting rid of weapons or in terms of –?
AMY GOODMAN: In terms of the development of weapons and also the movement that resists it.
JODY WILLIAMS: Yes, I do. It’s obvious that weapons are going to continue to be researched and developed unless there are organizations like Human Rights Watch, like the Nobel Women’s Initiative and others around the world that are willing to come together and try to stop them. We who are starting to speak out against killer robots envision a campaign very similar to the campaign that banned landmines, which, by the way, also received the Peace Prize in 1997. We are already working to bring together organizations to create a steering committee that would launch a campaign to stop killer robots, strategizing at the national, regional and international levels with partner governments, just like we did with landmines, and as Steve Goose and Human Rights Watch and other organizations did with their successful bid to ban cluster munitions in 2008.
AMY GOODMAN: Certainly the U.S. is on the forefront of robotic weapons. Certainly, drones fit into that category. We’re only beginning to see the movement, as people laid their bodies on the line at Creech, at Hancock Field and elsewhere in upstate New York, the places where the drones are controlled from. But what about the U.S.’s role in all this? Let me put that question to Steve Goose.
STEVE GOOSE: You raised drones again, and Human Rights Watch has criticized how drones have been used by the Obama administration, criticized it quite extensively, but we draw a distinction here. We think of the fully autonomous weapons, the killer robots, as being beyond drones. With drones, you do have a man in the loop who makes decisions about what to target and when to fire, but with fully autonomous weapons, that is going to change. It will be the robot itself that makes those determinations. It is the nature of the weapon itself that is at issue, which is not really the main problem with drones. With the United States, we’re getting mixed signals. Clearly, the U.S. is putting a lot of money into this, and a lot of the DOD’s, the military’s, planning documents show that there are a great many who believe that this is the future of warfare, that they envision moving ever more into the autonomous realm and that fully autonomous weapons are both desirable and the ultimate goal.
AMY GOODMAN: Explain how a robot makes this decision.
STEVE GOOSE: It has to be programmed. It would be programmed in advance, and they will give it sensors to detect various things and give it an algorithm of choices to make. But it won’t be able to have the kind of human judgment that we think is necessary to comply with international humanitarian law, where you have to be able to determine, in a changing, complex battlefield situation, whether the military advantage might exceed the potential cost to civilians, the harm to civilians. You have to be able to make the distinction between combatants and civilians. Something as simple as that could be very difficult on a murky battlefield. So, we don’t think—
AMY GOODMAN: Steve Goose, what about hacking?
STEVE GOOSE: Hacking can be a problem. The thing is, even if a killer robot sustains damage on the battlefield, that might affect how it would be able to respond properly, or there could be countermeasures put forward by the enemy to do this as well.
A Day Job Waiting for a Kill Shot a World Away
Hancock Field, near Syracuse – Where’s the BRAVERY!
Partial Transcript below; to read entire Post, go to URL above!
HANCOCK FIELD AIR NATIONAL GUARD BASE, N.Y. — From his computer console here in the Syracuse suburbs, Col. D. Scott Brenton remotely flies a Reaper drone that beams back hundreds of hours of live video of insurgents, his intended targets, going about their daily lives 7,000 miles away in Afghanistan. Sometimes he and his team watch the same family compound for weeks.
They call this drone the “Reaper” (as in “Grim Reaper”) – Americans are truly a benevolent and kind-hearted People – aren’t we!
Heather Ainsworth for The New York Times
The Reaper is among the drones that pilots at Hancock operate, killing insurgents and protecting American troops overseas.
“I see mothers with children, I see fathers with children, I see fathers with mothers, I see kids playing soccer,” Colonel Brenton said.
When the call comes for him to fire a missile and kill a militant — and only, Colonel Brenton said, when the women and children are not around — the hair on the back of his neck stands up, just as it did when he used to line up targets in his F-16 fighter jet.
Afterward, just like the old days, he compartmentalizes. “I feel no emotional attachment to the enemy,” he said. “I have a duty, and I execute the duty.”
Drones are not only revolutionizing American warfare but are also changing in profound ways the lives of the people who fly them.
Colonel Brenton acknowledges the peculiar new disconnect of fighting a telewar with a joystick and a throttle from his padded seat in American suburbia.
Partial Transcript; to read entire Post, go to URL above!
TABACCO: In 1945, the United States was the only country with Atom Bombs – we had just 2! Today, how many countries have Atom Bombs?
In 2012, the United States has a Monopoly on Drones and a HEADSTART on KILLER DRONES! And Nuclear Material is UNNECESSARY! If we are using them on people in other countries, how do we object when other countries or Terrorists use them here?
Try to imagine “WAR OF THE WORLDS” on Main Street – but without Extraterrestrials!
Tabacco: I consider myself both a funnel and a filter. I funnel information, not readily available on the Mass Media, which is ignored and/or suppressed. I filter out the irrelevancies and trivialities to save both the time and effort of my Readers and bring consternation to the enemies of Truth & Fairness! When you read Tabacco, if you don’t learn something NEW, I’ve wasted your time.
Tabacco is not a blogger, who thinks; I am a Thinker, who blogs. Speaking Truth to Power!
In 1981’s ‘Body Heat’, Kathleen Turner said, “Knowledge is power”.