Anti-landmine Campaigners Target War Robots w/poll

Original article by Jason Mick via Dailytech.com:

Warbots. Not just human-controlled, but autonomous Warbots. Think about that for a minute.

A group that has long focused its lobbying efforts on stopping the proliferation of land mines is turning its attention to a surprising new target: war robots. In the first known instance of a non-government group protesting against war robot technology, the London-based charity Landmine Action hopes to ban autonomous killing robots in all 150 countries bound by the current land mine treaty.

Just what we need. Warbots allow militaries to kill without costing any friendly lives (as long as the friendly lives are not in the killing zones). It’s a dream come true. Let’s face it: if we support military action even when our armed forces are at risk, think how we’ll feel when we can just send in warbots. War, all the time, with no risk to our armed forces.

While all machine gun-packing robots currently are human-controlled, the U.S. Department of Defense has expressed interest in deploying autonomous robot warriors onto the battlefield in the near future. Last month DailyTech reported that Noel Sharkey, a robot researcher at Sheffield University, expressed controversial concerns about the ethics of autonomous war robots. He stated that such robots might be capable of “war crimes”.

Imagine putting a warbot on trial for war crimes. Would the people who order their deployment be charged? It’s a frightening scenario.

Sharkey’s speech inspired Landmine Action to take action against the war robots. Richard Moyes, Landmine Action’s director of policy and research, says the fight against autonomous killers is not a policy switch. He notes that the organization has already fought cluster bombs, which use infrared sensors and artificial intelligence to decide when to detonate. Landmine Action believes that taking the targeting decision out of human hands and putting it in a machine’s is a deadly mistake.

Moyes explains, “That decision to detonate is still in the hands of an electronic sensor rather than a person. Our concern is that humans, not sensors, should make targeting decisions. So similarly, we don’t want to move towards robots that make decisions about combatants and noncombatants.”

Don’t worry, everyone’s suspect! It’ll be interesting to see how the warbot makes its decisions as to who to target. Who knows, maybe it’ll be so ethical that it will refuse to shoot anyone.

Many in the robotics community express agreement with Sharkey’s sentiment. Peter Kahn, a researcher on social robots at the University of Washington, says he believes Sharkey is correct and hopes that robotics researchers will stop taking government money to design war robots. He argued to his colleagues at a conference on Human-Robot Interaction in Amsterdam, “We can say no. And if enough of us say it we can ensure robots are used for peaceful purposes.”

However, most in the robotics community feel this is impossible as most robotics research is funded by the Defense Department. Says one anonymous U.S. researcher at the conference, “If I don’t work for the DoD, I don’t work.”

Speaking of ethics…show me the money!

Many, however, remain skeptical of the wisdom of deploying increasingly intelligent robots into warzones across the world. They point to the many science fiction scenarios that depict humanity at war with killer robots of its own creation. While this may seem farfetched, the issue of war robots is becoming a serious one that the world’s brightest minds are trying to grapple with.

Warbots, coming to a warzone near you. The question is who is doing the programming of said Warbots. Are you willing to trust any government with a Warbot’s programming?

Originally posted here: http://rjones2818.blogspot.com/2008/…

Is the Pony/Pie/Hide rating system too cutesy?


