
UglyGreed

(7,661 posts)
Wed Nov 12, 2014, 06:21 AM

Fearing Bombs That Can Pick Whom to Kill

On a bright fall day last year off the coast of Southern California, an Air Force B-1 bomber launched an experimental missile that may herald the future of warfare.

Initially, pilots aboard the plane directed the missile, but halfway to its destination, it severed communication with its operators. Alone, without human oversight, the missile decided which of three ships to attack, dropping to just above the sea surface and striking a 260-foot unmanned freighter.

Warfare is increasingly guided by software. Today, armed drones can be operated by remote pilots peering into video screens thousands of miles from the battlefield. But now, some scientists say, arms makers have crossed into troubling territory: They are developing weapons that rely on artificial intelligence, not human instruction, to decide what to target and whom to kill.

As these weapons become smarter and nimbler, critics fear they will become increasingly difficult for humans to control — or to defend against. And while pinpoint accuracy could save civilian lives, critics fear weapons without human oversight could make war more likely, as easy as flipping a switch.

http://www.nytimes.com/2014/11/12/science/weapons-directed-by-robots-not-humans-raise-ethical-questions.html?_r=0

4 replies
Fearing Bombs That Can Pick Whom to Kill (Original Post) UglyGreed Nov 2014 OP
The weapons are getting smarter. Nuclear Unicorn Nov 2014 #1
As a programmer - Oh, fuck. djean111 Nov 2014 #2
Was Terminator a prophecy? Renew Deal Nov 2014 #3
'Killer robots' need to be strictly monitored, nations warn at UN meeting UglyGreed Nov 2014 #4

UglyGreed

(7,661 posts)
4. 'Killer robots' need to be strictly monitored, nations warn at UN meeting
Fri Nov 14, 2014, 08:01 AM

Countries warn of potential dangers of autonomous weapons systems they say are at risk of violating international and humanitarian law

“Killer robots” – autonomous weapons systems that can identify and destroy targets in the absence of human control – should be strictly monitored to prevent violations of international or humanitarian law, nations from around the world demanded on Thursday.

The European Union, France, Spain, Austria, Ireland, the Netherlands, Croatia, Mexico and Sierra Leone, among other states, lined up at a special UN meeting in Geneva to warn of the potential dangers of this rapidly advancing technology. Several countries spoke of the need for ongoing scrutiny to ensure that the weapons conformed to the Geneva conventions’ rules on proportionality in war.

The Spanish delegation went further, invoking the possibility of a new arms race as developed countries scrambled to get ahead. Ireland, the Netherlands and other countries called for “meaningful human control” of lethal weapons to be enshrined in international law, although the meeting also admitted that the precise definition of that principle had yet to be clarified.
http://www.theguardian.com/world/2014/nov/13/killer-robots-strictly-monitored-un-meeting-geneva
