General Discussion
Toby Walsh, A.I. Expert, Is Racing to Stop the Killer Robots (yes, killer robots)
By Claudia Dreifus
July 30, 2019
Toby Walsh, a professor at the University of New South Wales in Sydney, is one of Australia's leading experts on artificial intelligence. He and other experts have released a report outlining the promises and ethical pitfalls of the country's embrace of A.I.
Recently, Dr. Walsh, 55, has been working with the Campaign to Stop Killer Robots, a coalition of scientists and human rights leaders seeking to halt the development of autonomous robotic weapons.
We spoke briefly at the annual meeting of the American Association for the Advancement of Science, where he was making a presentation, and then for two hours via telephone. Below is an edited version of those conversations.
You are a scientist and an inventor. How did you become an activist in the fight against killer robots?
It happened incrementally, beginning around 2013. I had been doing a lot of reading about robotic weaponry. I realized how few of my artificial intelligence colleagues were thinking about the dangers of this new class of weapons. If people thought about them at all, they dismissed killer robots as something far in the future.
From what I could see, the future was already here. Drone bombers were flying over the skies of Afghanistan. Though humans on the ground controlled the drones, it's a small technical step to render them autonomous.
So in 2015, at a scientific conference, I organized a debate on this new class of weaponry. Not long afterward, Max Tegmark, who runs M.I.T.'s Future of Life Institute, asked if I'd help him circulate a letter calling for the international community to pass a pre-emptive ban on all autonomous robotic weapons.
I signed, and at the next big A.I. conference, I circulated it. By the end of that meeting, we had over 5,000 signatures, including people like Elon Musk, Daniel Dennett, and Steve Wozniak.
</snip>
More scary stuff at link...
lapfog_1
(29,199 posts)

and you have a killer robot.

The only difference between a drone and a killer robot is who pulls the trigger after finding a target. With a drone, there is a human operator... with an AI, the computer decides.

That said, many drone operators are really just playing what looks like an electronic game... a first-person "shooter" game... and have become so desensitized to pulling the trigger to eliminate the "enemy" that they might as well be AI.

Not to mention that I am certain the R&D guys at DARPA have already built killer robots and drones.
3Hotdogs
(12,375 posts)

Calculating
(2,955 posts)How the hell do you get PTSD sitting in an air-conditioned room in no danger whatsoever, and basically playing a flight simulator game? What an insult to the troops on the ground who are actually in harm's way.
LanternWaste
(37,748 posts)

"How the hell do you get PTSD sitting in an..."
oasis
(49,382 posts)

DetlefK
(16,423 posts)

Google's DeepMind has developed AlphaStar, a learning AI based on a neural-network architecture.

DeepMind is currently testing AlphaStar's capabilities by having it learn and play the video game StarCraft II. StarCraft II is a real-time strategy game with armies and bases, where decisions are made from scouting, limited information, and guesswork. Being able to guess your opponent's next step from the tiniest snippet of information is absolutely crucial in StarCraft II tournaments.

AlphaStar learned StarCraft II by playing it through trial and error for the equivalent of 200 human years. In its first official game, it beat MaNa, one of the top five StarCraft II players in the world. AlphaStar displayed unorthodox strategies that no human player had ever thought of and micromanaged its units to perfection. MaNa eventually won a rematch.
Google is currently writing a research paper on AlphaStar's abilities. To that end, AlphaStar is currently playing against top-ranked players. So far, it has won about 95% of its matches.
Now imagine an AI in an airplane drone that has spent the equivalent of 200 years learning how to win an aerial dogfight against other planes.

Imagine an AI in an armed quadcopter drone, or in a tank-like tracked body, that has spent 200 years learning how to stalk and kill infantry in an urban environment.
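For anyone curious what "learning by trial and error" means mechanically, here is a minimal toy sketch of the idea: an agent that learns rock-paper-scissors purely by playing many rounds and reinforcing whatever wins. This is not AlphaStar's actual method (which uses deep neural networks and league-based self-play); the game, the biased `rocky` opponent, and the function names are all made up for illustration.

```python
import random

MOVES = ["rock", "paper", "scissors"]
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def reward(mine, theirs):
    """+1 for a win, -1 for a loss, 0 for a draw."""
    if mine == theirs:
        return 0
    return 1 if BEATS[mine] == theirs else -1

def train(opponent, episodes=20000, eps=0.1, seed=0):
    """Learn move values by pure trial and error (epsilon-greedy bandit)."""
    rng = random.Random(seed)
    q = {m: 0.0 for m in MOVES}   # running average reward per move
    n = {m: 0 for m in MOVES}     # how often each move was tried
    for _ in range(episodes):
        # mostly exploit the best-known move, occasionally explore
        mine = rng.choice(MOVES) if rng.random() < eps else max(q, key=q.get)
        r = reward(mine, opponent(rng))
        n[mine] += 1
        q[mine] += (r - q[mine]) / n[mine]   # incremental mean update
    return q

def rocky(rng):
    """A heavily biased opponent: plays rock about 80% of the time."""
    return "rock" if rng.random() < 0.8 else rng.choice(MOVES)

q = train(rocky)
print(max(q, key=q.get))  # the learner settles on "paper", the counter to rock
```

Scale that loop up by many orders of magnitude, swap the three-move table for a neural network, and let the agent play copies of itself instead of a fixed opponent, and you have the basic shape of how such systems discover strategies no human suggested.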
JonathanDough
(9 posts)

hunter

(38,311 posts)

But mostly "dogfights" are a thing of the past.

Missiles get smarter and faster, and the side that isn't wasting any resources keeping pilots alive in the air wins.
Not that it matters. 80% of the U.S. military is a twisted public works agency and money laundering operation.
nycbos
(6,034 posts)

Javaman

(62,530 posts)

Dennis Donovan

(18,770 posts)

...we used poisonous gasses. And we poisoned their asses.
Johnny2X2X
(19,066 posts)

We're still decades from having truly sentient AIs, but the AIs we already have are immensely capable of analyzing data and drawing conclusions from that data.

The question isn't "What will I do when an AI replaces me at work?" The question is "What will the country do when an AI replaces an entire industry?"
And I've got some news: it's not the manual labor that other automation has already replaced; it's the jobs that take brain power that AIs will replace. Accountants and software engineers are first up. The entire product-engineering methodology has already been tailored to reduce the need for software-engineering expertise. We're not far from having knowledgeable engineers write the requirements of a system and feed them to a program that writes and tests the software wholesale, complete with all the artifacts necessary to certify the product. People are outraged that Boeing outsourced software work to India for $9 an hour, but the development process they use has made the software work less and less critical. The 737 Max was not a software error; it was a design flaw.

People seem outraged, but I work in the field, and I don't care who codes something. It could be coded by a middle-school class: as long as I can prove the coding standard was followed, and it passes the requisite testing to satisfy the approved design, I can get something certified to fly by the FAA. Do I want experienced software engineers? Sure I do; they save money by writing code that is more easily testable and needs less rework. But it's a cost-benefit analysis every time.
ProfessorPlum
(11,257 posts)

I've long thought about this: the perils of going after "enemy humans," and how easily that could become "all humans." Philip K. Dick wrote several stories along those lines.
Dennis Donovan
(18,770 posts)

blugbox

(951 posts)

Project Insight.
(whispered) Hail Hydra