Musk, Hawking Warn of Global Arms Race With Weapons Using A.I.

Elon Musk, CEO of Tesla Motors Inc., unveils the company’s newest products, Powerwall and Powerpack in Hawthorne, Calif., Thursday, April 30, 2015. (AP Photo/Ringo H.W. Chiu)

LONDON (AP) — Scientists and tech experts — including professor Stephen Hawking and Apple co-founder Steve Wozniak — warned Tuesday of a global arms race with weapons using artificial intelligence.

In an open letter with hundreds of signatories, the experts argued that if any major military power pushes ahead with development of autonomous weapons, “a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow.”

Some people have argued in favor of robots on the battlefield, saying their use could save lives, though such weapons are still years away.

But the scientists warned that, unlike nuclear weapons, autonomous weapons will require no costly or hard-to-obtain raw materials once developed, making it possible to mass-produce them.

“It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing, etc.,” the letter said.

The signatories included leading figures globally in academia and business studying artificial intelligence — the idea that computer systems could replicate tasks normally requiring human intelligence, such as language translation or visual perception. They were joined by philosophers, historians, sociologists and geneticists.

Those signing the letter included Elon Musk, Tesla Motors CEO; Demis Hassabis, who founded Google DeepMind; and Noam Chomsky, an emeritus professor at MIT.

Sean O hEigeartaigh, the executive director of Cambridge University's Center for the Study of Existential Risk, said he hopes the letter will prompt a discussion on whether autonomous weapons should fall into the same category as chemical weapons and blinding lasers, namely, that they be shunned.

“It’s imperative to hear the voices of the scientists,” he said of the many who have devoted their lives to having such systems benefit humanity.

Copyright 2015 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.


Notable Replies

  1. Yes, that’s a scary looking future. Drones are already bad enough in terms of terrorizing a population. Wait till little ones are as cheap as a Claymore mine. Damn.

  2. I think the danger is weapons that choose their own targets and take the decision to kill out of human hands. Then politicians won't have to take responsibility for outrageous deaths, just blame the AI. "See, I didn't kill massive numbers of innocents, the AI in the weapons did." A kind of horror-movie reversal of the old NRA slogan "Guns don't kill people …"
