Hawking, Musk: “Starting a military A.I. arms race is a bad idea”


Stephen Hawking. (Courtesy: Flickr/NASA HQ PHOTO)

(CBS News) – The latest warning from renowned physicist Stephen Hawking and tech entrepreneur Elon Musk about the dangers of artificial intelligence focuses on its potential use on the battlefield.

In an open letter from the Future of Life Institute (FLI), a research institute focused on mitigating possible threats “from the development of human-level artificial intelligence,” Musk and Hawking, among many others, paint the gruesome image of a world in which we allow artificial intelligence systems to kill without human intervention.

Musk and Hawking are scientific advisors to FLI, and Musk donated $10 million to the organization in January.

“If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow,” the letter, released Monday, reads.

AI, the letter argues, could make fully autonomous weapons, such as armed drones that can search out and attack targets based on a defined set of criteria, “feasible within years, not decades.” They will be cheap and easy to mass-produce, it says, and should be banned before they become a reality.

The letter, sponsored and cosigned by dozens of professors and scientists, including Apple cofounder Steve Wozniak and MIT’s Noam Chomsky, states simply: “Starting a military AI arms race is a bad idea.”

Musk has previously likened AI to “summoning the demon,” and said that it could be “more dangerous than nukes.”

Note: The above text is excerpted from an original report appearing at CBSNews.com.
