Elon Musk continues his criticism of artificial intelligence, calling for immediate regulation of this nascent field
If not regulated or controlled soon, Artificial Intelligence (AI) will become an "immortal dictator" and there will be no escape for humans, Tesla and SpaceX founder Elon Musk has warned.
In a new documentary on AI, Musk said: "At least when there's an evil dictator, that human is going to die. But for an AI there would be no death. It would live forever, and then you'd have an immortal dictator, from which we could never escape".
"If AI has a goal and humanity just happens to be in the way, it will destroy humanity as a matter of course without even thinking about it. No hard feelings," Musk told Chris Paine, the director of the new documentary titled "Do You Trust This Computer?"
Paine had earlier interviewed Musk for his documentary "Who Killed The Electric Car?". Musk has long been a critic of AI and has called for strict regulation to curb the technology. In a recent tweet, Musk said that people should be more concerned about AI than about the risk posed by North Korea.
"If you're not concerned about AI safety, you should be. Vastly more risk than North Korea," Musk tweeted. Musk has also quit the board of OpenAI, a non-profit AI research company he co-founded that aims to promote and develop friendly AI that benefits humanity.
In 2014, Musk said AI was humanity's biggest existential threat, later adding that the United Nations needed to act to prevent a killer robot arms race. Musk has spoken frequently on AI and has called its progress the "biggest risk we face as a civilisation". "AI is a rare case where we need to be proactive in regulation instead of reactive because if we're reactive in AI regulation it's too late," he said.
In a recent verbal spat with Facebook CEO Mark Zuckerberg, a strong advocate of AI technology, Musk said: "I've talked to Mark about this (AI). His understanding of the subject is limited".
Zuckerberg replied: "I think people who are naysayers and try to drum up these doomsday scenarios — I just, I don't understand it. It's really negative and in some ways, I actually think it is pretty irresponsible".